So I know this problem is simple but I'm having trouble with it:
I have a right triangle as follows:
The 90° corner is at the origin, where the legs to Y and X meet; call that point O.
Directly above O is the point Y = (0, y).
To the right of O is the point X = (x, 0).
Find the slope of OM if M is the midpoint of XY.
I have it as -y/x, but that doesn't seem right?
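As a sanity check, here is the midpoint and slope computation written out using the standard formulas (nothing assumed beyond the coordinates above):

\[
M = \left(\frac{x+0}{2}, \frac{0+y}{2}\right) = \left(\frac{x}{2}, \frac{y}{2}\right),
\qquad
\text{slope of } OM = \frac{y/2 - 0}{x/2 - 0} = \frac{y}{x}.
\]

For what it's worth, -y/x is the slope of the hypotenuse XY itself, since XY runs from (0, y) down to (x, 0); that may be where the sign came from.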