Linear algebra distance problem.

LostInCalculus

Find the distance from the point P (-3, 2, -7) to the line that passes through the points Q (-4, 3, 0) and R (-2, 1, -2).

I started by finding the direction vector from R to Q: d = (-4, 3, 0) - (-2, 1, -2) = (-2, 2, 2)

Then I got the vector v = R - P = (-2, 1, -2) - (-3, 2, -7) = (1, -1, 5)

The projection of v onto d, proj_d(v) = (v · d / ||d||^2) d, then gave me -1/2 (-2, 2, 2) = (1, -1, -1)

Now this is where I am a little confused: is the projection the vector, or the point at the right angle? I am guessing the vector, because from here I took the distance between (-3, 2, -7) and (1, -1, -1) and got sqrt(61), which is wrong.

So with what I have so far, I am not sure how to get to the actual distance from P to the foot of the perpendicular on the line through those two points.

Update: because of a similar problem I saw, I also tried using the distance formula on the projection itself, sqrt(1^2 + 1^2 + 1^2) = sqrt(3), but that came back as wrong as well.
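For reference, here is the projection arithmetic worked through with the vectors above; note that the sign of the scalar depends on which way the projected vector points:

\(\displaystyle d = Q - R = (-2, 2, 2), \qquad \|d\|^{2} = 4 + 4 + 4 = 12\)

\(\displaystyle (R - P)\cdot d = (1)(-2) + (-1)(2) + (5)(2) = 6, \qquad \operatorname{proj}_{d}(R - P) = \tfrac{6}{12}\,d = (-1, 1, 1)\)

\(\displaystyle (P - R)\cdot d = -6, \qquad \operatorname{proj}_{d}(P - R) = -\tfrac{1}{2}\,d = (1, -1, -1)\)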
 
You must make up your mind.

Q-R is your basis. You have that.

P-R is the vector to be projected. Call it \(\displaystyle \alpha\)

The projection, you have. (1, -1, -1). Call it \(\displaystyle \beta\)

The vector you wish to measure is \(\displaystyle \alpha - \beta\)

Get your notation a little cleaner and you'll have it. There are a few sign problems, too. Be more careful.
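Carrying that outline through, just filling in the arithmetic with \(\displaystyle \alpha = P - R = (-1, 1, -5)\) and \(\displaystyle \beta = (1, -1, -1)\) as above:

\(\displaystyle \alpha - \beta = (-1 - 1,\; 1 + 1,\; -5 + 1) = (-2, 2, -4)\)

\(\displaystyle \|\alpha - \beta\| = \sqrt{4 + 4 + 16} = \sqrt{24} = 2\sqrt{6}\)

so the distance from P to the line is \(\displaystyle 2\sqrt{6}\).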
 