Hi,
I came across a list of "linear algebra identities" in a book, and almost all of them are super obvious, but there's one I'm having trouble with, so I don't know if I'm making a dumb mistake or if there's a typo in the book...
Notation note: Below, I use ||a|| to represent the magnitude of vector a
Here's the statement:
Given vectors a and b :
||a||^2 + ||b||^2 = ||a+b||^2
------------
I can picture cases where this statement would be true -- like if a is <1,0> and b is <0,2> -- in which case it looks just like a standard application of the Pythagorean theorem...
But in the case of, say, a being <3,2> and b being <-1,0> -- wouldn't it be false?
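Working the numbers for that second example (assuming I'm computing the magnitudes correctly):

||a||^2 = 3^2 + 2^2 = 13
||b||^2 = (-1)^2 + 0^2 = 1
||a||^2 + ||b||^2 = 14

a + b = <2,2>
||a+b||^2 = 2^2 + 2^2 = 8

So the left side is 14 and the right side is 8, which don't match.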
Thanks