Let $V$ be a finite-dimensional vector space. For a subset $S$ of $V$, define
$$S^0 = \{\, f \in V^* \mid f(u) = 0 \ \forall u \in S \,\}.$$
Prove that if W1 and W2 are subspaces of V then:
$$(W_1 \cap W_2)^0 = W_1^0 + W_2^0.$$
I succeeded in proving that the right side is contained in the left side of the equation, but I could not prove the opposite inclusion.
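For reference, the inclusion I did manage can be written out in a few lines (a sketch of the easy direction, using only the definition of the annihilator):

```latex
% Easy direction: W_1^0 + W_2^0 \subseteq (W_1 \cap W_2)^0.
% Take f = f_1 + f_2 with f_1 \in W_1^0, f_2 \in W_2^0,
% and any u \in W_1 \cap W_2. Then
\[
    f(u) = f_1(u) + f_2(u) = 0 + 0 = 0,
\]
% since u \in W_1 forces f_1(u) = 0 and u \in W_2 forces f_2(u) = 0.
% Hence f annihilates every u \in W_1 \cap W_2,
% i.e. f \in (W_1 \cap W_2)^0.
```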
I think I should do something with bases for each of the spaces here, but I don't know how.
Please help me here...