Assume A and B are subsets of [0, 1] and that A is a subset of B. Prove that m*(A) <= m*(B).
Here is my proof, but I am not sure if the additivity step, m*(B) = m*(B\A) + m*(A), is actually true:
Since A is a subset of B, write B = (B\A) U A.
Then m*(B) = m*(B\A) + m*(A), and since m*(B\A) >= 0 (outer measure is nonnegative by definition),
we get that m*(B) >= m*(A).
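In case it helps, here is the same argument written out in LaTeX, just as a sketch of what I mean (I am assuming m^* denotes Lebesgue outer measure and that the union B = (B\A) U A is disjoint):

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Sketch of the argument above; $m^*$ is assumed to be Lebesgue outer measure.
Let $A \subseteq B \subseteq [0,1]$. Since $A \subseteq B$, we can write
\[
  B = (B \setminus A) \cup A,
\]
a disjoint union. If the additivity step
\[
  m^*(B) = m^*(B \setminus A) + m^*(A)
\]
holds, then $m^*(B \setminus A) \ge 0$ gives
\[
  m^*(B) \ge m^*(A).
\]
\end{document}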
Thanks for any insight!