Zermelo
Junior Member
The task is:
Let f be a Lebesgue integrable function on [0, 1], and let [imath]\lambda \in (0,1)[/imath]. If [imath]\int_E f\,d\mu = 0[/imath] for every measurable set [imath]E \subset [0, 1][/imath] such that [imath]\mu(E) = \lambda[/imath], prove that [imath]f = 0[/imath] almost everywhere.
This would be relatively easy to prove if f were non-negative, but in the general case I can't figure out the proof.
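For reference, here is roughly how I would argue the non-negative case (just a quick sketch, so you can see what I mean): if [imath]f \geq 0[/imath] and [imath]\int_E f\,d\mu = 0[/imath], then [imath]f = 0[/imath] a.e. on [imath]E[/imath]. Since [imath]\lambda > 0[/imath], we can cover [imath][0,1][/imath] by finitely many intervals of length [imath]\lambda[/imath], say [imath]E_j = [a_j, a_j + \lambda][/imath] with [imath]a_j = \min((j-1)\lambda,\ 1-\lambda)[/imath] for [imath]j = 1, \dots, \lceil 1/\lambda \rceil[/imath]. Each [imath]E_j[/imath] has measure exactly [imath]\lambda[/imath], so [imath]\int_{E_j} f\,d\mu = 0[/imath], hence [imath]f = 0[/imath] a.e. on each [imath]E_j[/imath], and therefore a.e. on [imath][0,1][/imath].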
I think it would be best to reuse some known results here.
The problem is that [imath]\int_E f\,d\mu = 0[/imath] doesn't have to mean that f = 0 on E; it only means that [imath]\int_{E_+} f^+\,d\mu = \int_{E_-} f^-\,d\mu[/imath], where [imath]E_+ \cup E_- = E[/imath] (with [imath]E_+ = E \cap \{f \geq 0\}[/imath] and [imath]E_- = E \cap \{f < 0\}[/imath]). Still, the fact that this holds for every E of measure [imath]\lambda[/imath] is quite powerful. My idea is to construct a set [imath]E_1[/imath] of measure [imath]\lambda[/imath] that contains [imath]E_+[/imath], which would give [imath]\int_{E_+} f^+\,d\mu = \int_{E_1 \setminus E_+} f^-\,d\mu[/imath]. I'm hoping there is a smart way of constructing many such sets [imath]E_1, E_2, \dots[/imath] (maybe even countably many), each containing [imath]E_+[/imath] and each of measure [imath]\lambda[/imath]. That would mean [imath]\int_{E_+} f^+\,d\mu = \int_{E_n \setminus E_+} f^-\,d\mu[/imath] for every [imath]n \in \mathbb{N}[/imath], which is quite a strong statement, and if I construct these sets properly I might be able to show that it cannot hold.
Does anyone have a better idea, or could someone maybe help me finish this one?
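One possible way to push this further (just a sketch of the direction, not a finished argument): if [imath]E[/imath] has measure [imath]\lambda[/imath], take any measurable [imath]A \subset E[/imath] and [imath]B \subset [0,1] \setminus E[/imath] with [imath]\mu(A) = \mu(B)[/imath]. Then [imath]E' = (E \setminus A) \cup B[/imath] also has measure [imath]\lambda[/imath], so [imath]0 = \int_{E'} f\,d\mu = \int_E f\,d\mu - \int_A f\,d\mu + \int_B f\,d\mu[/imath], and since [imath]\int_E f\,d\mu = 0[/imath] this gives [imath]\int_A f\,d\mu = \int_B f\,d\mu[/imath]. So swapping equal-measure pieces in and out of [imath]E[/imath] changes nothing, which suggests that the integral of [imath]f[/imath] over a set of measure at most [imath]\min(\lambda, 1-\lambda)[/imath] depends only on the measure of the set. If that can be made precise, [imath]f[/imath] should be a.e. equal to some constant [imath]c[/imath], and then [imath]0 = \int_E f\,d\mu = c\lambda[/imath] forces [imath]c = 0[/imath]. I haven't managed to fill in the middle step, though.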