From: Introduction To Analysis, fifth edition, by Edward Gaughan (1998), Exercise 2.24.
Suppose f : [a,b] -> R and define g : [a,b] -> R as follows: g(x) = sup{ f(t) : a <= t <= x }.
Prove that g has a limit at x_0 if f has a limit at x_0 and lim_{t -> x_0} f(t) = f(x_0).
In the course of trying to work out the proof of this problem I considered the following scenario:
Let a = 0, b = 1. Let x_0 = 3/4 and let f(t) = 1/(2t - 1) for all t in [0, 1/2) U (1/2, 1], while f(t) = 0 if t = 1/2.
Then lim_{t -> x_0} f(t) = 2, but f is unbounded above on [0, 1] (because of the singularity at t = 1/2), so for every x > 1/2 the supremum sup{ f(t) : 0 <= t <= x } is infinite and g(x) is not a real number. So f has a limit at x_0, yet g is not even defined. The statement to be proved therefore appears to be false. Where is my logical error? Thanks for your help.
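To make the scenario concrete, here is a quick numerical sketch (the grid size and the sampling approach are my own illustrative choices, not part of the exercise): it samples f on a uniform grid of [0, 1] and tracks the running supremum, showing the "g" values sitting at -1 on [0, 1/2), reaching 0 at t = 1/2, and blowing up just past 1/2.

```python
# Sketch only: approximate g(x) = sup{ f(t) : 0 <= t <= x } by a running
# maximum over a finite grid. The true supremum for x > 1/2 is +infinity;
# the discrete version just grows without bound as the grid refines.

def f(t):
    # f(t) = 1/(2t - 1) away from the singularity, f(1/2) = 0 by definition
    return 0.0 if t == 0.5 else 1.0 / (2.0 * t - 1.0)

n = 10_000
grid = [k / n for k in range(n + 1)]   # uniform grid on [0, 1]; includes t = 1/2

running_sup = []
best = float("-inf")
for t in grid:
    best = max(best, f(t))             # running maximum approximates g(t)
    running_sup.append(best)

print(running_sup[grid.index(0.25)])   # -1.0  (f is decreasing on [0, 1/2), max at t = 0)
print(running_sup[grid.index(0.5)])    # 0.0   (f(1/2) = 0 beats the negative values)
print(max(running_sup))                # large positive; grows without bound as the grid refines
```

This makes the issue visible numerically: the running maximum is finite and well-behaved up through t = 1/2, then jumps to an arbitrarily large value as soon as the grid steps past the singularity.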
Max.