Hello,
My book proves the increase/decrease test by using the Mean Value Theorem. I have a problem which I think is similar, but I'm not sure if I've solved it correctly.
1) Use the Mean Value Theorem to show that f(x)=1/x^2 decreases on any interval to the right of the origin.
Mean Value Theorem:
Is f continuous on the closed interval [a,b]? Differentiable on the open interval (a,b)?
I have the following:
As a rational function, f(x) is continuous and differentiable at every point of its domain, which here is (-infinity,0) U (0,+infinity). Any closed interval [a,b] with 0<a<b lies inside this domain, so f is continuous on [a,b] and differentiable on (a,b). Then, according to the MVT, there is a number c in (a,b) where:
f(b)-f(a)=f'(c)(b-a)
Now here's the part I'm not sure I'm allowed to do to 'prove' this. What I did was:
f'(x) = -2/x^3. When x>0, f'(x) must always be negative.
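(Not part of the proof, just a quick check I ran to make sure I differentiated correctly. It's a sympy snippet, assuming sympy is available; the variable names are my own.)

import sympy as sp

x = sp.symbols('x', positive=True)   # restrict attention to x > 0
f = 1 / x**2
fprime = sp.diff(f, x)

print(fprime)              # prints -2/x**3
print(fprime.is_negative)  # True, since x is declared positive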
Since c is in (a,b) and a>0, we have c>0, so f'(c)<0 (negative). Also b-a>0 since a is less than b. So the right side of the equation above must be negative, which gives:
f(b)-f(a)<0, or f(b)<f(a)
So, whenever 0<a<b we get f(a)>f(b), which means the function must be decreasing on any interval to the right of the origin.
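Just to convince myself numerically (again, not part of the argument, and the sample intervals below are arbitrary choices of mine), I also checked a few intervals [a,b] with 0<a<b: f(a) comes out bigger than f(b) every time, and the c with f'(c) = (f(b)-f(a))/(b-a) lands strictly between a and b, as the MVT promises.

f = lambda x: 1 / x**2
fprime = lambda x: -2 / x**3

for a, b in [(0.5, 1.0), (1.0, 3.0), (2.0, 10.0)]:
    slope = (f(b) - f(a)) / (b - a)   # average rate of change, negative on these intervals
    c = (-2 / slope) ** (1 / 3)       # solve f'(c) = slope, i.e. -2/c**3 = slope, for c > 0
    print(f"a={a}, b={b}, f(a)={f(a):.4f}, f(b)={f(b):.4f}, c={c:.4f}, f'(c)={fprime(c):.4f}")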