Let's say I have the function f(x) = x^5 - x^10 and want to find the global max on the interval 0 <= x <= 1.
if I set the first derivative of f(x) = 0:
5x^4 - 10x^9 = 0
5x^4(1 - 2x^5) = 0
x = 0 or x = (1/2)^(1/5) ≈ 0.870551
I know how to test whether a critical point is a min or a max by taking the second derivative, plugging the point in, and checking whether the result is less than or greater than 0 (if it equals 0, the test is inconclusive).
Now my professor always says to check the endpoints, and he always plugs both ends of the interval into f(x) before even looking at critical points. Why?
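If it helps to sanity-check my numbers, here's a quick Python sketch (my own, just for verification) that evaluates f at the two critical points and at both endpoints of the interval:

```python
# Sanity check: compare f at the critical points and the endpoints of [0, 1].
def f(x):
    return x**5 - x**10

def fprime(x):
    return 5 * x**4 - 10 * x**9

# Setting f'(x) = 0 gives x = 0 and x = (1/2)^(1/5).
x_crit = 0.5 ** 0.2  # approximately 0.870551

# Candidates for the global max: interior critical points plus both endpoints.
candidates = [0.0, x_crit, 1.0]

for x in candidates:
    print(f"x = {x:.6f}  f'(x) = {fprime(x):+.6f}  f(x) = {f(x):.6f}")
```

Since x_crit^5 = 1/2, the value there is f(x_crit) = 1/2 - 1/4 = 0.25, while f(0) = f(1) = 0, so in this particular case the interior critical point wins.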