Directions: Use Lagrange multipliers to find the max and min values of f subject to the given constraint. Also, find the points at which these extremes occur.
f(x,y,z) = xyz; x^2 + y^2 + z^2 = 1
gradient of f = <yz, xz, xy>, gradient of g = <2x, 2y, 2z>
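(Just to double-check the setup, here is a rough SymPy sketch I used; it assumes SymPy is installed, and the symbol names are my own choice.)

    import sympy as sp

    x, y, z = sp.symbols('x y z', real=True)
    f = x*y*z
    g = x**2 + y**2 + z**2 - 1

    # Partial derivatives of f and g; expecting [y*z, x*z, x*y] and [2*x, 2*y, 2*z].
    grad_f = [sp.diff(f, v) for v in (x, y, z)]
    grad_g = [sp.diff(g, v) for v in (x, y, z)]
    print(grad_f)
    print(grad_g)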
So the Lagrange condition, gradient of f = (lambda) * gradient of g, gives:
yz = 2(lambda)x
xz = 2(lambda)y
xy = 2(lambda)z
Solving the first equation for lambda:
yz = 2(lambda)x
yz / (2x) = lambda
Back-substituting into the second and third equations:
xz = (yz / (2x))(2y)
y^2 = 1
and
xy = (yz / (2x))(2z)
z^2 = 1
Now subbing into the constraint:
x^2 + (1) + (1) = 1
x^2 = -1
This is where I get stuck.
I want to take square roots and substitute the results back into f(x,y,z). That works for y and z, but not for x, because I'd have to take the square root of a negative number.
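As a further sanity check (again only a rough SymPy sketch, assuming SymPy is available), I tried handing the full Lagrange system plus the constraint to a solver, just to see whether real critical points exist at all:

    import sympy as sp

    x, y, z, lam = sp.symbols('x y z lam', real=True)

    # The three Lagrange equations and the constraint.
    eqs = [
        sp.Eq(y*z, 2*lam*x),
        sp.Eq(x*z, 2*lam*y),
        sp.Eq(x*y, 2*lam*z),
        sp.Eq(x**2 + y**2 + z**2, 1),
    ]

    # Print each real critical point together with the value of f = x*y*z there.
    for s in sp.solve(eqs, [x, y, z, lam], dict=True):
        print(s, '  f =', sp.simplify(s[x]*s[y]*s[z]))

If that does return real points, then the problem has a solution and the slip must be somewhere in my algebra above.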
What am I doing wrong?