Constrained Minimization: Prove Local min => Global min

kingwinner

[Attachment: optim4.JPG — the problem statement, per the note and proof below: minimize f(x) = (1/2) x·Qx − b·x subject to Ax = c, and show that a local minimum point is a global minimum point.]

Note: A and Q are matrices and x is a point in R^d.

Attempted Proof:
("o" below means dot product)
Let M={x|Ax=c} and f(x)=(1/2)x o Qx - b o x
Suppose x0 is a local minimum point of f on M.
I will use proof by contradiction. Suppose, on the contrary, that x0 is NOT a global minimum point. Then there must exist a feasible point x1 ∈ M with f(x1) < f(x0).

Let x(s) = (1 − s) x0 + s x1, for any real s. (x: R → R^d; this is a parametric equation of the straight line through x0 and x1.)
Note that x(0) = x0 and x(1) = x1.
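
One step worth making explicit, since it is what lets the constraint be dropped along this line: M is an affine set, so the whole line is feasible whenever its endpoints are. Using Ax0 = Ax1 = c,

\[
A\,x(s) = (1-s)\,Ax_0 + s\,Ax_1 = (1-s)\,c + s\,c = c,
\]

so x(s) ∈ M for every s ∈ R.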

Let φ: R → R be defined by φ(s) = f(x(s)). Since x(s) ∈ M for every s and x(s) → x0 as s → 0, the local minimality of x0 on M means that φ has a local min at s = 0.
⇒ φ'(0) = 0 and φ''(0) ≥ 0 (the first- and second-order necessary conditions for a local min)

By the chain rule, 0 = φ'(0) = x1·Qx0 − x0·Qx0 − b·(x1 − x0),
and 0 ≤ φ''(0) = x1·Qx1 − x1·Qx0 − x0·Qx1 + x0·Qx0.
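
As a check on these formulas (they implicitly assume Q is symmetric, so that u·Qv = v·Qu): since f is quadratic and x(s) is affine in s, φ is exactly a quadratic polynomial in s, with no higher-order remainder. Writing d = x1 − x0,

\[
\varphi(s) = \tfrac{1}{2}(x_0 + s\,d)\cdot Q\,(x_0 + s\,d) - b\cdot(x_0 + s\,d)
           = \varphi(0) + s\,\bigl(d\cdot Qx_0 - b\cdot d\bigr) + \tfrac{s^{2}}{2}\,d\cdot Qd,
\]

which reproduces exactly the φ'(0) and φ''(0) computed above.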

And this is where I'm stuck: how can I arrive at a contradiction from here? I tried adding them to get φ'(0) + φ''(0) ≥ 0, but that doesn't seem to help; it's a mess and doesn't simplify to f(x1) − f(x0).
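
For what it's worth, the combination that does simplify to f(x1) − f(x0) appears to be φ'(0) + (1/2)φ''(0) rather than φ'(0) + φ''(0): since φ is an exact quadratic, its second-order Taylor expansion at s = 0 has no remainder, so evaluating at s = 1 gives

\[
f(x_1) = \varphi(1) = \varphi(0) + \varphi'(0) + \tfrac{1}{2}\,\varphi''(0) \;\geq\; \varphi(0) = f(x_0),
\]

using φ'(0) = 0 and φ''(0) ≥ 0, which would contradict f(x1) < f(x0).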

Any help would be very much appreciated!:)
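
As a quick numerical sanity check of the identity f(x1) − f(x0) = φ'(0) + (1/2)φ''(0), here is a small standalone sketch; the matrix sizes and all the data below (Q, A, b, x0, the step along the null space of A) are made-up placeholders, not part of the original problem. Both checks should print True.

import numpy as np

# Hypothetical data, chosen only to exercise the identity above.
rng = np.random.default_rng(0)
d = 4
Q = rng.standard_normal((d, d))
Q = Q + Q.T                          # symmetric, as the chain-rule formulas assume
b = rng.standard_normal(d)
A = rng.standard_normal((2, d))      # constraint matrix for M = {x : Ax = c}

def f(x):
    return 0.5 * x @ Q @ x - b @ x   # f(x) = (1/2) x·Qx − b·x

x0 = rng.standard_normal(d)
c = A @ x0                           # choose c so that x0 is feasible
null_dir = np.linalg.svd(A)[2][-1]   # for this 2x4 A (rank 2), the last
                                     # right-singular vector lies in null(A)
x1 = x0 + 1.7 * null_dir             # second feasible point: A x1 = c

dvec = x1 - x0
phi_p0 = dvec @ Q @ x0 - b @ dvec    # φ'(0)
phi_pp0 = dvec @ Q @ dvec            # φ''(0)

print(np.allclose(A @ x1, c))                             # feasibility of x1
print(np.isclose(f(x1) - f(x0), phi_p0 + 0.5 * phi_pp0))  # the exact identity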
 

Edit: Nevermind. I didn't read your problem carefully enough. I don't know. I would need to think about it more.
 