Projected Gradient Descent: Minimize (1/2)x_1^2 + (1/2)x_2^2 subject to 2 - x_1 - x_2 = 0

HD5450

[Attachment: screenshot of the problem statement]

Parts (a) and (b) are pretty easy, but for part (c) I don't know how to start the problem. I tried to look online for examples, but all I found were general equations.

Any help would be appreciated.

Edit: Projected gradient descent is defined as $\displaystyle x_{k+1} = \prod_X (x_k - \tau_k \nabla f(x_k))$, where $\displaystyle \prod_X(x)$ is the orthogonal projection of $x$ onto $X$ and $\displaystyle \tau_k$ is the step size. I have also attempted to run the first iteration, but I am stuck trying to do the projection: I don't know how to compute $\displaystyle \prod_X((1, -1)^T)$.
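One standard way to get that projection: here $X = \{x \in \mathbb{R}^2 : x_1 + x_2 = 2\}$ is a hyperplane $\{x : a^T x = b\}$ with $a = (1, 1)^T$ and $b = 2$, and orthogonal projection onto a hyperplane has the closed form

$\displaystyle \prod_X(y) = y - \frac{a^T y - b}{a^T a}\, a = y - \frac{y_1 + y_2 - 2}{2} \begin{pmatrix} 1 \\ 1 \end{pmatrix}.$

Plugging in $y = (1, -1)^T$ gives $a^T y - b = 1 - 1 - 2 = -2$, so $\displaystyle \prod_X((1, -1)^T) = (1, -1)^T + (1, 1)^T = (2, 0)^T$.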
 


Can you please define "gradient projection" method for us?
 
Sorry, I forgot to include the formula.

The formula that I have for this is $\displaystyle x_{k+1} = \prod_X(x_k - \tau_k \nabla f(x_k))$, where $\displaystyle \prod_X$ is the orthogonal projection onto $X$ and $\displaystyle \tau_k$ is the step size.
 
Gradient projection is defined as $\displaystyle x_{k+1} = \prod_X (x_k - \tau_k \nabla f(x_k))$.
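For completeness, here is a minimal Python sketch of the full iteration, using the closed-form hyperplane projection worked out above. The starting point x0 and the constant step size tau below are illustrative choices, not values given in the problem:

```python
import numpy as np

def grad_f(x):
    # f(x) = (1/2)x1^2 + (1/2)x2^2, so grad f(x) = x
    return x

def proj_X(y, a=np.array([1.0, 1.0]), b=2.0):
    # Orthogonal projection onto the hyperplane {x : a^T x = b}
    return y - ((a @ y - b) / (a @ a)) * a

x = np.array([2.0, -2.0])  # hypothetical starting point (not from the problem)
tau = 0.5                  # hypothetical constant step size

for k in range(20):
    # Gradient step, then project back onto the constraint set
    x = proj_X(x - tau * grad_f(x))

print(x)  # approaches (1, 1), the constrained minimizer
```

With these illustrative choices the first gradient step gives $(1, -1)^T$, whose projection is $(2, 0)^T$ as computed above, and the iterates converge to the constrained minimizer $(1, 1)^T$.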
 