Non-convex optimization

combat1818
New member · Joined Apr 10, 2023 · Messages: 1
Hi, I've been struggling with this task and would really appreciate some help:

We are given a differentiable, L-smooth, non-convex function $f: \mathbb{R}^d \to \mathbb{R}$ with a global minimum $x^{*}$. We use a version of gradient descent of the form
$$x_{t+1} = x_t - \eta_t \frac{\nabla f(x_t)}{\|\nabla f(x_t)\| + \beta_t},$$
where $\eta_t, \beta_t > 0$. We want to find $\eta_t, \beta_t$ which do not depend on the parameter $L$ and guarantee
$$\frac{1}{T} \sum_{t=0}^{T-1} \|\nabla f(x_t)\| = \tilde{O}\left(T^{-1/2}\right),$$
where the $\tilde{O}$ notation ignores logarithmic factors.
Hello. Please share your beginning work. Thank you!
