convex-optimization

Step size selection for gradient descent without access to function evaluations


Is there a way to select the step size for gradient descent when you only have access to gradient evaluations, but not function evaluations?

I know the function to be minimized is convex, and at a given point x, I have access to f'(x), but not f(x). Can I do anything other than a fixed step size rule for gradient descent in this case?


Solution

  • You may try to estimate the Lipschitz constant of the gradient in a local area: for two points x and y, L ≈ ||f'(x) − f'(y)|| / ||x − y||, which requires only gradient evaluations. For an L-smooth convex function, gradient descent converges with any fixed step size in (0, 2/L), so the estimate gives you a bound on the step size; the classical choice is 1/L.
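
A minimal sketch of this idea, assuming a quadratic test problem chosen for illustration (the function name, the small first step, and the running-max safeguard are all my choices, not part of the question): the Lipschitz constant of the gradient is estimated from successive gradient differences, and the step size is set to 1/L.

```python
import numpy as np

def gd_gradient_only(grad, x0, n_iters=100, tol=1e-10, eps=1e-12):
    """Gradient descent whose step size comes from a running local
    Lipschitz estimate L_k = ||g_k - g_{k-1}|| / ||x_k - x_{k-1}||,
    using only gradient evaluations (f(x) is never needed)."""
    x_prev = np.asarray(x0, dtype=float)
    g_prev = grad(x_prev)
    x = x_prev - 1e-4 * g_prev   # tiny first step: no curvature info yet
    L_max = 0.0
    for _ in range(n_iters):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # gradient-norm stopping test
            break
        s = x - x_prev                # change in iterate
        y = g - g_prev                # change in gradient
        # Local Lipschitz estimate of the gradient; keep a running max
        # so the step size only becomes more conservative over time.
        L_max = max(L_max, np.linalg.norm(y) / max(np.linalg.norm(s), eps))
        step = 1.0 / max(L_max, eps)  # classical 1/L step size
        x_prev, g_prev = x, g
        x = x - step * g
    return x

# Hypothetical example: minimize the strongly convex quadratic
# f(x) = 0.5 x^T A x - b^T x, whose gradient is A x - b.
A = np.array([[3.0, 0.5],
              [0.5, 1.0]])
b = np.array([1.0, -2.0])
x_star = gd_gradient_only(lambda x: A @ x - b, np.zeros(2))
# x_star should approach the solution of A x = b
```

Note that early iterations may overshoot while the estimate of L is still too small; the running max makes the step size self-correcting, since an overshoot along a high-curvature direction immediately drives the estimate up.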