python · scipy · mathematical-optimization · non-convex

LBFGS for non-convex objective function


I am using SciPy's implementation of L-BFGS to minimize a non-convex objective function. The result is not too bad, but the convergence status is "ABNORMAL_TERMINATION_IN_LNSRCH".

Is it possible that this is because my objective function is non-convex? Or could it mean that my gradients (calculated analytically by hand and passed as an argument to SciPy's L-BFGS) are wrong?
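For context, here is a minimal sketch of the setup being described; the toy objective `f` and gradient `grad_f` below are stand-ins, since the actual function is not shown in the question. `scipy.optimize.check_grad` compares an analytic gradient against finite differences, which is one way to rule out the "wrong gradients" possibility:

```python
import numpy as np
from scipy.optimize import minimize, check_grad

def f(x):
    # stand-in non-convex objective: a "double well" per coordinate
    return np.sum(x**4 - 3 * x**2 + x)

def grad_f(x):
    # analytic gradient, passed to the optimizer via `jac`
    return 4 * x**3 - 6 * x + 1

x0 = np.random.default_rng(0).normal(size=5)

# A large value here would point to a bug in grad_f rather than non-convexity.
print("gradient error:", check_grad(f, grad_f, x0))

res = minimize(f, x0, jac=grad_f, method="L-BFGS-B")
print(res.status, res.message)
```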


Solution

  • It's fine. Strictly speaking, L-BFGS and gradient descent are convex optimization methods: their convergence guarantees assume the objective is smooth and has a single global minimum. A non-convex function has a more varied landscape, with what we call local minima. So when you apply a convex optimization method to a non-convex function, the optimization procedure can settle into a local minimum that is not the global optimum, as the sketch below illustrates.
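A small illustration of this point, again using a hypothetical double-well objective (not the questioner's actual function): L-BFGS-B converges to whichever local minimum the starting point leads it into.

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    # double well: two distinct local minima, only one of them global
    return x[0]**4 - 3 * x[0]**2 + x[0]

def grad_f(x):
    return np.array([4 * x[0]**3 - 6 * x[0] + 1])

# Different starting points land in different minima with different values.
for x0 in ([-2.0], [2.0]):
    res = minimize(f, np.array(x0), jac=grad_f, method="L-BFGS-B")
    print(f"start {x0[0]:+.1f} -> x* = {res.x[0]:+.4f}, f(x*) = {res.fun:.4f}")
```

Starting from -2.0 the run finds the deeper well near x ≈ -1.30, while starting from +2.0 it stops in the shallower well near x ≈ 1.13; neither run "fails", each simply reports the local minimum it reached.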