python, scipy, mathematical-optimization, meep

Optimize a function in scipy without explicitly defining the gradient


I'm currently trying to optimize a function using scipy. I have some constraints on the variables, and from this link: http://docs.scipy.org/doc/scipy-0.14.0/reference/tutorial/optimize.html, it looks like SLSQP is exactly what I want. In their example they have a well-defined explicit formula for the result in terms of the input, from which they compute the gradient. My function, on the other hand, is disgustingly computationally intensive — it calculates how electromagnetic fields bounce off metal walls — and cannot by any means be expressed in closed form (I'm using the MEEP FDTD Python simulation, if you're interested).

Is there an equivalent function built into scipy that finds the gradient of a function for you and then optimizes? Or, equivalently, is there a function built into scipy (any basic Python library would be fine) that would find the gradient of a function for me, which I could then pass into this optimization routine? Any suggestions would be greatly appreciated.
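
Roughly, the call I have in mind looks like the sketch below; the objective here is just a dummy stand-in for the real MEEP run, and the bounds/constraints are made up for illustration:

    import numpy as np
    from scipy.optimize import minimize

    def objective(x):
        # Dummy stand-in: the real objective runs a full FDTD simulation
        # and returns a scalar figure of merit for the design vector x.
        return np.sum((x - 0.3) ** 2)

    x0 = np.full(3, 0.5)
    bounds = [(0.0, 1.0)] * 3
    constraints = [{'type': 'ineq', 'fun': lambda x: 1.0 - np.sum(x)}]  # sum(x) <= 1

    # No jac= is passed, because I can't write down the gradient of the simulation.
    res = minimize(objective, x0, method='SLSQP',
                   bounds=bounds, constraints=constraints)
    print(res.x, res.fun)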


Solution

  • Since you cannot easily compute the gradient, it might pay off to use a gradient-free (derivative-free) optimization algorithm. Here's an overview of some of the methods available in SciPy (a constrained, gradient-free example is sketched after this list):

    http://scipy-lectures.github.io/advanced/mathematical_optimization/#gradient-less-methods

    There's also the basin hopping algorithm, which is similar in spirit to simulated annealing and isn't mentioned on that page (see the second sketch below):

    http://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.basinhopping.html
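
For example, COBYLA handles inequality constraints without needing any gradient information. A minimal sketch, with a made-up objective and constraints standing in for the real simulation:

    import numpy as np
    from scipy.optimize import minimize

    def objective(x):
        # Placeholder for the expensive electromagnetic simulation.
        return np.sum((x - 0.3) ** 2)

    # COBYLA only understands inequality constraints (fun(x) >= 0),
    # so simple bounds are expressed the same way.
    constraints = [
        {'type': 'ineq', 'fun': lambda x: 1.0 - np.sum(x)},  # sum(x) <= 1
        {'type': 'ineq', 'fun': lambda x: x},                 # x >= 0 (elementwise)
    ]

    x0 = np.full(3, 0.4)
    res = minimize(objective, x0, method='COBYLA',
                   constraints=constraints, options={'maxiter': 200})
    print(res.x, res.fun)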
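
And a sketch of basinhopping driving the same gradient-free local solver; each hop perturbs the current point and re-runs the local minimization, which is what gives it a chance of escaping local minima (again, the objective is a placeholder):

    import numpy as np
    from scipy.optimize import basinhopping

    def objective(x):
        # Placeholder for the expensive electromagnetic simulation.
        return np.sum((x - 0.3) ** 2) + 0.1 * np.sin(10 * x).sum()

    # Constraints are enforced by the local minimizer, not by the hops themselves.
    minimizer_kwargs = {
        'method': 'COBYLA',
        'constraints': [{'type': 'ineq', 'fun': lambda x: 1.0 - np.sum(x)}],
    }

    x0 = np.full(3, 0.4)
    res = basinhopping(objective, x0, niter=25, stepsize=0.2,
                       minimizer_kwargs=minimizer_kwargs)
    print(res.x, res.fun)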