I have a function in Python:
def f(x):
    return x[0]**3 + x[1]**2 + 7
# Actually more than this.
# No analytical expression
It's a scalar-valued function of a vector.
How can I numerically approximate the Jacobian and Hessian of this function in numpy or scipy?
(Updated in late 2017 because there have been a lot of developments in this space.)
Your best bet is probably automatic differentiation. There are now many packages for this, because it's the standard approach in deep learning.
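For instance, with the autograd package (just one of the available options), a minimal sketch on the toy function from the question:

import autograd.numpy as np   # drop-in numpy wrapper that records operations
from autograd import grad, hessian

def f(x):
    return x[0]**3 + x[1]**2 + 7

df = grad(f)        # gradient, i.e. the Jacobian of a scalar-valued function
ddf = hessian(f)

x = np.array([1.0, 2.0])
print(df(x))   # [3. 4.]       (exact up to floating point: [3*x0**2, 2*x1])
print(ddf(x))  # [[6. 0.]
               #  [0. 2.]]

Because the derivatives are computed by differentiating the actual operations in f, they are exact up to floating point rather than approximations.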
Another option is to approximate it with finite differences, basically just evaluating (f(x + eps) - f(x - eps)) / (2 * eps) along each coordinate (but obviously with more effort put into it than that). This will probably be slower and less accurate than the other approaches, especially in moderately high dimensions, but it is fully general and requires no changes to your code. numdifftools seems to be the standard Python package for this.
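A minimal sketch with numdifftools (its Gradient and Hessian classes wrap exactly this kind of scheme):

import numpy as np
import numdifftools as nd

def f(x):
    return x[0]**3 + x[1]**2 + 7

df = nd.Gradient(f)   # central differences by default
ddf = nd.Hessian(f)

x = np.array([1.0, 2.0])
print(df(x))   # approximately [3. 4.]
print(ddf(x))  # approximately [[6. 0.], [0. 2.]]

Note that a finite-difference Hessian needs on the order of n**2 function evaluations for an n-dimensional input, which is where the cost in higher dimensions comes from.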
You could also attempt to find fully symbolic derivatives with SymPy, but this will be a relatively manual process.
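For completeness, a sketch on the toy function above (this presumes you can write the full expression symbolically, which the question says you can't):

import sympy as sp

x0, x1 = sp.symbols('x0 x1')
expr = x0**3 + x1**2 + 7

grad = [sp.diff(expr, v) for v in (x0, x1)]   # [3*x0**2, 2*x1]
hess = sp.hessian(expr, (x0, x1))             # Matrix([[6*x0, 0], [0, 2]])

# lambdify turns the symbolic results into fast numeric functions
df = sp.lambdify((x0, x1), grad)
print(df(1.0, 2.0))   # [3.0, 4.0]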