When I take the x reported by the optimization and plug it back into my function optimizee, I don't get the objective value the optimizer reports (-749.260). Instead I get -637.65, which is not as good...
I'm using SciPy's Nelder-Mead to optimize a nasty function I have put together:
from scipy.optimize import minimize

# x0 is my initial guess (defined elsewhere)
result = minimize(optimizee, x0, method='Nelder-Mead', bounds=(
    (0, 250),
    (0, 250),
    (0, 1),
    (0, 1),
    (0, 1),
    (0, 1),
    (0, 1),
    (0, 1),
    (0, 1),
    (0, 1),
    (0, 1),
    (0, 1),
), options={'xatol': 1e-2, 'maxiter': 50000})
By raising maxiter, I get it to converge and return successfully:
message: Optimization terminated successfully.
success: True
status: 0
fun: -749.2601549652912
x: [ 5.536e-03 2.500e+02 6.156e-04 0.000e+00 1.033e-04
2.533e-06 1.000e+00 5.640e-01 8.739e-01 3.828e-03
9.999e-01 1.000e+00]
nit: 1811
nfev: 2589
final_simplex: (array([[ 5.536e-03, 2.500e+02, ..., 9.999e-01,
1.000e+00],
[ 5.557e-03, 2.500e+02, ..., 9.999e-01,
1.000e+00],
...,
[ 5.788e-03, 2.500e+02, ..., 9.999e-01,
1.000e+00],
[ 5.782e-03, 2.500e+02, ..., 1.000e+00,
1.000e+00]], shape=(13, 12)), array([-7.493e+02, -7.493e+02, ..., -7.493e+02, -7.493e+02],
shape=(13,)))
Test your optimization by passing the result directly to the function, i.e. optimizee(result.x), rather than by copying the values of result.x printed to the console into a new function call.
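For example, a minimal check, assuming result is the OptimizeResult returned by the minimize call above and optimizee is the same objective that was passed to it:

recomputed = optimizee(result.x)   # result.x is the full-precision float64 vector
print(result.fun, recomputed)      # these should agree to within floating-point noise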
When the values of result.x are printed to the console, they are shown with only a few significant digits, and if your function is very sensitive to its inputs, re-evaluating it with those rounded values can give an unexpectedly different result, as described above.
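If you do need to copy values out, print them at full precision first. A sketch using NumPy's standard printing tools (np.set_printoptions and ndarray.tolist), again assuming result is the object from the question:

import numpy as np

np.set_printoptions(precision=17)  # show enough digits to round-trip float64
print(result.x)
print(result.x.tolist())           # Python floats print with full round-trip precision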