python, optimization, scipy, eval, differential-evolution

How to add several constraints to differential_evolution?


I have the same problem as in this question, but I want to add not just one but several constraints to the optimization problem.

So, for example, I want to maximize x1 + 5 * x2 with the constraints that the sum of x1 and x2 is smaller than 5 and that x2 is smaller than 3 (needless to say, the actual problem is far more complicated and cannot simply be thrown into scipy.optimize.minimize like this one; it just serves to illustrate the problem...).
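Just for reference, the toy version is linear, so its expected optimum can be checked directly, e.g. with scipy.optimize.linprog (a sketch purely for comparison, not applicable to my actual problem):

from scipy.optimize import linprog

# maximize x1 + 5*x2  ->  minimize -x1 - 5*x2
c = [-1., -5.]

# constraints: x1 + x2 <= 5  and  x2 <= 3
A_ub = [[1., 1.], [0., 1.]]
b_ub = [5., 3.]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0., 5.), (0., 5.)])
print(res.x, res.fun)  # roughly [2., 3.] and -17.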

I can do an ugly hack like this:

from scipy.optimize import differential_evolution
import numpy as np

def simple_test(x, more_constraints):

    # check whether all constraints evaluate to True
    if all(map(eval, more_constraints)):

        return -1 * (x[0] + 5 * x[1])

    # if not all constraints evaluate to True, return a positive number
    return 10

bounds = [(0., 5.), (0., 5.)]

additional_constraints = ['x[0] + x[1] <= 5.', 'x[1] <= 3']
result = differential_evolution(simple_test, bounds, args=(additional_constraints, ), tol=1e-6)
print(result.x, result.fun, sum(result.x))

This will print

[ 1.99999986  3.        ] -16.9999998396 4.99999985882

as one would expect.
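I suppose I could at least avoid eval by passing callables instead of strings, along the lines of this sketch (same penalty idea, the function name is just for illustration):

def simple_test_callables(x, more_constraints):

    # more_constraints is a list of callables, each returning True or False
    if all(constraint(x) for constraint in more_constraints):

        return -1 * (x[0] + 5 * x[1])

    # if not all constraints are satisfied, return a positive number
    return 10

additional_constraints = [lambda x: x[0] + x[1] <= 5., lambda x: x[1] <= 3]
result = differential_evolution(simple_test_callables, bounds, args=(additional_constraints, ), tol=1e-6)

But that still just returns an arbitrary penalty value for infeasible points instead of telling the optimizer about the constraints.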

Is there a better/more straightforward way to add several constraints than using the rather 'dangerous' eval?


Solution

  • There is a proper way to solve the problem described in the question, i.e. to enforce multiple nonlinear constraints with scipy.optimize.differential_evolution.

    The proper way is to use the scipy.optimize.NonlinearConstraint class.

    Below is a non-trivial example: optimizing the classic Rosenbrock function inside a region defined by the intersection of two circles.

    import numpy as np
    from scipy import optimize
    
    # Rosenbrock function
    def fun(x):
        return 100*(x[1] - x[0]**2)**2 + (1 - x[0])**2
    
    # Function defining the nonlinear constraints:
    # 1) x^2 + (y - 3)^2 < 4
    # 2) (x - 1)^2 + (y + 1)^2 < 13
    def constr_fun(x):
        r1 = x[0]**2 + (x[1] - 3)**2
        r2 = (x[0] - 1)**2 + (x[1] + 1)**2
        return r1, r2
    
    # No lower limit on constr_fun
    lb = [-np.inf, -np.inf]
    
    # Upper limit on constr_fun
    ub = [4, 13]
    
    # Bounds are irrelevant for this problem, but are needed
    # for differential_evolution to compute the starting points
    bounds = [[-2.2, 1.5], [-0.5, 2.2]]
    
    nlc = optimize.NonlinearConstraint(constr_fun, lb, ub)
    sol = optimize.differential_evolution(fun, bounds, constraints=nlc)
    
    # Accurate solution by Mathematica
    true = [1.174907377273171, 1.381484428610871]
    print(f"nfev = {sol.nfev}")
    print(f"x = {sol.x}")
    print(f"err = {sol.x - true}\n")
    

    This prints the following with default parameters:

    nfev = 636
    x = [1.17490808 1.38148613]
    err = [7.06260962e-07 1.70116282e-06]
    

    Here is a visualization of the function (contours) and the feasible region defined by the nonlinear constraints (shading inside the green line). The constrained global minimum is indicated by the yellow dot, while the magenta one shows the unconstrained global minimum.

    This constrained problem has an obvious local minimum at (x, y) ~ (-1.2, 1.4) on the boundary of the feasible region, which will make local optimizers fail to converge to the global minimum from many starting locations. However, differential_evolution consistently finds the global minimum, as expected.
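    Finally, applying the same approach to the toy problem from the question is straightforward. Here is a sketch (names such as neg_objective are mine); both constraints happen to be linear, so scipy.optimize.LinearConstraint would also work, but NonlinearConstraint keeps the pattern general:

    import numpy as np
    from scipy import optimize

    # maximize x1 + 5*x2  ->  minimize the negative
    def neg_objective(x):
        return -(x[0] + 5 * x[1])

    # constraint values: 1) x1 + x2   2) x2
    def constr_fun(x):
        return x[0] + x[1], x[1]

    # enforce x1 + x2 <= 5 and x2 <= 3 (no lower limits)
    nlc = optimize.NonlinearConstraint(constr_fun, [-np.inf, -np.inf], [5, 3])

    bounds = [(0., 5.), (0., 5.)]
    sol = optimize.differential_evolution(neg_objective, bounds, constraints=nlc, seed=1)
    print(sol.x, sol.fun)  # expected: roughly [2., 3.] and -17.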
