python · optimization · scipy

Why can't "if" be used in a scipy.optimize inequality constraint?


Consider a simple problem using scipy.optimize: maximize xy subject to x^2 + y^2 <= 200. The following code works:

import numpy as np
from scipy.optimize import minimize

def objective(var_tmp):
    x, y = var_tmp
    return -x * y  # minimize the negative of xy to maximize xy

def constraint(var_tmp):
    x, y = var_tmp
    return 200 - (x ** 2 + y ** 2)  # non-negative when the constraint holds

initial_guess = [1, 1]
constraints = {'type': 'ineq', 'fun': constraint}
result = minimize(objective, initial_guess, constraints=constraints)
optimal_x, optimal_y = result.x
optimal_value = -result.fun
print(f"Optimal x: {optimal_x}")
print(f"Optimal y: {optimal_y}")
print(f"Maximum xy value: {optimal_value}")

This gives the correct answer, 100.
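(As an aside, the problem can also be stated with an equality constraint, x^2 + y^2 = 200; since the maximum lies on the boundary of the disk, it gives the same answer. A minimal sketch using a `'type': 'eq'` constraint instead:)

```python
import numpy as np
from scipy.optimize import minimize

def objective(var_tmp):
    x, y = var_tmp
    return -x * y  # minimize the negative of xy to maximize xy

def eq_constraint(var_tmp):
    x, y = var_tmp
    return (x ** 2 + y ** 2) - 200  # must be exactly zero at the solution

result = minimize(objective, [1, 1],
                  constraints={'type': 'eq', 'fun': eq_constraint})
print(-result.fun)  # ~100, at x = y = 10
```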

However, if the constraint is written as follows:

def constraint(var_tmp):
    x, y = var_tmp
    if x ** 2 + y ** 2 <= 200:
        return 1
    return -1

the optimizer diverges to infinity. Why is this the case?


Solution

  • By default, SciPy uses SLSQP to minimize a problem that has constraints. (Several other minimizers also support constraints; see the "Constrained Minimization" section of the minimize() documentation.)

    SLSQP requires that its constraints be differentiable.

    Here is a passage from the SLSQP paper showing this. In this context, f is the function you are minimizing, and g is your equality and inequality constraints.

    ...where the problem functions f : R^n -> R^1 and g : R^n -> R^m are assumed to be continuously differentiable and have no specific structure

    Source: Kraft D (1988), A software package for sequential quadratic programming. Tech. Rep. DFVLR-FB 88-28, DLR German Aerospace Center — Institute for Flight Mechanics, Köln, Germany. Page 8, section 2.1.1.

    (I mentioned before that SciPy has multiple minimize methods which can handle constraints. However, I checked, and none of them seem to be able to handle non-differentiable constraints.)
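To see concretely why the if-based constraint defeats SLSQP: the solver estimates constraint gradients numerically (by finite differences when no analytic Jacobian is given). A step function is flat almost everywhere, so its estimated gradient is zero; the solver therefore sees a constant constraint that never restricts the search direction, and the unbounded objective -xy is driven off toward infinity. A quick sketch illustrating this (the `fd_gradient` helper here is my own illustration, not a SciPy function):

```python
import numpy as np

def step_constraint(var):
    x, y = var
    return 1.0 if x ** 2 + y ** 2 <= 200 else -1.0

def smooth_constraint(var):
    x, y = var
    return 200 - (x ** 2 + y ** 2)

def fd_gradient(f, point, eps=1e-8):
    """Forward-difference gradient estimate, similar in spirit
    to what a gradient-based solver computes internally."""
    point = np.asarray(point, dtype=float)
    grad = np.zeros_like(point)
    for i in range(len(point)):
        step = np.zeros_like(point)
        step[i] = eps
        grad[i] = (f(point + step) - f(point)) / eps
    return grad

p0 = np.array([1.0, 1.0])
print(fd_gradient(step_constraint, p0))    # [0. 0.] -- flat, carries no information
print(fd_gradient(smooth_constraint, p0))  # approximately [-2. -2.]
```

With a zero gradient, the step constraint contributes nothing to SLSQP's quadratic subproblem, which is why the smooth formulation `200 - (x**2 + y**2)` is required.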