python, scipy, scipy-optimize, nonlinear-optimization, scipy-optimize-minimize

NonlinearConstraint in SciPy Optimize not working with vector bounds?


I asked ChatGPT and it wasn't helpful. Basically I want to implement a constraint on a transformed vector of X, say T(X). How do I do that? Here is my code:

def TransformFunction(X):
    return Ta(X) 

def LowerBound(X):   
    return Tb(X) * 0.05

def UpperBound(X):
    return Tb(X) * 0.3

Where Ta(X) and Tb(X) are different transformations of the vector X, which is the output of the optimization. When I try to do:

Constraint = NonlinearConstraint(
    fun = TransformFunction
    , lb = LowerBound
    , ub = UpperBound
    )

I get the error: float() argument must be a string or a real number, not 'function'

ChatGPT told me this should work. Basically I need the upper and lower bounds to be able to vary with the solution. Any solutions?


Solution

  • Per the NonlinearConstraint docs, lb and ub must be numbers or arrays of fixed values; passing a callable for the bounds is not supported, which is why float() complains about receiving a function.

    As an alternative, consider dividing Tb(X) out of the constraint so that the bounds become constants.

    For example, your existing code is trying to do this:

    Tb(X) * 0.05 <= Ta(X) <= Tb(X) * 0.3
    

    If you divide every part of the inequality by Tb(X), you get this:

    0.05 <= Ta(X) / Tb(X) <= 0.3
    

    (Note: this assumes that Tb(X) is positive for every component. If Tb(X) is negative, dividing by it flips the inequality, so the bounds swap. If Tb(X) can be negative or positive depending on X, you'll need some other approach.)
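
    As a minimal sketch of how the rewritten constraint could be set up (Ta and Tb below are made-up placeholders standing in for your real transformations):

    import numpy as np
    from scipy.optimize import NonlinearConstraint

    def Ta(X):
        # placeholder transformation -- replace with your real one
        return np.asarray(X) ** 2

    def Tb(X):
        # placeholder transformation, positive for all X in this sketch
        return np.abs(np.asarray(X)) + 1.0

    def RatioFunction(X):
        # constrain the ratio so the bounds can be plain numbers
        return Ta(X) / Tb(X)

    Constraint = NonlinearConstraint(fun=RatioFunction, lb=0.05, ub=0.3)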

    You may also want to add a small epsilon to the denominator to avoid a division by zero. Example:

    0.05 <= Ta(X) / (Tb(X) + 1e-6) <= 0.3
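
    For completeness, here is a hedged end-to-end sketch with the epsilon guard in place. The objective, the starting point, and the placeholder Ta and Tb are all invented just to make the example runnable; swap in your own:

    import numpy as np
    from scipy.optimize import NonlinearConstraint, minimize

    EPS = 1e-6  # small offset in the denominator to guard against division by zero

    def Ta(X):
        # placeholder transformation -- replace with your real one
        return np.asarray(X) ** 2

    def Tb(X):
        # placeholder transformation, kept positive so the bound order stays valid
        return np.abs(np.asarray(X)) + 1.0

    def RatioFunction(X):
        # ratio constraint with constant bounds
        return Ta(X) / (Tb(X) + EPS)

    Constraint = NonlinearConstraint(fun=RatioFunction, lb=0.05, ub=0.3)

    def Objective(X):
        # made-up objective purely for illustration
        return np.sum((np.asarray(X) - 1.0) ** 2)

    result = minimize(Objective, x0=np.array([0.5, 0.5]),
                      method="trust-constr", constraints=[Constraint])
    print(result.x)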