python, optimization, scipy, linear-programming

Python linprog to maximise objective function


It has been a while since I have done this, so I am a bit rusty, but the problem is:

max t(C)*x
s.t. Ax <= b

And I have my A matrix of constraints, which is (1448x1359):

[[ 1.  1.  0. ...,  0.  0.  0.]
 ...,
 [ 0.  0.  0. ...,  1.  1.  1.]]

Then I have my bounds vector b (1448x1):

[ 1.  1.  7. ...,  2.  1.  2.]

And my objective function to be maximised, which is a vector of ones (1359x1).

Now, in other packages my maximised objective value is 841; however, using linprog:

res = linprog(c=OBJ_N, A_ub=A, b_ub=b, options={"disp": True})

It optimises successfully to -0.0, so I wonder whether I'm using the right command in Python and have my constraints the right way around?

Edit: OK, that makes sense: it was trying to minimise. I have now rewritten it (swapped c and b and transposed A) so that it minimises instead.

# duality: (max t(C)*x s.t. Ax <= b) = (min t(b)*y s.t. t(A)*y = C, y >= 0)
# (i): minimise number of shops, no bounds
import numpy as np
from scipy.optimize import linprog

# vector of ones, one entry per column of A
ID = np.ones(len(w[0]))
print(ID)
print(ID.shape)  # (1359,)

At = A.transpose()

# A.dot(ones) - 1, used as the objective coefficients of the rewritten problem
need_divest = (A.dot(ID)) - 1
print(need_divest)
print(need_divest.shape)  # (1448,)

# attempted dual with equality constraints t(A)*y = ones
res = linprog(c=need_divest, A_eq=At, b_eq=ID, options={"disp": True})
print(res)

However, I get "message: 'Optimization failed. Unable to find a feasible starting point.'"
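For reference, linprog's default bounds are x >= 0, and the dual of (max t(C)*x s.t. Ax <= b, x >= 0) has inequality constraints: min t(b)*y s.t. t(A)*y >= C, y >= 0. A minimal sketch of that version in the <= form linprog expects, with small made-up A, b and C standing in for the real data:

import numpy as np
from scipy.optimize import linprog

A = np.array([[1., 1., 0.],
              [0., 1., 1.]])    # toy stand-in for the (1448x1359) matrix
b = np.array([4., 6.])
C = np.ones(3)

# linprog only takes <= rows, so t(A)*y >= C is passed as -t(A)*y <= -C
dual = linprog(c=b, A_ub=-A.T, b_ub=-C, options={"disp": True})
print(dual.fun)                 # equals the primal maximum by strong duality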


Solution

  • I guess you are probably minimizing instead of maximizing your objective function. Try this (inserting a - in front of your objective function coefficients):

    res = linprog(c=-OBJ_N, A_ub=A, b_ub=b, options={"disp": True})
    

    Your result should then be -841; negate it to recover the maximum of 841.

    This works simply because:

    min(f(x)) = -max(-f(x))
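
    As a minimal, self-contained sketch of the whole round trip (small made-up A, b and OBJ_N stand in for your real data), the value linprog reports is the minimum of the negated objective, so flip its sign to recover the maximum:

    import numpy as np
    from scipy.optimize import linprog

    A = np.array([[1., 1., 0.],
                  [0., 1., 1.]])
    b = np.array([4., 6.])
    OBJ_N = np.ones(3)              # maximise the sum of the x's

    res = linprog(c=-OBJ_N, A_ub=A, b_ub=b, options={"disp": True})
    print(res.fun)                  # minimum of -OBJ_N . x, which is -10.0 here
    print(-res.fun)                 # flip the sign to get the maximum, 10.0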