python, scipy, linear-programming

Efficiently get shadow prices in scipy linprog


I have a large linprog problem with almost 1,000 variables and constraints. I can solve it with scipy.optimize.linprog(method='simplex'), but I also need the shadow prices (opportunity costs) for ~100 of the inequality constraints.

I can compute them by adding 1 to the right-hand side of an inequality and re-solving the problem. The shadow price is then the difference between the two objective values: shadow_price_i = f_max_original - f_max_i. Repeating this 100 times works, but it's painfully slow (about an hour).
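For reference, here's a minimal sketch of what I'm doing now, on a tiny toy problem instead of my real ~1k-variable data:

    import numpy as np
    import scipy.optimize as opt

    # Toy stand-in for my actual problem data
    c = np.array([3.0, 2.0])                 # maximize c @ x
    A = np.array([[1.0, 1.0],
                  [2.0, 1.0]])
    b = np.array([4.0, 5.0])                 # A @ x <= b

    base = opt.linprog(-c, A_ub=A, b_ub=b, method='simplex')

    shadow_prices = {}
    for i in range(len(b)):                  # ~100 constraints in the real problem
        b_relaxed = b.copy()
        b_relaxed[i] += 1.0                  # add 1 to the right side of inequality i
        res_i = opt.linprog(-c, A_ub=A, b_ub=b_relaxed, method='simplex')
        # .fun is the minimized -c @ x, so this difference is the gain in the maximum
        shadow_prices[i] = base.fun - res_i.fun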

Is there something I can do to obtain shadow prices quicker? Maybe some trick or functionality I'm missing...


Solution

  • Solve the dual problem; its optimal solution gives you all the shadow prices with just one extra call to linprog. Here is an example for a standard LP problem:

    import scipy.optimize as opt
    import numpy as np
    
    c = np.array([400, 200, 250])     # objective coefficients (maximize c @ x)
    b = np.array([1000, 300, 625])    # right-hand sides of the inequalities
    A = np.array([[3, 1, 1.5],
                  [0.8, 0.2, 0.3],
                  [1, 1, 1]])          # constraint matrix (A @ x <= b)
    x1_bnds = (0, None)     # bounds on x1
    x2_bnds = (0, None)     # bounds on x2
    x3_bnds = (0, None)     # bounds on x3
    # Primal: maximize c @ x subject to A @ x <= b, x >= 0 (linprog minimizes, hence -c)
    result = opt.linprog(-c, A_ub=A, b_ub=b, bounds=(x1_bnds, x2_bnds, x3_bnds))
    
    # Dual of the problem above:  min b @ y  s.t.  A.T @ y >= c,  y >= 0.
    # In linprog's "<=" form the constraint becomes -A.T @ y <= -c.
    dual_c = b
    dual_b = -1.0 * c
    dual_A = -1.0 * np.transpose(A)
    y_bnds = [(0, None)] * 3          # one dual variable y_i >= 0 per primal constraint
    dual_result = opt.linprog(dual_c, A_ub=dual_A, b_ub=dual_b, bounds=y_bnds)
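
    The optimal dual variables are the shadow prices of the primal inequality constraints, so (assuming the dual solve succeeds) you can read them straight off the result, for example:

    shadow_prices = dual_result.x     # one shadow price per primal inequality
    print(shadow_prices)
    # For a feasible, bounded LP, dual_result.fun equals the primal maximum c @ x
    # (note the primal call above minimizes -c @ x, so its .fun is the negative of this).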