optimization, julia, mixed-integer-programming, julia-jump

Julia JuMP: Getting all feasible solutions to a MIP


Instead of only the optimal solution vector to a MIP, I would like to get all the feasible (suboptimal) vectors.
I found some old questions here, but I am not sure how they work.

First of all, is there any new library/tool to do that automatically?
I tried this, but it did nothing:

if termination_status(m) == MOI.FEASIBLE_POINT
    println(x)
end
optimize!(m);

If not, what's the easiest way?
I thought of scanning the optimal solution until I find the first non-zero decision variable, then constraining this variable to be zero and solving the model again.

for i in 1:active_variables
    if value(x[i]) == 1
        @constraint(m, x[i] == 0)
        break
    end
end

optimize!(m);

But I see a problem with this method:

  1. If I constrain x[i] to be zero, in the next step I may want to drop this constraint again. This comes down to whether there can exist two (or more) different solutions in which x[i] == 1.


Solution

  • The following is an example of an all-solutions finder for a boolean problem. Such problems are easier to handle, since the solution space is easily enumerated (even though it can still grow exponentially large).

    First, let's get the packages and define the sample problem:

    using Random, JuMP, HiGHS, MathOptInterface
    
    function example_knapsack()
        profit = [5, 3, 2, 7, 4]
        weight = [2, 8, 4, 2, 5]
        capacity = 10
        minprofit = 10
        model = Model(HiGHS.Optimizer)
        set_silent(model)
        @variable(model, x[1:5], Bin)
        @objective(model, FEASIBILITY_SENSE, 0)
        @constraint(model, weight' * x <= capacity)
        @constraint(model, profit' * x >= minprofit)
        return model
    end
    

    (It is a knapsack problem from the JuMP docs.)

    Next, we use recursion to explore the tree of all possible solutions. The tree does not go down branches with no solution (so the running time is not always exponential):

    function findallsol(model, x)
        perm = shuffle(1:length(x))   # random order in which to fix the variables
        res = Vector{Float64}[]
        _findallsol!(res, model, x, perm, 0)
        return res
    end
    
    function _findallsol!(res, model, x, perm, depth)
        n = length(x)
        depth > n && return
        optimize!(model)
        if termination_status(model) == MathOptInterface.OPTIMAL
            if depth == n
                # all variables are fixed: record this feasible point
                push!(res, value.(x))
                return
            else
                idx = perm[depth+1]
                v = value(x[idx])
                # branch 1: fix x[idx] to its current value
                newcon = @constraint(model, x[idx] == v)
                _findallsol!(res, model, x, perm, depth + 1)
                delete(model, newcon)
                # branch 2: fix x[idx] to the flipped value
                newcon = @constraint(model, x[idx] == 1 - v)
                _findallsol!(res, model, x, perm, depth + 1)
                delete(model, newcon)
            end
        end
        return
    end
    

    Now we can:

    julia> m = example_knapsack()
    A JuMP Model
    Feasibility problem with:
    Variables: 5
    ...
    Names registered in the model: x
    
    julia> res = findallsol(m, m.obj_dict[:x])
    5-element Vector{Vector{Float64}}:
     [1.0, 0.0, 0.0, 1.0, 1.0]
     [0.0, 0.0, 0.0, 1.0, 1.0]
     [1.0, 0.0, 1.0, 1.0, 0.0]
     [1.0, 0.0, 0.0, 1.0, 0.0]
     [0.0, 1.0, 0.0, 1.0, 0.0]
    

    And we get a vector with all the solutions.

    If the problem in question is a boolean problem, this method can be used as-is. If it has non-boolean variables, the recursion will have to split the feasible space in some even fashion: for example, by choosing a variable, cutting its domain in half, and recursing into each half with a smaller domain for that variable (to ensure termination).
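
    For bounded integer variables, such a split could look like the sketch below. This is not the answer's code: the names findallsol_int and _split! are made up, and it assumes the caller passes lo/hi vectors matching the variables' actual integer bounds.

    using JuMP, HiGHS, MathOptInterface
    
    # Sketch: enumerate all feasible integer points by repeatedly halving
    # variable domains with temporary interval constraints.
    function findallsol_int(model, x, lo, hi)
        res = Vector{Float64}[]
        _split!(res, model, x, copy(lo), copy(hi))
        return res
    end
    
    function _split!(res, model, x, lo, hi)
        optimize!(model)
        # prune branches that contain no feasible point
        termination_status(model) == MathOptInterface.OPTIMAL || return
        if all(lo .== hi)                 # every domain shrunk to a point
            push!(res, float.(lo))
            return
        end
        i = findfirst(j -> lo[j] < hi[j], eachindex(lo))
        mid = div(lo[i] + hi[i], 2)
        for (a, b) in ((lo[i], mid), (mid + 1, hi[i]))
            con = @constraint(model, a <= x[i] <= b)  # temporary bounds
            oldlo, oldhi = lo[i], hi[i]
            lo[i], hi[i] = a, b
            _split!(res, model, x, lo, hi)
            lo[i], hi[i] = oldlo, oldhi
            delete(model, con)            # restore before the other half
        end
        return
    end

    Each split strictly shrinks one variable's domain, so the recursion terminates, and infeasible halves are pruned just as in the boolean version.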

    P.S. This is not the optimal method; this problem has been well studied. A possible term to search for is 'model counting' (especially in the boolean domain).

    (UPDATE: Changed the objective to use FEASIBILITY_SENSE.)