Tags: mathematical-optimization, cplex, modeling, opl

Implementing Constraints in OPL Using CPLEX


I've been working on implementing a specific constraint in OPL and attempted to write code for it. However, it's not functioning as expected. Could anyone provide guidance on the correct approach?

Here is the constraint:

constraint image

Here is the full code of the model in OPL; the constraint in question is the second one:

// Example data
range J = 1..5;
range R = 1..3;
range T = 1..10;
int P[J] = [0,1,2,3,4]; 
int d[J] = [1, 1, 3, 5, 2];
int EF[J] = [1, 2, 1, 1, 1];
int LF[J] = [2, 3, 4, 6, 2];
int Tbar = sum(j in J) d[j];

// Resource usage matrix
int u[J][R] = [[1, 0, 2],
               [1, 1, 1], 
               [1, 1, 0], 
               [2, 0, 3], 
               [1, 1, 2]];

// Resource availability
int a[R]= [1,2,3];

// Decision Variables
dvar boolean x[J][T];

dexpr int CT = sum(j in J, t in EF[j]..LF[j])t*x[j][t];
minimize CT;

subject to {
  forall(j in J)
     sum (t in EF[j]..LF[j]) x[j][t] == 1;
  forall(j in J, i in {P[j]}) // a job cannot start before all of its predecessors have completed
     sum (t in EF[j]..LF[j]:t in T) ((t - d[j]) * x[j][t]) - 
     sum (t in EF[i]..LF[i]:t in T) t * x[i][t] >= 0;
  forall(r in R, t in 1..Tbar) {
     sum(j in J) u[j][r] * sum(q in maxl(t, EF[j])..minl(t + d[j] - 1, LF[j])) x[j][q] <= a[r];
  }
}
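Reading the second constraint: if x[j][t] = 1 (job j finishes at time t), then the job's start time t - d[j] must be no earlier than the finish time of its predecessor i = P[j]. A minimal Python sketch of that check, using the data from the model above (`finish` is a hypothetical candidate schedule, not CPLEX output):

```python
# Data from the OPL model above (jobs indexed 1..5)
d = {1: 1, 2: 1, 3: 3, 4: 5, 5: 2}   # durations
P = {1: 0, 2: 1, 3: 2, 4: 3, 5: 4}   # predecessor of each job (0 = no predecessor)

def precedence_ok(finish):
    """finish[j] = chosen finish time of job j (the t with x[j][t] == 1)."""
    for j, i in P.items():
        if i == 0:
            continue  # no real predecessor
        # start of j is finish[j] - d[j]; it must not precede the finish of i
        if finish[j] - d[j] < finish[i]:
            return False
    return True

# A feasible chain: each job starts exactly when its predecessor finishes
finish = {1: 1, 2: 2, 3: 5, 4: 10, 5: 12}
```

Note that this sketch deliberately checks precedence only; the model's first constraint additionally forces each finish time into the window [EF[j], LF[j]].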

I've attempted to apply my knowledge in OPL to address this issue, but I'm encountering difficulties. I would appreciate any advice on how to resolve this.


Solution

  • You get many out-of-bounds errors: with range J = 1..5, the predecessor P[1] = 0 makes the precedence constraint reference x[0][t], which lies outside the job range.

    The following model, which extends the job range to 0..5, works better:

    range J = 0..5; // job 0 is a zero-duration dummy predecessor
    range R = 1..3;
    range T = 1..10;
    int P[J] = [0, 0, 1, 2, 3, 4];
    int d[J] = [0, 1, 1, 3, 5, 2];
    int EF[J] = [0, 1, 2, 1, 1, 1];
    int LF[J] = [0, 2, 3, 4, 6, 2];
    
    
    
    dvar int x[J][T];
    subject to {
      forall(j in J, i in {P[j]})
        sum (t in EF[j]..LF[j]:t in T) ((t - d[j]) * x[j][t]) - 
        sum (t in EF[i]..LF[i]:t in T) t * x[i][t] >= 0;    
    }
    

    And for your second, full model, the following works fine:

    // Example data
    range J = 0..5; // job 0 is a zero-duration dummy predecessor
    range R = 1..3;
    range T = 1..10;
    int P[J] = [0, 0, 1, 2, 3, 4];
    int d[J] = [0, 1, 1, 3, 5, 2];
    int EF[J] = [0, 1, 2, 1, 1, 1];
    int LF[J] = [0, 2, 3, 4, 6, 2];
    int Tbar = sum(j in J) d[j];
    
    // Resource usage matrix
    int u[J][R] = [[0, 0, 0], // dummy job 0 uses no resources
                   [1, 0, 2],
                   [1, 1, 1], 
                   [1, 1, 0], 
                   [2, 0, 3], 
                   [1, 1, 2]];
    
    // Resource availability
    int a[R]= [1,2,3];
    
    // Decision Variables
    dvar boolean x[J][T];
    
    dexpr int CT = sum(j in J, t in EF[j]..LF[j]:t in T)t*x[j][t];
    minimize CT;
    
    subject to {
      forall(j in J)
         sum (t in EF[j]..LF[j]:t in T) x[j][t] <= 1;
      forall(j in J, i in {P[j]}) // a job cannot start before all of its predecessors have completed
         sum (t in EF[j]..LF[j]:t in T) ((t - d[j]) * x[j][t]) - 
         sum (t in EF[i]..LF[i]:t in T) t * x[i][t] >= 0;
      forall(r in R, t in 1..Tbar) {
         sum(j in J) u[j][r] * sum(q in maxl(t, EF[j])..minl(t + d[j] - 1, LF[j])) x[j][q] <= a[r];
      }
    }
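The third constraint family can be sanity-checked the same way: at each time t and for each resource r, the total usage of the jobs active at t must not exceed a[r]. A minimal Python sketch, assuming a job that finishes at finish[j] occupies the time slots finish[j]-d[j]+1 .. finish[j] (data copied from the model; `finish` is again a hypothetical schedule, not CPLEX output):

```python
# Data from the OPL model above (jobs 1..5; resources indexed 0..2 here)
d = {1: 1, 2: 1, 3: 3, 4: 5, 5: 2}                # durations
u = {1: (1, 0, 2), 2: (1, 1, 1), 3: (1, 1, 0),
     4: (2, 0, 3), 5: (1, 1, 2)}                  # resource usage per job
a = (1, 2, 3)                                     # resource availabilities

def capacity_ok(finish, horizon=12):
    """finish[j] = finish time of job j; j is active on
    finish[j]-d[j]+1 .. finish[j]."""
    for t in range(1, horizon + 1):
        for r in range(len(a)):
            load = sum(u[j][r] for j in finish
                       if finish[j] - d[j] + 1 <= t <= finish[j])
            if load > a[r]:
                return False
    return True
```

One observation this check makes visible: job 4 by itself already exceeds the capacity of resource 1 (in the OPL data, u[4][1] = 2 while a[1] = 1), so no schedule containing job 4 can satisfy the capacity constraints. That appears to be why the assignment constraint had to be relaxed from == 1 to <= 1 for the model above to be feasible: the solver is then allowed to leave such jobs unscheduled.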