I'm building a Python Sudoku solver for classic and non-classic sudokus using the cp_model from Google's ortools library. Specifically, I'm trying to write a method that solves a Sudoku variant in which every row/column/subgrid contains 9 unique numbers between 0 and 11. For instance, if row 1 of the sudoku has the numbers [1, 2, 3, 4, 5, 6, 7, 8, 9], rows 2-9 cannot have the same numbers in any permutation. However, row 2 can have the numbers [0, 1, 2, 3, 4, 5, 6, 7, 8], since this is a distinct set of 9 numbers. The same restriction applies to all rows.
Here is the challenge: since each row has 9 entries, a given set of 9 numbers can appear in a row in 9! different permutations.
So I decided to create a unique signature for each row: take all the elements of a row, sort them in ascending order, and concatenate them into an 18-digit signature. For example, for row 1 with elements [1, 2, 3, 4, 5, 6, 7, 8, 9], the signature would be 010203040506070809. If some other row has the same 9 numbers in some other permutation, it will produce the same signature of 010203040506070809. The signature only differs if the row has a different set of 9 numbers, regardless of the 9! possible permutations. Hence, if I add a constraint that all the signatures must be different, I should get what I want.
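On plain Python ints (not CP-SAT variables) the signature idea can be sketched like this; note that as an integer the leading zero disappears, so the sorted row [1..9] actually yields the 17-digit value 10203040506070809:

```python
# Sketch of the signature idea on plain Python ints (not CP-SAT variables):
# sort the row, then concatenate each value as a two-digit chunk.
def row_signature(row):
    signature = 0
    for value in sorted(row):
        signature = signature * 100 + value
    return signature

a = row_signature([9, 2, 7, 1, 5, 3, 8, 6, 4])  # a permutation of 1..9
b = row_signature([1, 2, 3, 4, 5, 6, 7, 8, 9])
assert a == b == 10203040506070809               # permutation-invariant
assert row_signature([0, 1, 2, 3, 4, 5, 6, 7, 8]) != a  # different set, different signature
```

The difficulty, of course, is that this relies on `sorted()`, which does not work on CP-SAT variables.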
Unfortunately, this isn't working, since the sorted() function and list sorting do not work on ortools variables. So I fell back on a crude approach: generate all 9! signatures (I am aware that this is computationally expensive) and pick the smallest one as the unique signature, using the code snippet below.
def DistinctRowColSubGridConstraints(self):
    '''
    Collect all rows, create a unique lexicographical signature for each row,
    and ensure that all rows are unique.
    '''
    # Helper to encode a row as a lexicographical signature based on permutation indices
    def encode_as_lexicographical_index(row):
        # Create a list of indices: [0, 1, 2, ..., n-1] where n is the row length
        n = len(row)
        indices = list(range(n))
        # Generate all permutations of the indices list
        all_permutations = list(permutations(indices))  # from itertools import permutations
        # Generate all signatures
        all_signatures = []
        for perm in all_permutations:
            signature = 0
            for idx in perm:
                signature = signature * 100 + row[idx]  # Concatenate each element to form the signature
            all_signatures.append(signature)
        return all_signatures

    # Collect all rows and create a lexicographical signature for each
    row_signatures = []
    for i in range(self.Rows):
        row = [self.Cells[i][j] for j in range(self.Cols)]  # Get the row
        all_signatures = encode_as_lexicographical_index(row)  # Get all possible signatures
        # lb=0 since with values 0-11 the smallest signature can have fewer than 18 digits
        RowUniqueSignature = self.Model.NewIntVar(lb=0, ub=10**18 - 1, name=f"RowUniqueSignature_{i}")
        self.Model.Add(RowUniqueSignature == min(all_signatures))  # <-- fails: min() is not defined for linear expressions
        row_signatures.append(RowUniqueSignature)  # Add signature to list

    # Ensure all rows have unique lexicographical signatures
    self.Model.AddAllDifferent(row_signatures)
    print("Distinct row constraints added.")
Unfortunately, this isn't working either, since min() cannot be applied to bounded linear expressions.
I would appreciate any help or guidance on fixing this issue, and I would be especially grateful if someone could point me to a more efficient solution. Thank you.
In CP-SAT the equivalent of min() is:
add_min_equality(self, LinearExprT target, Iterable[LinearExprT] exprs)
And sorted() can be formulated e.g. by iterating over the positions and adding constraints like position[i] < position[i+1].
IIRC, there is actually a sudoku solver in the samples section (on ortools' GitHub).
Edit: (related to my comment below)
I found this link with basically the same topic as yours incl. the use of sorted() and min(): https://devcodef1.com/news/1446546/python-sudoku-solver-with-or-tools
But it looks strange to me calling the (stateless) CP-SAT solver incrementally that way.
Edit2: (with the understanding of the rules of the non-classic version from your comment)
Using ortools' sudoku example as a starting point, we can formulate the additional constraint you are looking for as add_all_different over all rows (and over all cols, and all sub-grids/blocks) without calculating 9! permutations.
P.S. If my assumption here is right, there are only 12 different combinations of 9 distinct numbers from 0-11 that sum up to 45.
Output with the grid from your YT link as init:
(Colab via older iPad)
[ _ _ _ ][ 5 4 _ ][ _ _ _ ]
[ _ 2 _ ][ _ 6 _ ][ 7 _ 3 ]
[ 4 _ _ ][ _ 0 _ ][ 5 _ _ ]
[ 0 _ _ ][ _ _ _ ][ 10 11 _ ]
[ 1 _ 8 ][ _ _ _ ][ _ 3 _ ]
[ _ 9 _ ][ _ _ _ ][ _ _ _ ]
[ _ 5 _ ][ _ 2 _ ][ _ _ _ ]
[ _ _ _ ][ 9 8 _ ][ 0 _ 1 ]
[ _ 7 10 ][ _ _ _ ][ 6 2 _ ]
NumConflicts: 0
NumBranches: 0
WallTime: 0.058303263
[ 9 0 6 ][ 5 4 7 ][ 11 1 2 ]
[ 11 2 5 ][ 10 6 1 ][ 7 0 3 ]
[ 4 1 7 ][ 3 0 9 ][ 5 6 10 ]
[ 0 4 2 ][ 8 1 3 ][ 10 11 6 ]
[ 1 11 8 ][ 7 9 0 ][ 2 3 4 ]
[ 7 9 3 ][ 2 10 5 ][ 1 8 0 ]
[ 8 5 0 ][ 1 2 6 ][ 3 9 11 ]
[ 2 6 4 ][ 9 8 10 ][ 0 5 1 ]
[ 3 7 10 ][ 0 5 4 ][ 6 2 8 ]
row: 0 encoded: 538
row: 1 encoded: 681
row: 2 encoded: 395
row: 3 encoded: 813
row: 4 encoded: 802
row: 5 encoded: 659
row: 6 encoded: 670
row: 7 encoded: 527
row: 8 encoded: 263
col: 0 encoded: 802
col: 1 encoded: 538
col: 2 encoded: 263
col: 3 encoded: 659
col: 4 encoded: 527
col: 5 encoded: 395
col: 6 encoded: 681
col: 7 encoded: 670
col: 8 encoded: 813
block: 0_0 encoded: 538
block: 0_1 encoded: 395
block: 0_2 encoded: 681
block: 1_0 encoded: 802
block: 1_1 encoded: 659
block: 1_2 encoded: 813
block: 2_0 encoded: 263
block: 2_1 encoded: 527
block: 2_2 encoded: 670