I'm trying to replicate an N dimensional Delaunay triangulation that is performed by the Matlab delaunayn function in Python using the scipy.spatial.Delaunay function. However, while the Matlab function gives me the result I want and expect, scipy is giving me something different. I find this odd, considering both are wrappers around the Qhull library. I assume Matlab is implicitly setting different parameters in its call. The case I'm trying to replicate is taken from Matlab's documentation.
The setup is a cube with a point at its center, as shown below. The blue lines are only there to help visualize the shape; they have no meaning for this problem.
The triangulation I expect from this results in 12 simplices (listed in the Matlab example) and looks like the following.
However, this Python equivalent produces "extra" simplices:
import numpy as np
import scipy.spatial
x = np.array([[-1,-1,-1],[-1,-1,1],[-1,1,-1],[1,-1,-1],[1,1,1],[1,1,-1],[1,-1,1],[-1,1,1],[0,0,0]])
simp = scipy.spatial.Delaunay(x).simplices
The returned variable simp should be an M x N array, where M is the number of simplices found (which should be 12 in my case) and N is the number of points in each simplex. Here each simplex should be a tetrahedron, so N is 4.
What I'm finding though is that M is actually 18 and that the extra 6 simplices are not tetrahedrons, but rather the 6 faces of the cube.
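For reference, a quick check of the output shape (just restating the counts above):

print(simp.shape)  # gives (18, 4) here, where I expected (12, 4)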
What's going on here? How can I limit the returned simplices to only be tetrahedrons? I used this simple case to demonstrate the problem so I'd like a solution that isn't tailored to this problem.
Thanks to an answer by Amro, I was able to figure this out and I can get a match in simplices between Matlab and Scipy. There were two factors in play. First, as pointed out, Matlab and Scipy use different QHull options. Second, QHull returns simplices with zero volume. Matlab removes these, Scipy doesn't. That was obvious in the example above because all 6 extra simplices were the zero-volume coplanar faces of the cube. These can be removed, in N dimensions, with the following bit of code.
N = 3  # the dimension of our points
options = 'Qt Qbb Qc' if N <= 3 else 'Qt Qbb Qc Qx'  # set the Qhull options
# points is the (num_points x N) array of input coordinates (x in the example above)
tri = scipy.spatial.Delaunay(points, qhull_options=options).simplices
keep = np.ones(len(tri), dtype=bool)
for i, t in enumerate(tri):
    # The volume of a simplex is proportional to the determinant of [points | 1];
    # a (near-)zero determinant means the simplex is degenerate (coplanar).
    if abs(np.linalg.det(np.hstack((points[t], np.ones([1, N + 1]).T)))) < 1E-15:
        keep[i] = False  # simplex is coplanar, we don't want to keep it
tri = tri[keep]
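As a side note, the same zero-volume filter can be written without the explicit Python loop. This is just an equivalent vectorized sketch, assuming points and tri as above:

mats = np.concatenate((points[tri], np.ones((len(tri), N + 1, 1))), axis=2)
vols = np.abs(np.linalg.det(mats))  # determinants for all simplices at once, proportional to their volumes
tri = tri[vols > 1E-15]             # keep only the simplices with non-zero volume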
I suppose the other conditions should be addressed as well, but I'm already guaranteed that my points contain no duplicates, and the orientation condition appears to have no effect on the outputs that I can discern.
Some notes comparing MATLAB and SciPy functions:
According to the MATLAB docs, by default it uses the Qt Qbb Qc Qhull options for 3-dimensional input, while SciPy uses Qt Qbb Qc Qz.
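So, for example, to ask SciPy for MATLAB's default options explicitly (just a sketch using the qhull_options keyword of scipy.spatial.Delaunay):

simp = scipy.spatial.Delaunay(x, qhull_options='Qt Qbb Qc').simplices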
Not sure if it matters, but your NumPy array is not in the same order as the points created with ndgrid in MATLAB.
In fact, if you look at the MATLAB code (edit delaunayn.m), you can see three extra steps performed. First it calls mergeDuplicatePoints (this is not an issue in your case). Finally, after getting the result from Qhull (implemented as a MEX-function, qhullmx), there is the following comment above a few lines of code: "Strip the zero volume simplices that may have been created by the presence of degeneracy." Since the file is copyrighted, I won't post the code here, but you can check it on your end.
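As for the mergeDuplicatePoints step, a rough Python analogue (my own sketch, not the MATLAB code) is simply de-duplicating the rows before triangulating:

points = np.unique(points, axis=0)  # drops duplicate rows (note: this also sorts the points)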