Tags: python, interpolation, kriging

python 3D coordinate point cloud interpolation


I have a NumPy array of coordinates:

Data[:,0] = x[:]
Data[:,1] = y[:]
Data[:,2] = z[:]

This represents a point cloud with an area of missing data.

How would you go about using this as the input data into some interpolation function (ideally kriging) which will give an interpolated Z value on the X and Y grid defined by:

xmax = np.max(Data[:, 0])
ymax = np.max(Data[:, 1])
xmin = np.min(Data[:, 0])
ymin = np.min(Data[:, 1])
xnew = np.linspace(xmin,xmax,35)
ynew = np.linspace(ymin,ymax,35)
x = np.zeros(1225)
y = np.zeros(1225)

# Flatten the 35 x 35 grid into 1225 (x, y) query points
for i in range(0, 35):
    for j in range(0, 35):
        x[i*35+j] = xnew[i]
        y[i*35+j] = ynew[j]
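
For reference, the same flat x and y arrays could also be built without the explicit loops, e.g. with np.meshgrid:

X, Y = np.meshgrid(xnew, ynew, indexing='ij')  # X[i, j] = xnew[i], Y[i, j] = ynew[j]
x = X.ravel()  # x[i*35 + j] = xnew[i]
y = Y.ravel()  # y[i*35 + j] = ynew[j]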

I'm having trouble with this because everything I can find that discusses 2D interpolation (2D in the sense that the input array is 2D, describing points in 3D space) uses mgrid. I don't want the resulting data on a grid; I want it in the original input format: point cloud in, point cloud out.


Solution

  • In Python, a good implementation of kriging / Gaussian process regression with many examples is the one in the well-known machine learning package scikit-learn. It was originally based on the DACE MATLAB implementation.

    Documentation on the Gaussian Process Regression implementation can be found on this page and in the links therein. You can find 5 tutorials at the bottom of this page. A list of available kernels can be found here.

    With the data you provided, you just have to do the following to fit a simple model with the kernel of your choice:

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    # Fit on the (x, y) columns, with z as the regression target
    gp = GaussianProcessRegressor(kernel=RBF(10, (1e-2, 1e2)))
    gp.fit(Data[:, 0:2], Data[:, 2])

    # Predict at your flat (x, y) query points, e.g. the arrays built in the question
    z_pred = gp.predict(np.column_stack((x, y)))
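
    As a fuller illustration, here is a self-contained sketch going from point-cloud input to point-cloud output. Synthetic data stands in for your point cloud, and a WhiteKernel term is added as an assumption to absorb measurement noise:

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)

    # Synthetic point cloud: z = f(x, y) plus noise, with a square region removed
    pts = rng.uniform(0, 10, size=(500, 2))
    keep = ~((pts[:, 0] > 4) & (pts[:, 0] < 6) & (pts[:, 1] > 4) & (pts[:, 1] < 6))
    pts = pts[keep]
    z = np.sin(pts[:, 0]) * np.cos(pts[:, 1]) + rng.normal(0, 0.05, size=len(pts))
    Data = np.column_stack((pts, z))               # same layout as in the question

    # Fit the Gaussian process on the (x, y) columns with z as the target
    kernel = RBF(length_scale=1.0, length_scale_bounds=(1e-2, 1e2)) + WhiteKernel(1e-2)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(Data[:, 0:2], Data[:, 2])

    # Build the 35 x 35 query locations as a flat point cloud, as in the question
    xnew = np.linspace(Data[:, 0].min(), Data[:, 0].max(), 35)
    ynew = np.linspace(Data[:, 1].min(), Data[:, 1].max(), 35)
    X, Y = np.meshgrid(xnew, ynew, indexing='ij')
    query = np.column_stack((X.ravel(), Y.ravel()))    # shape (1225, 2)

    # Predict z (and its uncertainty) at every query point; output is a point cloud
    z_pred, z_std = gp.predict(query, return_std=True)
    result = np.column_stack((query, z_pred))          # shape (1225, 3)
    print(result.shape, z_std.mean())

    The kernel (and whether to include a noise term) is a modelling choice; the hyperparameters themselves are optimized during fit by maximizing the log marginal likelihood.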