Tags: python, interpolation, linear-interpolation

How to implement linear interpolation?


Say I am given data as follows:

x = [1, 2.5, 3.4, 5.8, 6]
y = [2, 4, 5.8, 4.3, 4]

I want to design a function that will interpolate linearly between 1 and 2.5, between 2.5 and 3.4, and so on, using Python.

I have tried looking through this Python tutorial, but I am still unable to get my head around it.


Solution

  • As I understand your question, you want to write some function y = interpolate(x_values, y_values, x), which will give you the y value at some x? The basic idea then follows these steps:

    1. Find the indices of the values in x_values which define an interval containing x. For instance, for x=3 with your example lists, the containing interval would be [x1, x2] = [2.5, 3.4], and the indices would be i1=1, i2=2.
    2. Calculate the slope on this interval as (y_values[i2]-y_values[i1])/(x_values[i2]-x_values[i1]) (i.e., dy/dx).
    3. The value at x is now the value at x1 plus the slope multiplied by the distance from x1.

    You will additionally need to decide what happens if x falls outside the range of x_values: either treat it as an error, or extrapolate, assuming the slope is the same as in the first/last interval. A minimal sketch putting these steps together is shown below.
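
    Here is one way to write it, assuming x_values is sorted in strictly increasing order (the name interpolate and the choice to extrapolate out-of-range x from the nearest interval are mine, not fixed by the question):

        from bisect import bisect_right

        def interpolate(x_values, y_values, x):
            """Linearly interpolate (or extrapolate) y at x from matching lists."""
            if len(x_values) != len(y_values) or len(x_values) < 2:
                raise ValueError("need at least two matching (x, y) pairs")
            # Step 1: find the interval [x1, x2] around x. bisect_right gives
            # the insertion point of x; clamping it to [1, len-1] means an
            # out-of-range x is extrapolated using the first/last interval.
            i2 = min(max(bisect_right(x_values, x), 1), len(x_values) - 1)
            i1 = i2 - 1
            # Step 2: the slope dy/dx on that interval.
            slope = (y_values[i2] - y_values[i1]) / (x_values[i2] - x_values[i1])
            # Step 3: the value at x1 plus the slope times the distance from x1.
            return y_values[i1] + slope * (x - x_values[i1])

    With your example data:

        x = [1, 2.5, 3.4, 5.8, 6]
        y = [2, 4, 5.8, 4.3, 4]
        print(interpolate(x, y, 3))    # 5.0, on the [2.5, 3.4] segment
        print(interpolate(x, y, 5.9))  # 4.15, on the [5.8, 6] segment

    If you can depend on NumPy, numpy.interp(x, xp, fp) does the same interpolation in one call, though by default it clamps out-of-range x to the endpoint y values rather than extrapolating.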

    Did this help, or do you need more specific advice?