python, numpy, linear-regression, least-squares

numpy Polynomial.fit with degree zero produces unexpected result


import numpy as np

f = np.array(
    [481.62900766, 511.94542042, 647.40216379, 686.10402156, 849.9420538, 888.64398048, 1029.26087049, 1071.18799217,
     1210.51481107, 1266.63254274, 1409.54282743])
s = np.array(
    [457.90057373, 520.90911865, 666.19372559, 709.64898682, 862.48828125, 892.19934082, 1031.70675659, 1063.03643799,
     1206.41647339, 1239.6506958, 1386.23660278])

series1 = np.polynomial.polynomial.Polynomial.fit(f, s, deg=1)  # fit s ≈ delta + ratio * f
delta, ratio = series1.convert().coef
print(delta, ratio)  # delta = 24.72921108370622, ratio = 0.971307510662272
# ratio is close to the expected 1.0, and this delta is sufficiently close to zero

series0 = np.polynomial.polynomial.Polynomial.fit(f, s, deg=0)  # assume ratio = 1.0
delta = series0.convert().coef[0]
print(delta)  # delta = 912.3988175827271 is unreasonably large

# inspired by: https://stats.stackexchange.com/questions/436375/
delta = np.mean(s - f)
print(delta)  # delta = -1.4926089272727112 is quite close to zero

Is there a way to get the second result (delta close to zero) by using Polynomial.fit, with deg=0 or otherwise?
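
If I understand the deg=0 case correctly, a degree-zero fit only has the constant term to adjust, so it minimizes the sum of (s_i - c0)**2 and simply returns the mean of s; f only supplies the sample positions and never enters the residuals. That would explain the value above (quick check, in case my reading of the fit is wrong):

print(np.mean(s))  # ~912.399, the same value the deg=0 fit returned (up to rounding)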


Solution

  • It turns out I needed to manually subtract f from s, like so:

    series0 = np.polynomial.polynomial.Polynomial.fit(f, s - f, deg=0)
    delta = series0.convert().coef[0]
    print(delta)  # delta = -1.4926089272727097
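
    This works because, with only the constant term free, the fit minimizes the sum of ((s_i - f_i) - c0)**2, whose minimizer is exactly the mean of s - f, so it should agree with np.mean(s - f) up to floating-point rounding. Quick sanity check (np.isclose with its default tolerances):

    # the deg=0 coefficient and the plain mean of the residuals should match
    print(np.isclose(delta, np.mean(s - f)))  # True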