Suppose I'm fitting some data points with a simple linear regression. Now I'd like to perform several joint linear regressions on several sets of data points. More specifically, I want one parameter to be equal among all fits, which is schematically depicted here for the y-axis intercept.
After searching Google for some time, I could neither find a Python (SciPy) routine that does this, nor any general literature on how one would accomplish it.
Ideally, I want to perform such joint fits not only for simple linear regressions, but also for more general fit functions (for instance, power-law fits with a joint exponent).
The lmfit module allows you to do this, as mentioned in their FAQ:
from lmfit import minimize, Parameters, fit_report
import numpy as np
# residual function to minimize; both models share the 'offset' parameter
def residual(params, x=None, dat1=None, dat2=None):
    model1 = params['offset'] + x * params['slope1']
    model2 = params['offset'] + x * params['slope2']
    resid1 = dat1 - model1
    resid2 = dat2 - model2
    return np.concatenate((resid1, resid2))
# setup fit parameters
params = Parameters()
params.add('slope1', value=1)
params.add('slope2', value=-1)
params.add('offset', value=0.5)
# generate sample data
x = np.arange(0, 10)
slope1, slope2, offset = 1.1, -0.9, 0.2
y1 = slope1 * x + offset
y2 = slope2 * x + offset
# fit
out = minimize(residual, params, kws={"x": x, "dat1": y1, "dat2": y2})
print(fit_report(out))
# [[Fit Statistics]]
# # fitting method = leastsq
# # function evals = 9
# # data points = 20
# # variables = 3
# chi-square = 1.4945e-31
# reduced chi-square = 8.7913e-33
# Akaike info crit = -1473.48128
# Bayesian info crit = -1470.49408
# [[Variables]]
# slope1: 1.10000000 +/- 8.2888e-18 (0.00%) (init = 1)
# slope2: -0.90000000 +/- 8.2888e-18 (0.00%) (init = -1)
# offset: 0.20000000 +/- 3.8968e-17 (0.00%) (init = 0.5)
# [[Correlations]] (unreported correlations are < 0.100)
# C(slope1, offset) = -0.742
# C(slope2, offset) = -0.742
# C(slope1, slope2) = 0.551
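The same approach extends to your power-law case: give each dataset its own prefactor and put the shared exponent in a single parameter. Here is a minimal sketch of that idea; the parameter names (amp1, amp2, exponent) and the sample data are my own choices for illustration, not taken from the lmfit FAQ:

from lmfit import minimize, Parameters, fit_report
import numpy as np

# residual function: each dataset has its own prefactor, the exponent is shared
def residual(params, x=None, dat1=None, dat2=None):
    p = params.valuesdict()
    model1 = p['amp1'] * x ** p['exponent']
    model2 = p['amp2'] * x ** p['exponent']
    return np.concatenate((dat1 - model1, dat2 - model2))

# set up fit parameters
params = Parameters()
params.add('amp1', value=1.0)
params.add('amp2', value=1.0)
params.add('exponent', value=1.5)

# generate sample data with a common exponent
x = np.arange(1, 11, dtype=float)
y1 = 0.5 * x ** 1.8
y2 = 3.0 * x ** 1.8

# fit and read back the shared best-fit exponent
out = minimize(residual, params, kws={"x": x, "dat1": y1, "dat2": y2})
print(fit_report(out))
print(out.params['exponent'].value)

The key point in both examples is simply that the residuals of all datasets are concatenated into one array, so any parameter that appears in more than one model is automatically fitted jointly.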