import numpy as np
from scipy import optimize
import matplotlib.pyplot as plt

x_data = np.linspace(0, 11, 12)
y_data = np.array([0.02471114, 0.02057292, 0.01752668, 0.01494543,
                   0.01273249, 0.0110999, 0.00946524, 0.00805622,
                   0.00670716, 0.00558925, 0.00465331, 0.00387775])

def func(x, a, b):
    return ((a-1)*(-b*x - (0.024711**(1-a)/(1-a))))**(1/(1-a))
popt, pcov = optimize.curve_fit(func, x_data, y_data, p0=[1.1, 0.1])
a_opt, b_opt = popt
x_model = np.linspace(min(x_data), max(y_data), 100)
y_model = func(x_model, a_opt, b_opt)
plt.scatter(x_data, y_data)
plt.plot(x_model, y_model, color='r')
plt.show()
This plots a tiny horizontal line at the coordinates of the first data point. What am I doing wrong? I am also getting OptimizeWarning: Covariance of the parameters could not be estimated.
The error is probably in

x_model = np.linspace(min(x_data), max(y_data), 100)

which I guess should really be

x_model = np.linspace(min(x_data), max(x_data), 100)

Since max(y_data) is only about 0.0247, the original line evaluates the model at 100 x values squeezed between 0 and 0.0247, where func barely moves from func(0) = 0.024711. That is why the fit shows up as a tiny horizontal line at the coordinates of the first data point.
Also, as noted in the comment by @9769953, you should use an initial guess of [1.1, -0.1]. With a positive b (and a > 1) the model increases with x while your data decay, so the optimizer starts in the wrong region, which is likely what triggers the covariance warning.
With these changes, the fit looks ok to me.
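For completeness, here is a minimal corrected version of your script; the only changes from your code are the numpy import, the linspace bounds, and the sign of the initial guess for b:

import numpy as np
from scipy import optimize
import matplotlib.pyplot as plt

x_data = np.linspace(0, 11, 12)
y_data = np.array([0.02471114, 0.02057292, 0.01752668, 0.01494543,
                   0.01273249, 0.0110999, 0.00946524, 0.00805622,
                   0.00670716, 0.00558925, 0.00465331, 0.00387775])

def func(x, a, b):
    return ((a-1)*(-b*x - (0.024711**(1-a)/(1-a))))**(1/(1-a))

# Negative initial b so the model decays with x, like the data
popt, pcov = optimize.curve_fit(func, x_data, y_data, p0=[1.1, -0.1])
a_opt, b_opt = popt

# Evaluate the fitted model across the x range (max(x_data), not max(y_data))
x_model = np.linspace(min(x_data), max(x_data), 100)
y_model = func(x_model, a_opt, b_opt)

plt.scatter(x_data, y_data, label='data')
plt.plot(x_model, y_model, color='r', label='fit')
plt.legend()
plt.show()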