I'm struggling to write a Taylor series approximation for a chosen function (`func(x)` in the programme) and to plot a graph of the approximated values against the exact ones. The order/accuracy of the approximation is a user-specified value. Also, this must be done without using arrays.
```python
import math, matplotlib.pyplot as plt, numpy as np

order = float(input("Enter the Taylor series approximation order (n): "))
file_object = open("taylor.dat", "w")
x = -2*np.pi
n = 0
approx = 0
taylor_list = []

def func(x):
    fx = np.sin(x)*np.exp(-x/2)
    return fx

def deriv(n):
    nth = ((-(math.sqrt(5))/2)**n)*np.sin(-n*np.arctan(2))
    return nth

def taylor(x, n):
    tx = ((deriv(n))/(np.math.factorial(n)))*(x**n)
    return tx

while x <= 2*np.pi:
    file_object.write(str(round(x, 10)))
    file_object.write(" ")
    fx = func(x)
    file_object.write(str(round(fx, 10)))
    file_object.write(" ")
    if n <= order:
        tx = taylor(x, n)
        approx = approx + tx
        n = n + 1
    print(approx)
    file_object.write(str(round(approx, 10)))
    file_object.write("\n")
    taylor_list.append(approx)
    x = x + (1/25)*np.pi

file_object.close()
```
My issue is that the numbers produced by the Taylor approximation (`approx`) aren't right no matter what order of accuracy is selected; it also converges around an arbitrary number (25.1281585765...). The only thing left out is the graphing code, which works fine. If you see any problems or have suggestions, you'd make my day :)
You never reset your `approx` and `n` for new values of `x`, so you just append the first calculated value at every value of `x`.
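To illustrate the fix, here is a minimal sketch (the helper name `taylor_approx` is my own, and it reuses the closed-form derivative from the question) where the sum is rebuilt from scratch for each `x` instead of carrying `approx` and `n` over between `x` values:

```python
import math

def deriv(n):
    # Closed form for the nth derivative of sin(x)*exp(-x/2) at x = 0,
    # taken from the question's deriv()
    return ((-math.sqrt(5) / 2) ** n) * math.sin(-n * math.atan(2))

def taylor_approx(x, order):
    approx = 0.0  # reset the accumulator for every new x
    for n in range(order + 1):  # sum terms n = 0 .. order
        approx += (deriv(n) / math.factorial(n)) * (x ** n)
    return approx
```

With the accumulator reset per `x`, a modest order (say 20) should closely match `np.sin(x)*np.exp(-x/2)` for `x` near zero; in your script, the equivalent change is to set `approx = 0` and `n = 0` at the top of the `while` loop body and sum all the terms there before writing the line out.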