python, matplotlib, machine-learning, noise-reduction

How do I reduce "noise" within a model in Python?


I am looking to reduce the volatility of a stock prediction model's plot. The idea is to be able to focus on the overall trend rather than on exact point-by-point predictions.

[Example output: plot of predicted vs. real stock price]

from matplotlib import pyplot as plt
plt.figure()
plt.plot(y_pred_org)     # predicted price over time
plt.plot(y_test_t_org)   # real (test-set) price over time
plt.title('Prediction vs Real Stock Price')
plt.ylabel('Price')
plt.xlabel('Days')
plt.legend(['Prediction', 'Real'], loc='upper left')
plt.show()

Solution

  • You have multiple options:
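For instance, one common option is to smooth the predicted series before plotting it. The sketch below is an illustration added for clarity, not code from the original answer: it applies two standard smoothing techniques, a pandas rolling mean and a SciPy Savitzky-Golay filter, to a synthetic random-walk series standing in for y_pred_org.

import numpy as np
import pandas as pd
from scipy.signal import savgol_filter
from matplotlib import pyplot as plt

# Synthetic random-walk series standing in for y_pred_org (hypothetical data)
y_pred_org = 100 + np.cumsum(np.random.randn(250))

# Option 1: 10-day simple moving average (rolling mean)
y_pred_sma = pd.Series(y_pred_org).rolling(window=10, min_periods=1).mean()

# Option 2: Savitzky-Golay filter (fits a low-order polynomial in a sliding window)
y_pred_sg = savgol_filter(y_pred_org, window_length=21, polyorder=3)

plt.figure()
plt.plot(y_pred_org, alpha=0.4)   # raw, noisy prediction
plt.plot(y_pred_sma)              # smoothed with rolling mean
plt.plot(y_pred_sg)               # smoothed with Savitzky-Golay
plt.title('Raw vs Smoothed Prediction')
plt.ylabel('Price')
plt.xlabel('Days')
plt.legend(['Raw', '10-day rolling mean', 'Savitzky-Golay'], loc='upper left')
plt.show()

A larger window gives a smoother line but lags the trend more; at the same window size, the Savitzky-Golay filter tends to preserve turning points better than a plain moving average.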