I have a MATLAB example of this program but I can't do it in Python. This is how it should look (MATLAB example): https://i.sstatic.net/H0WWa.jpg.
import numpy as np
import matplotlib.pyplot as plt
from scipy.signal import hilbert, chirp
duration = int(input("Input duration of signal = "))
A = int(input("Input amplitude of signal = "))
start = int(input("Input start of modulation time = "))
end = int(input("Input length of modulation = "))
fs = 400.0
samples = int(fs*duration)
t = np.arange(samples) / fs
main_t = []
result = []
counter=0
for l in range(0, samples):
    main_t.append(0)
    result.append(0)
start_mod = np.arange(start*fs)
end_mod = np.arange(end*fs)
signal = chirp(t, 20.0, t[-1], 100.0)
signal *= (A + 0.5 * np.sin(2.0*np.pi*3.0*t) )
for i in np.arange(0, start*fs):
    signal.insert(i, 0)
fig = plt.figure()
ax0 = fig.add_subplot(211)
ax0.plot(main_t, result)
ax0.set_xlabel("time in seconds")
plt.show()
I expect to input some data from the console and then plot the graph using matplotlib, for example. The graph must look like the one you can see in the image (MATLAB example).
Ignoring all the fiddling with getting input (which will vary a lot depending on other constraints), the code to generate an amplitude modulated signal should be simple.
I start by pulling in numpy, generating a set of points on which to sample the signal, calculating an amplitude, and then combining them:
import numpy as np
x = np.linspace(0, 10, 501)
ampl = np.exp(-(x - 3.5)**2 / 0.8)  # Gaussian envelope centred near x = 3.5
y = np.sin(x * 25) * ampl           # carrier multiplied by the envelope
We can then plot these using matplotlib with something like:
import matplotlib.pyplot as plt
plt.figure(figsize=(10,5))
plt.plot(x, y, label='signal')
plt.plot(x, ampl, ':', label='amplitude')
plt.xlabel('time')
plt.ylabel('value')
plt.legend()
plt.show()
I've pulled in seaborn and am using its ticks style to make it a bit prettier.
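For reference, the seaborn ticks style mentioned above can be applied before the plotting code with something like this (a minimal sketch, assuming seaborn is installed):
import seaborn as sns
sns.set_style("ticks")  # switch matplotlib to seaborn's "ticks" style
Calling it once before plotting is enough; all subsequent figures pick up the style.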
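To tie this back to the question's parameters, one way to switch the modulation on only between start and start + end seconds is to zero the envelope outside that window. A minimal sketch, using hard-coded values in place of the console input and an arbitrary 40 Hz sine carrier instead of the chirp:
import numpy as np
import matplotlib.pyplot as plt

fs = 400.0                                # sampling rate from the question
duration, A, start, end = 5, 1, 1, 2      # hypothetical values instead of input()
t = np.arange(int(fs * duration)) / fs

# envelope: A plus a slow sine, but only inside the modulation window
window = (t >= start) & (t <= start + end)
ampl = np.where(window, A + 0.5 * np.sin(2.0 * np.pi * 3.0 * t), 0.0)

y = np.sin(2.0 * np.pi * 40.0 * t) * ampl  # purely illustrative carrier

plt.plot(t, y, label='signal')
plt.plot(t, ampl, ':', label='amplitude')
plt.xlabel('time in seconds')
plt.legend()
plt.show()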