I'm trying to model the following time-dependent function in MATLAB:

y(t) = sum_{i=1}^{m} y(t-i) + gamma*u(t-m)*u(t) + delta,   with u(t) = cos(beta*t)

where gamma, beta, and delta are constants. I'm not sure of the best way to approach the summation. I could make u(t) and y(t) function handles, but symsum can only be used for symbolic expressions. Is my only option a for loop that manually adds up the y(t-i) and u(t-m) terms? What I have done so far to generate y(t) is shown below:
N = 100;
y_out = zeros(N,1);
u = zeros(N,1);
for t = 1:N
    u(t) = cos(beta*t);    % input signal
end

y_out(1) = NARMA(beta,delta,gamma,true);
for t = 2:N
    y_out(t) = NARMA(beta,delta,gamma);
end

function y_out = NARMA(beta,delta,gamma,first_call)
% Set first_call to true only the first time you call the function
persistent y
if nargin > 3 && first_call
    y = 0;
end
for i = 1:m    % the summation over past values -- this is the part I'm unsure how to handle
    y = y + gamma*u + delta;
end
y_out = y;
end
Here is a possible solution that doesn't require a loop (not tested):
t = 1:N;
u = @(t) cos(beta * t);
y = cumsum(gamma * u(t - m) .* u(t) + delta);   % running total of the input terms
y(m+1:end) = y(m+1:end) - y(1:end-m);           % convert it into a sliding sum over the last m terms
EDIT:
It turned out that the above solution is wrong: it only accumulates the input term gamma*u(t-m)*u(t) + delta and ignores that y(t) also depends on its own m previous values. I think this is correct:
u = @(t) cos(beta*t);
y = zeros(N+m, 1);    % m leading zeros stand in for y at times <= 0
for t = 1:N
    % y(t+m) holds y at time t, so y(t:t+m-1) are the previous m values y(t-m)..y(t-1)
    y(t+m) = sum(y(t:t+m-1)) + gamma * u(t-m) * u(t) + delta;
end
y = y(m+1:end);       % drop the zero padding
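This snippet expects N, m, beta, gamma and delta to already be defined in the workspace. If it is going to be reused, it can also be wrapped in a small function; the name and signature below are only a suggestion, not anything from the question:

function y = narma_direct(N, m, beta, gamma, delta)
% Direct implementation (helper name is just a suggestion):
% y(t) = sum of the previous m values of y + gamma*u(t-m)*u(t) + delta,
% with input u(t) = cos(beta*t).
u = @(t) cos(beta*t);
y = zeros(N+m, 1);                  % m leading zeros stand in for y at times <= 0
for t = 1:N
    y(t+m) = sum(y(t:t+m-1)) + gamma * u(t-m) * u(t) + delta;
end
y = y(m+1:end);                     % drop the zero padding
end

Save it as narma_direct.m and call it with whatever constants you are using, e.g. y = narma_direct(100, 10, 0.5, 0.1, 0.01) (values picked arbitrarily here).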
EDIT2:
With a further simplification the inner summation can be eliminated entirely. Writing e(t) = gamma*u(t-m)*u(t) + delta, subtracting consecutive outputs gives y(t) - y(t-1) = y(t-1) - y(t-m-1) + e(t) - e(t-1), so the whole thing reduces to a simple for loop:
u = @(t) cos(beta*t);
y = zeros(N+m, 1);                          % m leading zeros, as before
e = gamma * u((1:N)-m) .* u(1:N) + delta;   % input term e(t) for t = 1..N
de = diff(e);                               % e(t) - e(t-1)
y(1+m) = e(1);                              % y(1) = e(1), since all earlier y values are zero
for t = 2:N
    y(t+m) = 2*y(t+m-1) - y(t-1) + de(t-1); % y(t) = 2*y(t-1) - y(t-m-1) + e(t) - e(t-1)
end
y = y(m+1:end);                             % drop the zero padding
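As a sanity check, the recurrence can be compared against the direct summation from the previous edit (wrapped above as narma_direct); with placeholder constants chosen only for the test, the two results should agree up to floating-point round-off:

% placeholder values, not taken from the question
N = 100; m = 10; beta = 0.5; gamma = 0.1; delta = 0.01;

y_direct = narma_direct(N, m, beta, gamma, delta);   % direct summation version

% recurrence version from EDIT2
u = @(t) cos(beta*t);
y = zeros(N+m, 1);
e = gamma * u((1:N)-m) .* u(1:N) + delta;
de = diff(e);
y(1+m) = e(1);
for t = 2:N
    y(t+m) = 2*y(t+m-1) - y(t-1) + de(t-1);
end
y_rec = y(m+1:end);

max(abs(y_direct - y_rec))   % expected to be ~0 (round-off only)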