I have a piece of code which performs geometric ad stock decay using Theano. It is a old piece of code and I need to update it using the latest version of PyTensor. Can someone please help me convert this?
def adstock_geometric_theano_pymc3(x, theta):
x = tt.as_tensor_variable(x)
def adstock_geometric_recurrence_theano(index,
input_x,
decay_x,
theta):
return tt.set_subtensor(decay_x[index],
tt.sum(input_x + theta * decay_x[index - 1]))
len_observed = x.shape[0]
x_decayed = tt.zeros_like(x)
x_decayed = tt.set_subtensor(x_decayed[0], x[0])
output, _ = theano.scan(
fn = adstock_geometric_recurrence_theano,
sequences = [tt.arange(1, len_observed), x[1:len_observed]],
outputs_info = x_decayed,
non_sequences = theta,
n_steps = len_observed - 1
)
return output[-1]
so first i'll share the converted code and then explain how and why everything works:
Suppose
x
is a sequence of advertising expenditures over time andtheta
is the decay rate. I'll use a small array of numbers for x and a hypothetical value for theta.Example Data:
x
: Advertising expenditures over 10 time periods, e.g.,[100, 120, 90, 110, 95, 105, 115, 100, 130, 125]
theta
: Decay rate, let's say0.5
import torch
def adstock_geometric_pytensor(x, theta):
x = torch.tensor(x, dtype=torch.float32)
theta = torch.tensor(theta, dtype=torch.float32)
def adstock_geometric_recurrence_pytensor(index, input_x, decay_x, theta):
decay_x[index] = input_x + theta * decay_x[index - 1]
return decay_x
len_observed = x.shape[0]
x_decayed = torch.zeros_like(x)
x_decayed[0] = x[0]
for index in range(1, len_observed):
x_decayed = adstock_geometric_recurrence_pytensor(index, x[index], x_decayed, theta)
return x_decayed
# Example usage
x_data = [100, 120, 90, 110, 95, 105, 115, 100, 130, 125] # Advertising expenditures
theta_value = 0.5 # Decay rate
output = adstock_geometric_pytensor(x_data, theta_value)
print(output)
Your original Theano code uses theano.scan
, a powerful tool for looping over sequences in a way optimized for parallel computation. This is a common approach in Theano
for handling recursive operations efficiently. However, when switching to PyTorch
(PyTensor
), there isn't a direct equivalent of theano.scan
. PyTorch tends to favor explicit loops in Python, which are less optimized but more straightforward.
In the modified PyTorch
code, I replaced the theano.scan
with a standard Python for-loop. This loop iteratively applies the adstock transformation. This change sacrifices some computational efficiency but maintains the core functionality.