deep-learning, pytorch, gaussian-process, bayesian-deep-learning, pytorch-distributions

Learning multivariate normal covariance matrix using pytorch


I am trying to learn the covariance matrix (Sigma, Σ) of a multivariate normal distribution from some observations.

The way I went at it is by using pytorch.distributions.MultivariateNormal:

import torch
from torch.distributions import MultivariateNormal

# I tried both the scale_tril parameter and the covariance_matrix parameter.
mvn = MultivariateNormal(loc=torch.tensor([0.0, 0.0], requires_grad=False).view(1,2),
                        scale_tril=torch.tensor([[1.0 , 0.0], [0.0, 1.0]],
                                                requires_grad=True).view(-1, 2, 2))

loss = -mvn.log_prob(torch.ones((1, 2))).mean()
loss.backward()
print(mvn.loc.grad)

I get None. I tried fiddling with the dimensions of both the loc and the scale_tril parameters. Nothing appears to work. Any ideas?

Best, Eyal.


Solution

  • You are not calling .grad on your leaf tensors: mvn.loc is the result of .view, not the tensor you created, so it never receives a gradient. You also constructed the mean with requires_grad=False. Let's make things more explicit:

    import torch
    from torch.distributions import MultivariateNormal
    
    # Leaf tensors: these are the nodes that gradients accumulate on.
    mean = torch.tensor([0.0, 0.0], requires_grad=True)
    cov = torch.tensor([[1.0, 0.0], [0.0, 1.0]], requires_grad=True)
    
    # .view creates non-leaf views of the leaves above, so query .grad
    # on mean/cov, not on mvn.loc or mvn.scale_tril.
    mvn = MultivariateNormal(loc=mean.view(1, 2),
                             scale_tril=cov.view(-1, 2, 2))
    
    loss = -mvn.log_prob(torch.ones((1, 2))).mean()
    loss.backward()
    
    print(mean.grad)  # now a tensor, not None
    print(cov.grad)
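
  • Once gradients reach leaf tensors, you can wrap this in an optimization loop to actually fit Σ to observations. Below is a minimal sketch (not part of the original answer): it keeps an unconstrained leaf tensor and maps it to a valid scale_tril (lower triangular with a positive diagonal) using torch.tril and softplus before building the distribution. The ground-truth parameters, sample size, learning rate, and step count are made-up illustration values.

    import torch
    from torch.distributions import MultivariateNormal
    
    torch.manual_seed(0)
    
    # Hypothetical "observations": samples from a ground-truth 2-D Gaussian.
    true_mean = torch.tensor([1.0, -1.0])
    true_scale_tril = torch.tensor([[1.5, 0.0], [0.8, 0.5]])
    data = MultivariateNormal(true_mean, scale_tril=true_scale_tril).sample((1000,))
    
    # Learnable leaf parameters. raw_tril is unconstrained; it is mapped to a
    # valid lower-triangular Cholesky factor inside the loop.
    mean = torch.zeros(2, requires_grad=True)
    raw_tril = torch.zeros(2, 2, requires_grad=True)
    
    optimizer = torch.optim.Adam([mean, raw_tril], lr=0.05)
    
    for step in range(500):
        # Strictly lower-triangular part plus a softplus-positive diagonal.
        tril = torch.tril(raw_tril, diagonal=-1) + torch.diag_embed(
            torch.nn.functional.softplus(torch.diagonal(raw_tril)))
        mvn = MultivariateNormal(loc=mean, scale_tril=tril)
    
        loss = -mvn.log_prob(data).mean()  # negative log-likelihood of the data
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    
    print(mean)            # should approach true_mean
    print(tril @ tril.T)   # learned covariance; compare with true_scale_tril @ true_scale_tril.T

    The softplus-on-the-diagonal reparameterization is one common choice; any map from an unconstrained tensor to a lower-triangular matrix with a positive diagonal would work here.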