machine-learning neural-network pytorch feed-forward

PyTorch: can't run backward() on even the simplest network without getting an error


I am new to PyTorch and I can't run backward() on even the simplest network without getting an error. For example:

(Linear(6, 6)(Variable(torch.zeros([10, 6]))) - Variable(torch.zeros([10, 6]))).backward()

This throws the following error:

RuntimeError: element 0 of variables does not require grad and does not have a grad_fn

What have I done wrong in this code to cause this error?


Solution

  • Try passing backward() a grad_output argument with the same shape as the output tensor:

    (Linear(6, 6)(Variable(torch.zeros([10, 6]))) - Variable(torch.zeros([10, 6]))).backward(torch.zeros([10, 6]))

    This answer has more details: Why should be the function backward be called only on 1 element tensor or with gradients w.r.t to Variable? A runnable sketch of the same idea with the current API follows below.
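
    As a side note, in PyTorch 0.4 and later Variable has been merged into Tensor, so the same fix can be written without wrapping anything in Variable. The snippet below is only a minimal sketch of the two usual ways to call backward() on a non-scalar output: pass a gradient of matching shape, or reduce the output to a scalar first.

    import torch
    from torch.nn import Linear

    # Non-scalar output, same shapes as in the question (Variable is no longer needed)
    out = Linear(6, 6)(torch.zeros(10, 6)) - torch.zeros(10, 6)

    # Option 1: supply a gradient with the same shape as the output
    out.backward(torch.ones(10, 6))

    # Option 2: reduce the output to a scalar, then backward() needs no argument
    out2 = Linear(6, 6)(torch.zeros(10, 6)) - torch.zeros(10, 6)
    out2.sum().backward()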