Tags: python, numpy, pytorch, gradient, tensor

PyTorch: Can't call numpy() on Variable that requires grad. Use var.detach().numpy() instead


Calling tensor.numpy() gives the error:

RuntimeError: Can't call numpy() on Variable that requires grad. Use var.detach().numpy() instead.

tensor.cpu().detach().numpy() gives the same error.


Solution

  •  Error reproduced

    import torch
    
    tensor1 = torch.tensor([1.0, 2.0], requires_grad=True)
    
    print(tensor1)
    print(type(tensor1))
    
    tensor1 = tensor1.numpy()
    
    print(tensor1)
    print(type(tensor1))
    

    which raises the exact same error at the line tensor1 = tensor1.numpy():

    tensor([1., 2.], requires_grad=True)
    <class 'torch.Tensor'>
    Traceback (most recent call last):
      File "/home/badScript.py", line 8, in <module>
        tensor1 = tensor1.numpy()
    RuntimeError: Can't call numpy() on Variable that requires grad. Use var.detach().numpy() instead.
    
    Process finished with exit code 1
    

    Generic solution

    This was already suggested in the error message itself: just replace var with the name of your tensor.

    import torch
    
    tensor1 = torch.tensor([1.0, 2.0], requires_grad=True)
    
    print(tensor1)
    print(type(tensor1))
    
    # detach() returns a tensor with the same values that does not require
    # grad, so numpy() can now be called on it
    tensor1 = tensor1.detach().numpy()
    
    print(tensor1)
    print(type(tensor1))
    

    which prints, as expected:

    tensor([1., 2.], requires_grad=True)
    <class 'torch.Tensor'>
    [1. 2.]
    <class 'numpy.ndarray'>
    
    Process finished with exit code 0
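
    If the tensor lives on a GPU, numpy() additionally requires the data to be in CPU memory, so detach() is usually combined with cpu(). A minimal sketch of that case (my addition, not part of the original answer, and assuming a CUDA device is available):

    import torch
    
    # Hypothetical GPU variant: move the detached tensor to the CPU before
    # converting it to a numpy array.
    if torch.cuda.is_available():
        tensor1 = torch.tensor([1.0, 2.0], requires_grad=True, device="cuda")
        array1 = tensor1.detach().cpu().numpy()
        print(array1)          # [1. 2.]
        print(type(array1))    # <class 'numpy.ndarray'>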
    

    Some explanation

    You need to convert your tensor to another tensor that holds the same values but does not require a gradient. That detached tensor can then be converted to a numpy array. Cf. this discuss.pytorch post. (I think, more precisely, the detach is needed to get the actual tensor out of its pytorch Variable wrapper, cf. this other discuss.pytorch post.)
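
    One practical detail worth knowing (my addition, not covered by the linked posts): detach() does not copy any data. It returns a tensor with requires_grad set to False that shares the original tensor's storage, and on a CPU tensor numpy() shares that storage as well, so in-place edits of the array are visible in the original tensor:

    import torch
    
    tensor1 = torch.tensor([1.0, 2.0], requires_grad=True)
    
    detached = tensor1.detach()      # same storage, but requires_grad is False
    array1 = detached.numpy()        # the ndarray also shares that storage
    
    print(detached.requires_grad)    # False
    array1[0] = 42.0                 # in-place edit of the numpy array ...
    print(tensor1)                   # ... is visible in the original tensor:
                                     # tensor([42.,  2.], requires_grad=True)

    If an independent copy is needed instead, copy the result, e.g. tensor1.detach().numpy().copy().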