python · machine-learning · pytorch · in-place · relu

Assigning functional relu to a variable while inplace parameter is True


If I want to apply a ReLU operation after my convolution on x, and in my code I do:

x = F.leaky_relu(x, negative_slope=0.2, inplace=True)

Is this code wrong since I assign the result to the x variable while inplace is True? I.e., does it mean the ReLU function ran twice, and in order to work correctly should I set inplace to False or not assign to x? Thank you


Solution

  • Your code will take the tensor x and apply the LeakyReLU operation to it. Inplace means you change x directly, so you don't need to assign the result.

    So either you write

    F.leaky_relu(x, negative_slope=0.2, inplace=True)
    

    or

    x = F.leaky_relu(x, negative_slope=0.2)
    

    The default value of inplace is False, which is why I don't set it in the second example.

    In both cases the operation is executed once.
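
    A small sketch to illustrate the point: with inplace=True the function mutates and returns the very same tensor object, while with inplace=False it returns a new tensor with identical values. Either way, the activation is computed exactly once.

    ```python
    import torch
    import torch.nn.functional as F

    x = torch.tensor([-1.0, 0.0, 2.0])

    # Out-of-place (default): returns a new tensor, x is left unchanged
    y = F.leaky_relu(x, negative_slope=0.2)

    # In-place: modifies x directly; the returned tensor IS x
    z = F.leaky_relu(x, negative_slope=0.2, inplace=True)

    print(z is x)             # True: in-place returns the same tensor object
    print(torch.equal(y, x))  # True: both variants compute the same values
    ```

    Note that in-place activations can break autograd if the input tensor is needed later for the backward pass, which is why inplace=False is the safer default inside a model.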