tensorflow, keras, activation-function

What is the negative_slope argument of tf.keras.layers.ReLU?


tf.keras.layers.ReLU has a negative_slope argument, which the documentation describes as "Float >= 0. Negative slope coefficient. Defaults to 0."

tf.keras.layers.ReLU(
    max_value=None, 
    negative_slope=0.0, 
    threshold=0.0, 
    **kwargs
)

Is this meant to turn it into a Leaky ReLU? If so, is it the same as the alpha argument of tf.keras.layers.LeakyReLU?

tf.keras.layers.LeakyReLU(alpha=0.3, **kwargs)

* alpha: Float >= 0. Negative slope coefficient. Defaults to 0.3.

Solution

  • Short answer:

    Is this meant to turn it into a Leaky ReLU?

    Yes, the negative_slope parameter of tf.keras.layers.ReLU plays the same role as alpha does in tf.keras.layers.LeakyReLU. With the defaults threshold=0.0 and max_value=None, both layers compute f(x) = x for x >= 0 and f(x) = slope * x for x < 0, so tf.keras.layers.ReLU(negative_slope=0.5) and tf.keras.layers.LeakyReLU(alpha=0.5) behave identically.

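    You can verify the equivalence numerically. Here is a minimal check (the sample values are arbitrary):

    import numpy as np
    import tensorflow as tf
    
    x = np.linspace(-3.0, 3.0, 7).astype("float32")
    
    relu_slope = tf.keras.layers.ReLU(negative_slope=0.5)
    leaky = tf.keras.layers.LeakyReLU(alpha=0.5)
    
    # Both layers compute f(x) = x for x >= 0 and 0.5 * x for x < 0,
    # so their outputs match elementwise.
    np.testing.assert_allclose(relu_slope(x).numpy(), leaky(x).numpy())
    print(relu_slope(x).numpy())  # [-1.5 -1.  -0.5  0.   1.   2.   3. ]
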

    Here is a visualization of their behavior:

    import matplotlib.pyplot as plt
    import numpy as np
    import tensorflow as tf
    
    # ReLU variants: plain, with negative slopes, and with nonzero thresholds
    relu = tf.keras.layers.ReLU(max_value=None, negative_slope=0.0, threshold=0.0)
    relu_neg_half = tf.keras.layers.ReLU(max_value=None, negative_slope=0.5, threshold=0.0)
    relu_neg_1tenth = tf.keras.layers.ReLU(max_value=None, negative_slope=0.1, threshold=0.0)
    relu_neg_1tenth_thresh_1 = tf.keras.layers.ReLU(max_value=None, negative_slope=0.1, threshold=1.)
    relu_neg_1tenth_thresh_2 = tf.keras.layers.ReLU(max_value=None, negative_slope=0.1, threshold=2.)
    
    # LeakyReLU layers with matching slope coefficients
    lrelu_alph_half = tf.keras.layers.LeakyReLU(alpha=0.5)
    lrelu_alph_1tenth = tf.keras.layers.LeakyReLU(alpha=0.1)
    
    x = np.linspace(-5, 5, 101)
    
    fig = plt.figure(figsize=(6, 6), dpi=150)
    markevery = 0.05
    plt.plot(x, relu(x), '-o', markevery=markevery, label="ReLU | negative_slope=0.0")
    plt.plot(x, relu_neg_half(x), '--s', markevery=markevery, label="ReLU | negative_slope=0.5")
    plt.plot(x, relu_neg_1tenth(x), '--p', markevery=markevery, label="ReLU | negative_slope=0.1")
    plt.plot(x, relu_neg_1tenth_thresh_2(x), '--d', markevery=markevery, label="ReLU | negative_slope=0.1 | threshold=2.0")
    
    # Offset marker spacing slightly so overlapping curves stay distinguishable
    plt.plot(x, lrelu_alph_half(x), '--v', markevery=markevery * 1.2, label="LeakyReLU | alpha=0.5")
    plt.plot(x, lrelu_alph_1tenth(x), '--^', markevery=markevery * 1.2, label="LeakyReLU | alpha=0.1")
    
    plt.legend(frameon=False)
    plt.savefig('relu.png', bbox_inches='tight')
    

    Output: a plot of the ReLU and LeakyReLU activation curves for the configurations listed in the legend above.
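
    As the plot suggests, the threshold argument shifts where the kink sits: per the documented piecewise definition, tf.keras.layers.ReLU returns x for x >= threshold and negative_slope * (x - threshold) below it (capped at max_value if one is set). A small illustrative check (values chosen arbitrarily):

    import numpy as np
    import tensorflow as tf
    
    relu_t = tf.keras.layers.ReLU(negative_slope=0.1, threshold=2.0)
    x = np.array([-1.0, 0.0, 1.0, 2.5, 3.0], dtype="float32")
    
    # Below the threshold the output is 0.1 * (x - 2.0); above it, x passes through.
    print(relu_t(x).numpy())  # approximately [-0.3 -0.2 -0.1  2.5  3. ]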