python, tensorflow, keras, deep-learning, relu

Python/Keras: LeakyReLU using TensorFlow


I am having problems installing Keras. The following imports are giving me too much trouble to get around (even after updating the packages from the terminal):

from keras.layers import Dense, Activation
from keras.models import Sequential

So instead of initialising an ANN with ann = Sequential(), I use ann = tf.keras.models.Sequential(), after importing:

import tensorflow as tf
from tensorflow import keras

I would like to use LeakyReLU as an activation function. However, it seems to be implemented differently from the other activations, and the Keras documentation is not helping me much compared to the examples others use.

I've seen that ann.add(LeakyReLU(alpha=0.05)) is needed. However, what about the other parameters like units or input_dim? How can I implement this using my code?

# Initialising the ANN
ann = tf.keras.models.Sequential()

# Adding the input layer and the first hidden layer
ann.add(tf.keras.layers.Dense(units=32, activation='relu'))

# Adding the second hidden layer
ann.add(tf.keras.layers.Dense(units=32, activation='relu'))

# Adding the output layer
ann.add(tf.keras.layers.Dense(units=1))

Solution

  • First of all, you can import Sequential, Dense and Activation directly with from tensorflow.keras.models import Sequential and from tensorflow.keras.layers import Dense, Activation (a sketch using these imports on your architecture follows at the end of this answer).

    You can implement LeakyReLU like this:

    from tensorflow import keras
    
    model = keras.models.Sequential([
        keras.layers.Dense(10),
        keras.layers.LeakyReLU(alpha=0.05)
    ])
    

    You specify LeakyReLU as its own layer right after the layer it should act on, as shown in the Keras documentation; the preceding Dense layer keeps its default linear activation, so the LeakyReLU layer effectively serves as its activation function.
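
    Applying the same pattern to the code from your question, a minimal sketch might look like the following (units=32 and alpha=0.05 are simply carried over from your code and the example above; note that in newer Keras releases the alpha argument may go by a different name):

    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Dense, LeakyReLU

    # Initialising the ANN
    ann = Sequential()

    # Input layer and first hidden layer: leave the Dense activation at its
    # default (linear) and add LeakyReLU as a separate layer right after it
    ann.add(Dense(units=32))
    ann.add(LeakyReLU(alpha=0.05))

    # Second hidden layer
    ann.add(Dense(units=32))
    ann.add(LeakyReLU(alpha=0.05))

    # Output layer
    ann.add(Dense(units=1))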