keras · neural-network · dropout

Dropout on a Dense layer


In Keras, suppose we define our first hidden layer with the input_dim argument, followed by a Dropout layer, as follows:

model.add(Dense(units=16, activation='relu', kernel_initializer='random_uniform', input_dim=5))
model.add(Dropout(0.2))

Is the Dropout being applied to the hidden layer or to the input layer? If it is applied to the hidden layer, how can I apply it to the input layer as well, and vice versa?


Solution

  • Dropout is applied to the output of the layer immediately preceding it, so in this case it acts on the hidden layer. If you want to apply it to the input, add a Dropout layer as the first layer in the network and specify the input shape there instead.
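
As a minimal sketch of both placements (assuming the `tensorflow.keras` API and an illustrative single-unit sigmoid output layer, which is not part of the original question):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout

model = Sequential()
# Dropout as the first layer: randomly zeroes 20% of the 5 input
# features on each training update (dropout on the input layer).
model.add(Dropout(0.2, input_shape=(5,)))
# Hidden layer, same configuration as in the question.
model.add(Dense(units=16, activation='relu', kernel_initializer='random_uniform'))
# Dropout placed after the Dense layer: acts on the hidden layer's output.
model.add(Dropout(0.2))
# Illustrative output layer (an assumption for this sketch).
model.add(Dense(units=1, activation='sigmoid'))
```

Note that when Dropout is the first layer, the input shape is passed to it rather than to the Dense layer via input_dim; dropout is only active during training and is automatically disabled at inference time.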