Tags: tensorflow, keras, normalization, sequential, deeplearning4j

How to use the LayerNormalization layer in a Keras Sequential model?


I am just getting into Keras and TensorFlow. I'm having a lot of problems adding an input normalization layer to a Sequential model. Currently my model is:

 import tensorflow as tf

 model = tf.keras.models.Sequential()
 model.add(tf.keras.layers.Dense(256, input_shape=(13,), activation='relu'))
 model.add(tf.keras.layers.LayerNormalization(axis=-1, center=True, scale=True))
 model.add(tf.keras.layers.Dense(128, activation='relu'))
 model.add(tf.keras.layers.Dense(64, activation='relu'))
 model.add(tf.keras.layers.Dense(64, activation='relu'))
 model.add(tf.keras.layers.Dense(1))
 model.summary()

My doubt is whether I should first call an adapt function, and how to use it in a Sequential model. Thanks to all!


Solution

  • I'm trying to figure this out as well. According to this example, adapt is not necessary: LayerNormalization computes its mean and variance from each sample at call time, so there is nothing to adapt beforehand.

    import tensorflow as tf

    # MNIST, scaled to [0, 1], so the example runs end to end.
    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
    x_train, x_test = x_train / 255.0, x_test / 255.0

    model = tf.keras.models.Sequential([
      # Reshape into "channels last" setup.
      tf.keras.layers.Reshape((28, 28, 1), input_shape=(28, 28)),
      tf.keras.layers.Conv2D(filters=10, kernel_size=(3, 3), data_format="channels_last"),
      # LayerNorm layer: normalizes over the channel axis.
      tf.keras.layers.LayerNormalization(axis=3, center=True, scale=True),
      tf.keras.layers.Flatten(),
      tf.keras.layers.Dense(128, activation='relu'),
      tf.keras.layers.Dropout(0.2),
      tf.keras.layers.Dense(10, activation='softmax')
    ])

    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    model.fit(x_train, y_train)


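    If what you actually want is input (feature) normalization with an adapt step, that is a different layer: tf.keras.layers.Normalization (tf.keras.layers.experimental.preprocessing.Normalization on older TF 2.x). Here is a minimal sketch for the 13-feature model from the question; the random x_train/y_train arrays are made-up stand-ins for your real data:

    import numpy as np
    import tensorflow as tf

    # Made-up data standing in for the real 13-feature training set.
    x_train = np.random.rand(500, 13).astype("float32")
    y_train = np.random.rand(500, 1).astype("float32")

    # The preprocessing Normalization layer learns per-feature mean and variance
    # from the training data via adapt(), then standardizes inputs at call time.
    norm = tf.keras.layers.Normalization(axis=-1)
    norm.adapt(x_train)

    model = tf.keras.models.Sequential([
        tf.keras.Input(shape=(13,)),
        norm,
        tf.keras.layers.Dense(256, activation='relu'),
        tf.keras.layers.Dense(128, activation='relu'),
        tf.keras.layers.Dense(64, activation='relu'),
        tf.keras.layers.Dense(64, activation='relu'),
        tf.keras.layers.Dense(1),
    ])

    model.compile(optimizer='adam', loss='mse')
    model.fit(x_train, y_train, epochs=2)
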
    Also, make sure you actually want LayerNormalization. If I understand correctly, it normalizes each sample across its own features, independently of the rest of the batch, whereas BatchNormalization normalizes each feature across the batch and may be more appropriate here. See this for more info.
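
    To see the difference concretely, here is a small sketch on a toy 2×3 batch of my own (not from the question):

    import tensorflow as tf

    # Toy batch: 2 samples, 3 features.
    x = tf.constant([[1., 2., 3.],
                     [4., 5., 6.]])

    ln = tf.keras.layers.LayerNormalization(axis=-1)
    bn = tf.keras.layers.BatchNormalization()

    # LayerNormalization: each row (one sample) is normalized with its own mean/variance.
    print(ln(x))
    # BatchNormalization in training mode: each column (one feature) is normalized
    # across the batch.
    print(bn(x, training=True))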