tensorflow, keras, deep-learning, model, face-recognition

How to fix this error: "ValueError: decay is deprecated in the new Keras optimizer"?


I'm new to deep learning and I'm following a tutorial about face detection.

model = canaro.models.createSimpsonsModel(IMG_SIZE=IMG_SIZE, channels=channels, output_dim=len(characters), 
                                         loss='binary_crossentropy', decay=1e-7, learning_rate=0.001, momentum=0.9,
                                         nesterov=True)


ValueError                                Traceback (most recent call last)
WARNING:absl:lr is deprecated, please use learning_rate instead, or use the legacy optimizer, e.g., tf.keras.optimizers.legacy.SGD.
ValueError: decay is deprecated in the new Keras optimizer, please check the docstring for valid arguments, or use the legacy optimizer, e.g., tf.keras.optimizers.legacy.SGD.

I already tried following some steps, but I don't know how to fix it.


Solution

  • As mentioned elsewhere, the decay argument has been deprecated for all optimizers since Keras 2.3, whose release notes explicitly suggest using LearningRateSchedule objects instead.

    Since you obviously can't modify the canaro source code (well... you could, but it'd be very bad practice, and it's definitely not recommended), I see two options:

    1. downgrade TensorFlow to a version that uses a Keras backend older than 2.3
    2. upgrade canaro to a version that supports TF 2.3+, if one exists

    For those who got to this question through a Google search and can still modify their own source code (i.e., they are not tied to canaro), here is an example snippet:

    TF<2.3 - style

    import tensorflow as tf

    epochs = 50
    learning_rate = 0.01
    # common heuristic: spread the decay over the whole training run
    decay_rate = learning_rate / epochs
    # `lr` and `decay` are only accepted by Keras < 2.3 (or the legacy optimizers)
    optimizer = tf.keras.optimizers.Adam(lr=learning_rate, decay=decay_rate)
    

    TF>=2.3 - style

    import tensorflow as tf

    # multiply the learning rate by `decay_rate` every `decay_steps` iterations
    lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
        initial_learning_rate=0.01,
        decay_steps=10000,
        decay_rate=0.9)
    optimizer = tf.keras.optimizers.Adam(learning_rate=lr_schedule)
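One caveat: the legacy `decay` argument did not apply exponential decay; it applied inverse time decay, `lr_t = lr_0 / (1 + decay * t)`, so `ExponentialDecay` is a different schedule. If you want to reproduce the old behaviour exactly, `tf.keras.optimizers.schedules.InverseTimeDecay` with `decay_steps=1` matches it. Below is a pure-Python sketch of both formulas (no TensorFlow needed); the `lr0` and `decay` values are illustrative, not from the question:

```python
# The legacy `decay` argument applied inverse time decay each iteration:
#   lr_t = lr_0 / (1 + decay * t)
# InverseTimeDecay(lr_0, decay_steps=1, decay_rate=decay) computes the
# same value, so it is a drop-in replacement for the old behaviour.

def legacy_decay_lr(lr0, decay, step):
    """Learning rate after `step` iterations under the old `decay` argument."""
    return lr0 / (1.0 + decay * step)

def inverse_time_decay_lr(lr0, decay_steps, decay_rate, step):
    """What tf.keras.optimizers.schedules.InverseTimeDecay computes."""
    return lr0 / (1.0 + decay_rate * step / decay_steps)

# Illustrative values: lr0 = 0.01, decay = 2e-4.
lr0, decay = 0.01, 2e-4
for step in (0, 100, 1000, 10000):
    old = legacy_decay_lr(lr0, decay, step)
    new = inverse_time_decay_lr(lr0, decay_steps=1, decay_rate=decay, step=step)
    assert abs(old - new) < 1e-12  # the two schedules agree at every step

print(legacy_decay_lr(lr0, decay, 10000))  # 0.01 / 3, roughly 0.00333
```

With real TensorFlow code you would pass the schedule object as `learning_rate=` exactly as in the `ExponentialDecay` snippet above.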