How can I get the value of the learning rate as it is updated at each on_train_batch_begin?
lr_decayed_fn = tf.keras.experimental.CosineDecay(initial_lr, decay_steps)
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=lr_decayed_fn))
I tried it this way:
def on_train_batch_begin(self, batch, logs=None):
    lr = K.get_value(self.model.optimizer.lr)
but lr comes back as <tensorflow.python.keras.optimizer_v2.learning_rate_schedule.CosineDecay object at 0x7f ...> instead of a numeric value.
When you pass a function (or an object subclassing tf.keras.optimizers.schedules.LearningRateSchedule) as the learning rate, the optimizer stores the schedule itself in its lr attribute, which is why you see the CosineDecay object. To get the actual learning-rate value, you need to call that schedule with the current training step. The current step is available through the optimizer's iterations attribute.
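To see what this means outside a callback, here is a minimal sketch of calling the schedule directly; initial_lr and decay_steps are hypothetical values chosen just for illustration:

import tensorflow as tf

initial_lr = 0.1     # assumed value for illustration
decay_steps = 1000   # assumed value for illustration

schedule = tf.keras.optimizers.schedules.CosineDecay(initial_lr, decay_steps)

# Calling the schedule with a step number returns a scalar tensor
# holding the decayed learning rate at that step.
print(float(schedule(0)))    # equals initial_lr at step 0
print(float(schedule(500)))  # partway through the cosine decay

Inside a callback, you do the same thing, but use self.model.optimizer.iterations as the step: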
class CustomCallback(tf.keras.callbacks.Callback):
    def on_train_batch_begin(self, batch, logs=None):
        # optimizer.lr holds the schedule object; calling it with the
        # current step (optimizer.iterations) returns the decayed value.
        lr = tf.keras.backend.get_value(
            self.model.optimizer.lr(self.model.optimizer.iterations)
        )
        print(f"Batch {batch}: learning rate = {lr:.6f}")
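The callback is then passed to fit() like any other callback; model, x_train, and y_train below are placeholders standing in for your own compiled model and data:

model.fit(x_train, y_train, epochs=5, callbacks=[CustomCallback()])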