python tensorflow keras learning-rate

Monitor the Learning Rate of the InverseTimeDecay() - float() argument must be a string or a number, not 'InverseTimeDecay'


My goal is to have a look at the learning rate progression of the Adam optimizer, to which I apply an InverseTimeDecay schedule, so I can check whether the learning rate actually decreases.
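
For context, the optimizer is built roughly like this; the decay values below are placeholders, not the exact ones from my notebook:

    import tensorflow as tf

    # InverseTimeDecay schedule attached to Adam; the decay values here are
    # placeholders, the real ones come from hparams in my notebook
    lr_schedule = tf.keras.optimizers.schedules.InverseTimeDecay(
        initial_learning_rate=1e-3,
        decay_steps=1000,
        decay_rate=0.5)

    optimizer = tf.keras.optimizers.Adam(learning_rate=lr_schedule)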

Having checked this question on Stack Overflow, I made similar changes to my code:

    tf.keras.callbacks.LearningRateScheduler(hparams[HP_LEARNING_RATE])

    def get_lr_metric(optimizer):
        def lr(y_true, y_pred):
            return optimizer.lr
        return lr

    lr_metric = [get_lr_metric(optimizer)]

    model.compile(optimizer=optimizer,
                  loss=neural_network_parameters['model_loss'],
                  metrics=neural_network_parameters['model_metric'] + lr_metric)

However, when I start the training of the model I get the following error:

TypeError: float() argument must be a string or a number, not 'InverseTimeDecay'

TypeError: 'float' object is not callable

Kindly check my Colab notebook and please comment on any changes I should make. Also, please mention in the comments any additional information I may have forgotten to include.

[UPDATE] - I guess the problem is the type of the optimizer.lr value, which in my case is an InverseTimeDecay object. How can I convert that object to a float number? InverseTimeDecay to float.


Solution

  • InverseTimeDecay, like every LearningRateSchedule instance, is a callable that takes the current step and returns the learning rate for that step.

    So the learning rate is completely predictable from the iteration/step count, and there is no real need to monitor it with something like TensorBoard. But if you really want to track it, you can use something like the following (a short usage sketch follows the snippet):

    def get_lr_metric(optimizer):
        def lr(y_true, y_pred):
            # optimizer.learning_rate is a LearningRateSchedule object here, not a float
            current_lr = optimizer.learning_rate
            if isinstance(current_lr, tf.keras.optimizers.schedules.LearningRateSchedule):
                # evaluate the schedule at the current training step
                return current_lr(optimizer.iterations)
            return current_lr
        return lr
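
    Since the schedule is just a function of the step, you can also evaluate it directly to confirm that it decays. This is a minimal sketch with placeholder schedule values, loss, and metrics; `model` stands for whatever Keras model you already compile in your notebook:

    import tensorflow as tf

    schedule = tf.keras.optimizers.schedules.InverseTimeDecay(
        initial_learning_rate=1e-3,
        decay_steps=1000,
        decay_rate=0.5)

    # the decayed value is fully determined by the step number
    for step in [0, 1000, 5000, 10000]:
        print(step, float(schedule(step)))

    # or expose it as an extra metric so Keras logs it every epoch
    optimizer = tf.keras.optimizers.Adam(learning_rate=schedule)
    model.compile(optimizer=optimizer,
                  loss='mse',
                  metrics=['mae', get_lr_metric(optimizer)])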