
Keras: how to output the learning rate to TensorBoard


I added a callback to decay the learning rate:

 keras.callbacks.ReduceLROnPlateau(monitor='val_loss', factor=0.5, patience=100, 
                                   verbose=0, mode='auto',epsilon=0.00002, cooldown=20, min_lr=0)

Here is my tensorboard callback:

keras.callbacks.TensorBoard(log_dir='./graph/rank{}'.format(hvd.rank()), histogram_freq=10, batch_size=FLAGS.batch_size,
                            write_graph=True, write_grads=True, write_images=False)

I want to verify that the learning-rate scheduler has kicked in during training, so I want to output the learning rate to TensorBoard. But I cannot find where to set this.

I also checked the optimizer API, but no luck.

keras.optimizers.Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=None, decay=0.0, amsgrad=False)

How can I output the learning rate to TensorBoard?


Solution

  • According to the author of Keras, the proper way is to subclass the TensorBoard callback:

    from keras import backend as K
    from keras.callbacks import TensorBoard
    
    class LRTensorBoard(TensorBoard):
        # add other arguments to __init__ if you need them
        def __init__(self, log_dir, **kwargs):
            super().__init__(log_dir=log_dir, **kwargs)
    
        def on_epoch_end(self, epoch, logs=None):
            # inject the current learning rate into the logs dict so that
            # TensorBoard writes it out like any other metric
            logs = logs or {}
            logs.update({'lr': K.eval(self.model.optimizer.lr)})
            super().on_epoch_end(epoch, logs)
    

    Then pass it as part of the callbacks argument to model.fit (credit Finncent Price):

    model.fit(x=..., y=..., callbacks=[LRTensorBoard(log_dir="/tmp/tb_log")])
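    The mechanics of this subclass can be illustrated without Keras installed: the subclass injects an `lr` entry into the `logs` dict before delegating to the parent's `on_epoch_end`, so the parent writes it out like any other metric. The `TensorBoardStub` class and `get_lr` callable below are stand-ins invented for this sketch, not part of the Keras API:

    ```python
    class TensorBoardStub:
        """Stand-in for keras.callbacks.TensorBoard: records what it is asked to log."""
        def __init__(self):
            self.written = []

        def on_epoch_end(self, epoch, logs=None):
            self.written.append((epoch, dict(logs or {})))


    class LRTensorBoardStub(TensorBoardStub):
        def __init__(self, get_lr):
            super().__init__()
            self.get_lr = get_lr  # callable returning the current learning rate

        def on_epoch_end(self, epoch, logs=None):
            # same pattern as LRTensorBoard: add 'lr' to logs, then delegate
            logs = logs or {}
            logs.update({'lr': self.get_lr()})
            super().on_epoch_end(epoch, logs)


    lr = 0.001
    cb = LRTensorBoardStub(lambda: lr)
    cb.on_epoch_end(0, {'val_loss': 0.42})
    lr = 0.0005  # simulate ReduceLROnPlateau halving the rate
    cb.on_epoch_end(1, {'val_loss': 0.41})
    print(cb.written[0][1]['lr'], cb.written[1][1]['lr'])  # 0.001 0.0005
    ```

    Because the `lr` entry rides along in `logs`, it also shows up in the console progress bar and in `History.history`, which is a convenient side effect of this approach.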