pytorch, pytorch-lightning, learning-rate

How to print the learning rate per epoch with PyTorch Lightning?


I am having a problem printing (logging) the learning rate per epoch in PyTorch Lightning (PL); TensorFlow logs the learning rate by default. Following the PL guide, I wrote the following code:

import torch
import pytorch_lightning as pl
from pytorch_lightning.callbacks import LearningRateMonitor

class FusionNetModule(pl.LightningModule):
...
    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=self.lr_rate)
        # 'name' sets the label under which LearningRateMonitor logs this scheduler's rate.
        lr_scheduler = {'scheduler': torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.95),
                        'name': 'expo_lr'}
        return [optimizer], [lr_scheduler]

    def on_validation_epoch_end(self):
        # Log the learning rate.
        lr = self.trainer.lr_scheduler_configs[0].scheduler.get_last_lr()[0]
        self.log('learning_rate', lr)
        ...
...

    # Learning Rate Logger
    lr_logger = LearningRateMonitor(logging_interval='epoch')
    trainer = pl.Trainer(
        logger=True,
        max_epochs=epochs,
        accelerator="gpu",
        devices=[gpu_id],
        callbacks=[lr_logger, early_stopping, checkpoint_callback, metric_logger, progressbar],
        default_root_dir=model_path)

But the learning rate did not show up in the training log:

Epoch 101: 100%|#| 23/23 [00:06<00:00,  3.43it/s, v_num=102, val_loss=0.988, val_acc=0.768, train_loss=0.965, train_acc=0.752]

Any hint or clue for logging lr would be appreciated.


Solution

  • I resolved the problem by replacing the following line

    self.log('learning_rate', lr)
    

    with

    self.log('learning_rate', lr, on_step=False, on_epoch=True, prog_bar=True)
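
    This works because self.log sends metrics only to the logger by default: a metric appears in the tqdm progress bar only when prog_bar=True is passed, and on_epoch=True ensures it is logged once per epoch. For the same reason, LearningRateMonitor writes the learning rate to the attached logger (e.g. TensorBoard) rather than to the progress bar.

    For reference, here is a minimal sketch of how the pieces fit together (the __init__ signature is hypothetical; the rest mirrors the Adam + ExponentialLR setup from the question):

    import torch
    import pytorch_lightning as pl

    class FusionNetModule(pl.LightningModule):
        def __init__(self, lr_rate=1e-3):
            super().__init__()
            self.lr_rate = lr_rate

        def configure_optimizers(self):
            optimizer = torch.optim.Adam(self.parameters(), lr=self.lr_rate)
            lr_scheduler = {'scheduler': torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.95),
                            'name': 'expo_lr'}
            return [optimizer], [lr_scheduler]

        def on_validation_epoch_end(self):
            # Read the last lr computed by the scheduler and log it once per
            # epoch; prog_bar=True makes it show up in the progress bar.
            lr = self.trainer.lr_scheduler_configs[0].scheduler.get_last_lr()[0]
            self.log('learning_rate', lr, on_step=False, on_epoch=True, prog_bar=True)

    An alternative way to read the current rate, without going through the scheduler, is self.optimizers().param_groups[0]['lr'].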