Tags: python, pytorch-lightning

PyTorch Lightning: Multiple scalars (e.g. train and valid loss) in the same TensorBoard graph


With PyTorch's TensorBoard SummaryWriter I can log my train and valid loss in a single TensorBoard graph like this:

from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter()

for i in range(1, 100):
    writer.add_scalars('loss', {'train': 1 / i}, i)

for i in range(1, 100):
    writer.add_scalars('loss', {'valid': 2 / i}, i)

How can I achieve the same with PyTorch Lightning's default TensorBoard logger?

from typing import Tuple
from torch import Tensor

def training_step(self, batch: Tuple[Tensor, Tensor], _batch_idx: int) -> Tensor:
    inputs_batch, labels_batch = batch

    outputs_batch = self(inputs_batch)
    loss = self.criterion(outputs_batch, labels_batch)

    self.log('loss/train', loss.item())  # creates separate graph

    return loss

def validation_step(self, batch: Tuple[Tensor, Tensor], _batch_idx: int) -> None:
    inputs_batch, labels_batch = batch

    outputs_batch = self(inputs_batch)
    loss = self.criterion(outputs_batch, labels_batch)

    self.log('loss/valid', loss.item(), on_step=True)  # creates separate graph

Solution

  • The docs describe this as self.logger.experiment.some_tensorboard_function(), where some_tensorboard_function is any of the functions that TensorBoard's SummaryWriter provides. So for your question you want to use

    self.logger.experiment.add_scalars() 
    

    The TensorBoard logger documentation for PyTorch Lightning can be found here.
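
    For example, here is a minimal sketch of the question's two steps rewritten this way. It assumes the default TensorBoardLogger (so that self.logger.experiment is the underlying SummaryWriter) and uses self.global_step as the x-axis value; both are assumptions, not the only possible choices.

        def training_step(self, batch: Tuple[Tensor, Tensor], _batch_idx: int) -> Tensor:
            inputs_batch, labels_batch = batch

            outputs_batch = self(inputs_batch)
            loss = self.criterion(outputs_batch, labels_batch)

            # add_scalars() groups both series under the single 'loss' tag,
            # so the train and valid curves share one TensorBoard graph.
            self.logger.experiment.add_scalars('loss', {'train': loss.item()}, self.global_step)

            return loss

        def validation_step(self, batch: Tuple[Tensor, Tensor], _batch_idx: int) -> None:
            inputs_batch, labels_batch = batch

            outputs_batch = self(inputs_batch)
            loss = self.criterion(outputs_batch, labels_batch)

            # global_step does not advance during validation, so per-batch
            # validation points within an epoch share an x value; logging an
            # aggregated loss once per epoch avoids overlapping points.
            self.logger.experiment.add_scalars('loss', {'valid': loss.item()}, self.global_step)

    One caveat: SummaryWriter.add_scalars() writes each sub-tag to its own event-file subdirectory, so calling it on every step can bloat the log directory; logging every N steps or once per epoch keeps it manageable.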