Tags: pytorch, learning-rate

How to change the learning rate in PyTorch (1.6)


I am using PyTorch and I want to change the learning rate after some epochs.

However, the code provided in most documentation, which is:

  optimizer = torch.optim.Adam([
      dict(params=model.parameters(), lr=learning_rate),
  ])
  # This line specifically
  optimizer.params_group[0]['lr'] = learning_rate

does not work.

PyCharm even hints at the problem:

Unresolved attribute reference 'params_group' for class 'Adam'

As a result, the error thrown is:

AttributeError: 'Adam' object has no attribute 'params_group'

How should one manually change the learning rate in PyTorch (1.6)?


Solution

  • The attribute is spelled param_groups, not params_group, but editing it by hand is not the mechanism PyTorch intends for this; you should use torch.optim.lr_scheduler instead. Read more about this in another Stack Overflow answer here.

    from torch.optim.lr_scheduler import StepLR  # step-based learning rate decay
    scheduler = StepLR(optimizer, step_size=5, gamma=0.1)  # multiply lr by 0.1 every 5 epochs
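
For completeness, here is a minimal sketch of how the scheduler is typically driven inside the training loop. The model, hyperparameters, and epoch count below are placeholders, not taken from the question; the sketch also reads the current rate back through the correctly spelled attribute, param_groups, which answers the manual part of the question:

    import torch
    from torch import nn
    from torch.optim.lr_scheduler import StepLR

    model = nn.Linear(10, 2)  # placeholder model, stands in for your own
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    scheduler = StepLR(optimizer, step_size=5, gamma=0.1)

    for epoch in range(20):
        # ... forward pass, loss.backward(), optimizer.step() on each batch ...
        optimizer.step()   # stands in here for the real per-batch updates
        scheduler.step()   # advance the schedule once per epoch
        # note: param_groups, not params_group
        print(epoch, optimizer.param_groups[0]['lr'])

Calling scheduler.step() once per epoch (after the optimizer steps) is what actually decays the rate; the scheduler mutates optimizer.param_groups for you, so you never have to assign the 'lr' key by hand.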