python optimization neural-network deep-learning pytorch

How to change the learning rate of an optimizer at any given moment (no LR schedule)?


Is it possible in PyTorch to dynamically change the learning rate of the optimizer in the middle of training (I don't want to define a learning rate schedule beforehand)?

So let's say I have an optimizer:

optim = torch.optim.SGD(model.parameters(), lr=0.01)

Now, due to some tests which I perform during training, I realize my learning rate is too high, so I want to change it to, say, 0.001. There doesn't seem to be a method optim.set_lr(0.001), but is there some way to do this?


Solution

  • The learning rate is stored in optim.param_groups[i]['lr']. optim.param_groups is a list of parameter groups, each of which can have its own learning rate. Thus, simply doing:

    for g in optim.param_groups:
        g['lr'] = 0.001
    

    will do the trick (a small convenience wrapper is sketched below).
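
    If you want something closer to the optim.set_lr(0.001) call wished for in the question, you can wrap that loop in a tiny helper; set_lr below is just an illustrative name, not a PyTorch API:

    def set_lr(optimizer, lr):
        # Apply the same learning rate to every parameter group.
        for g in optimizer.param_groups:
            g['lr'] = lr

    # Drop the learning rate mid-training, e.g. after your own tests:
    set_lr(optim, 0.001)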


    **Alternatively**, as mentioned in the comments, if your learning rate only depends on the epoch number, you can use a learning rate scheduler.

    For example (modified example from the doc; the split of the model into base and classifier sub-modules is only for illustration):

    from torch.optim.lr_scheduler import LambdaLR

    # Build an optimizer with two parameter groups so that each group can
    # follow its own schedule (model.base / model.classifier are
    # placeholder sub-modules).
    optimizer = torch.optim.SGD(
        [{'params': model.base.parameters()},
         {'params': model.classifier.parameters()}],
        lr=0.1, momentum=0.9)

    # One lambda per parameter group: each returns the multiplicative
    # factor applied to that group's initial lr at the given epoch.
    lambda_group1 = lambda epoch: epoch // 30
    lambda_group2 = lambda epoch: 0.95 ** epoch
    scheduler = LambdaLR(optimizer, lr_lambda=[lambda_group1, lambda_group2])

    for epoch in range(100):
        train(...)
        validate(...)
        scheduler.step()
    

    Also, there is a prebuilt scheduler, torch.optim.lr_scheduler.ReduceLROnPlateau, which reduces the learning rate when a monitored metric (e.g. the validation loss) stops improving.
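
    A minimal sketch of that approach, assuming validate(...) returns the validation loss (the factor and patience values are just illustrative):

    from torch.optim.lr_scheduler import ReduceLROnPlateau

    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    # Multiply the lr by 0.1 once the validation loss has stopped
    # improving for 5 consecutive epochs.
    scheduler = ReduceLROnPlateau(optimizer, mode='min', factor=0.1, patience=5)

    for epoch in range(100):
        train(...)
        val_loss = validate(...)
        scheduler.step(val_loss)  # pass the monitored metric to step()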