python, optimization, pytorch, learning-rate

PyTorch: change the learning rate based on the number of epochs


I set the learning rate, but the accuracy stops increasing after training for a few epochs:

optimizer = optim.Adam(model.parameters(), lr=1e-4)

n_epochs = 10
for i in range(n_epochs):

    # some training here

If I want to use step decay (reduce the learning rate by a factor of 10 every 5 epochs), how can I do so?


Solution

  • You can use the learning-rate scheduler torch.optim.lr_scheduler.StepLR:

    from torch.optim.lr_scheduler import StepLR
    scheduler = StepLR(optimizer, step_size=5, gamma=0.1)
    

    It decays the learning rate of each parameter group by gamma every step_size epochs (see the StepLR docs). Example from the docs:

    # Assuming optimizer uses lr = 0.05 for all groups
    # lr = 0.05     if epoch < 30
    # lr = 0.005    if 30 <= epoch < 60
    # lr = 0.0005   if 60 <= epoch < 90
    # ...
    scheduler = StepLR(optimizer, step_size=30, gamma=0.1)
    for epoch in range(100):
        train(...)
        validate(...)
        scheduler.step()
    
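    Adapted to the setup in the question, a minimal sketch of the whole loop might look like the following. The model, loss, and data below are placeholders, not from the original post; note that since PyTorch 1.1, scheduler.step() should be called after optimizer.step():

    import torch
    import torch.nn as nn
    import torch.optim as optim
    from torch.optim.lr_scheduler import StepLR

    model = nn.Linear(10, 2)                      # placeholder model
    criterion = nn.CrossEntropyLoss()             # placeholder loss
    data = [(torch.randn(8, 10), torch.randint(0, 2, (8,)))
            for _ in range(4)]                    # placeholder batches

    optimizer = optim.Adam(model.parameters(), lr=1e-4)
    scheduler = StepLR(optimizer, step_size=5, gamma=0.1)  # lr *= 0.1 every 5 epochs

    n_epochs = 10
    for i in range(n_epochs):
        for inputs, targets in data:
            optimizer.zero_grad()
            loss = criterion(model(inputs), targets)
            loss.backward()
            optimizer.step()
        scheduler.step()    # once per epoch, after optimizer.step()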

    Example printing the learning rate at each epoch:

    import torch
    import torch.optim as optim

    optimizer = optim.SGD([torch.rand((2, 2), requires_grad=True)], lr=0.1)
    scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=5, gamma=0.1)

    for epoch in range(1, 21):
        print('Epoch-{0} lr: {1}'.format(epoch, optimizer.param_groups[0]['lr']))
        optimizer.step()    # stands in for a real training step
        scheduler.step()    # decays the lr by gamma every step_size epochs
        if epoch % 5 == 0:
            print()
    
    Epoch-1 lr: 0.1
    Epoch-2 lr: 0.1
    Epoch-3 lr: 0.1
    Epoch-4 lr: 0.1
    Epoch-5 lr: 0.1
    
    Epoch-6 lr: 0.010000000000000002
    Epoch-7 lr: 0.010000000000000002
    Epoch-8 lr: 0.010000000000000002
    Epoch-9 lr: 0.010000000000000002
    Epoch-10 lr: 0.010000000000000002
    
    Epoch-11 lr: 0.0010000000000000002
    Epoch-12 lr: 0.0010000000000000002
    Epoch-13 lr: 0.0010000000000000002
    Epoch-14 lr: 0.0010000000000000002
    Epoch-15 lr: 0.0010000000000000002
    
    Epoch-16 lr: 0.00010000000000000003
    Epoch-17 lr: 0.00010000000000000003
    Epoch-18 lr: 0.00010000000000000003
    Epoch-19 lr: 0.00010000000000000003
    Epoch-20 lr: 0.00010000000000000003
    
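    (The long decimals such as 0.010000000000000002 are ordinary floating-point rounding from repeatedly multiplying the learning rate by gamma; they are harmless.)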

    For more, see How to adjust Learning Rate: torch.optim.lr_scheduler provides several methods to adjust the learning rate based on the number of epochs.
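    Since the question describes accuracy plateauing, another scheduler from the same module worth knowing is torch.optim.lr_scheduler.ReduceLROnPlateau, which cuts the learning rate only when a monitored metric stops improving. A minimal sketch, where the validation accuracies are fabricated for illustration:

    import torch
    import torch.optim as optim

    optimizer = optim.SGD([torch.rand((2, 2), requires_grad=True)], lr=0.1)
    # mode='max' because higher accuracy is better; wait 2 epochs before decaying
    scheduler = optim.lr_scheduler.ReduceLROnPlateau(
        optimizer, mode='max', factor=0.1, patience=2)

    # Fabricated validation accuracies: improvement stalls after epoch 3
    val_accuracies = [0.60, 0.70, 0.75, 0.75, 0.75, 0.75, 0.75]
    for epoch, acc in enumerate(val_accuracies, start=1):
        optimizer.step()        # stands in for a real training epoch
        scheduler.step(acc)     # pass the monitored metric to step()
        print('Epoch-{0} lr: {1}'.format(epoch, optimizer.param_groups[0]['lr']))

    Unlike StepLR, the decay here happens on a schedule driven by the metric rather than by a fixed epoch count.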