【Question Title】: Pytorch Change the learning rate based on number of epochs
【Posted】: 2020-05-19 21:06:36
【Question Description】:

I set a learning rate, but found that the accuracy stops improving after training for a few epochs:

import torch.optim as optim

optimizer = optim.Adam(model.parameters(), lr=1e-4)

n_epochs = 10
for i in range(n_epochs):

    # some training here

If I want to use step decay, reducing the learning rate by a factor of 10 every 5 epochs, how can I do that?

【Question Comments】:

    Tags: optimization pytorch learning-rate


    【Solution 1】:

    You can use the learning-rate scheduler torch.optim.lr_scheduler.StepLR:

    from torch.optim.lr_scheduler import StepLR
    scheduler = StepLR(optimizer, step_size=5, gamma=0.1)
    

    StepLR decays the learning rate of each parameter group by gamma every step_size epochs (see the docs here). Example from the docs:

    # Assuming optimizer uses lr = 0.05 for all groups
    # lr = 0.05     if epoch < 30
    # lr = 0.005    if 30 <= epoch < 60
    # lr = 0.0005   if 60 <= epoch < 90
    # ...
    scheduler = StepLR(optimizer, step_size=30, gamma=0.1)
    for epoch in range(100):
        train(...)
        validate(...)
        scheduler.step()
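
    Applied to the setup in the question (Adam at lr=1e-4, cut by 10x every 5 epochs), a minimal sketch could look like the following; train_one_epoch is a hypothetical placeholder for whatever training code you already have:

    import torch.optim as optim
    from torch.optim.lr_scheduler import StepLR

    optimizer = optim.Adam(model.parameters(), lr=1e-4)
    scheduler = StepLR(optimizer, step_size=5, gamma=0.1)

    n_epochs = 10
    for i in range(n_epochs):
        train_one_epoch()    # hypothetical: your existing training code for one epoch
        scheduler.step()     # lr becomes 1e-5 after epoch 5 and 1e-6 after epoch 10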
    

    Example:

    import torch
    import torch.optim as optim
    
    optimizer = optim.SGD([torch.rand((2,2), requires_grad=True)], lr=0.1)
    scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=5, gamma=0.1)
    
    for epoch in range(1, 21):
        print('Epoch-{0} lr: {1}'.format(epoch, optimizer.param_groups[0]['lr']))
        if epoch % 5 == 0:
            print()  # blank line between decay stages
        scheduler.step()  # stepped at the end of the epoch; in real training, optimizer.step() comes first

    Output:

    Epoch-1 lr: 0.1
    Epoch-2 lr: 0.1
    Epoch-3 lr: 0.1
    Epoch-4 lr: 0.1
    Epoch-5 lr: 0.1
    
    Epoch-6 lr: 0.010000000000000002
    Epoch-7 lr: 0.010000000000000002
    Epoch-8 lr: 0.010000000000000002
    Epoch-9 lr: 0.010000000000000002
    Epoch-10 lr: 0.010000000000000002
    
    Epoch-11 lr: 0.0010000000000000002
    Epoch-12 lr: 0.0010000000000000002
    Epoch-13 lr: 0.0010000000000000002
    Epoch-14 lr: 0.0010000000000000002
    Epoch-15 lr: 0.0010000000000000002
    
    Epoch-16 lr: 0.00010000000000000003
    Epoch-17 lr: 0.00010000000000000003
    Epoch-18 lr: 0.00010000000000000003
    Epoch-19 lr: 0.00010000000000000003
    Epoch-20 lr: 0.00010000000000000003
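
    (Trailing digits such as 0.010000000000000002 are ordinary floating-point rounding from the repeated multiplication by gamma, not a bug.)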
    

    More at How to adjust Learning Rate: torch.optim.lr_scheduler provides several methods to adjust the learning rate based on the number of epochs.
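
    For example, when the decay points are not evenly spaced, MultiStepLR from the same module takes explicit milestone epochs; a small sketch (the milestone values are only illustrative):

    from torch.optim.lr_scheduler import MultiStepLR

    # Multiplies the lr by gamma once at each milestone epoch.
    scheduler = MultiStepLR(optimizer, milestones=[5, 8], gamma=0.1)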

    【Discussion】:

    • optimizer.step() is also needed, because scheduler.step() only adjusts the learning rate. Writing this in case someone misses it.
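
    A short sketch of where each call belongs (data_loader and loss_fn are placeholders, not from the question): optimizer.step() runs once per batch to update the weights, and scheduler.step() runs once per epoch, after the optimizer has stepped:

    for epoch in range(n_epochs):
        for inputs, targets in data_loader:    # hypothetical DataLoader
            optimizer.zero_grad()
            loss = loss_fn(model(inputs), targets)
            loss.backward()
            optimizer.step()                   # weight update, once per batch
        scheduler.step()                       # lr update, once per epoch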