PyTorch: learning rate rises linearly


With the SGD optimizer and a StepLR schedule set up as below, the logged learning rate rises linearly instead of decaying:

# args.lr = 0.02, args.momentum = 0.9, args.weight_decay = 10e-4
optimizer = torch.optim.SGD(
    params, lr=args.lr, momentum=args.momentum, weight_decay=args.weight_decay)
lr_scheduler = torch.optim.lr_scheduler.StepLR(
    optimizer, step_size=args.lr_step_size, gamma=args.lr_gamma)
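
StepLR multiplies the learning rate by gamma every step_size epochs, so with gamma < 1 it should only ever shrink it. A quick way to see what the scheduler itself produces is to step it against a throwaway optimizer and print the value after each epoch. This is only a minimal sketch, not the original training script: the concrete numbers (lr=0.02, momentum=0.9, weight_decay=10e-4, step_size=8, gamma=0.1) stand in for the args.* values above.

import torch

# Minimal sketch: dummy parameter, assumed values in place of args.*
params = [torch.nn.Parameter(torch.zeros(1))]
optimizer = torch.optim.SGD(params, lr=0.02, momentum=0.9, weight_decay=10e-4)
lr_scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=8, gamma=0.1)

for epoch in range(20):
    optimizer.step()      # optimizer.step() first, then the scheduler
    lr_scheduler.step()   # advance the schedule once per epoch
    print(epoch, optimizer.param_groups[0]["lr"])  # expected: 0.02 -> 0.002 -> 0.0002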

# With MultiStepLR the learning rate rises as well
lr_scheduler = torch.optim.lr_scheduler.MultiStepLR(
    optimizer, milestones=args.lr_steps, gamma=args.lr_gamma)
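
The same check works for MultiStepLR, which multiplies the learning rate by gamma at each epoch listed in milestones. Again a minimal sketch, with made-up milestones [16, 22] standing in for args.lr_steps:

import torch

# Minimal sketch: assumed milestones/gamma in place of args.lr_steps / args.lr_gamma
params = [torch.nn.Parameter(torch.zeros(1))]
optimizer = torch.optim.SGD(params, lr=0.02, momentum=0.9, weight_decay=10e-4)
lr_scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[16, 22], gamma=0.1)

lrs = []
for epoch in range(26):
    optimizer.step()
    lr_scheduler.step()
    lrs.append(lr_scheduler.get_last_lr()[0])  # lr in effect after this epoch
print(lrs)  # should drop by 10x at each milestone, never rise

If both isolated runs decay as expected but the curves below still climb, the increase must come from something outside these scheduler calls (for instance a separate linear warmup applied at the start of training), which is worth ruling out before blaming StepLR/MultiStepLR.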

[figure: learning-rate curve, SGD + StepLR]

[figure: learning-rate curve, SGD + MultiStepLR]