define an Adam optimizer object with a learning rate of 1e-4:

from torch import optim
opt = optim.Adam(model_resnet18.parameters(), lr=1e-4)

we can read the current value of the learning rate using the following function:

def get_lr(opt):
    # each param_group stores its own lr; return the first one
    for param_group in opt.param_groups:
        return param_group['lr']

current_lr = get_lr(opt)
print('current lr = {}'.format(current_lr))

define a learning rate scheduler using the CosineAnnealingLR method:

from torch.optim.lr_scheduler import CosineAnnealingLR
lr_scheduler = CosineAnnealingLR(opt, T_max=2, eta_min=1e-5)
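once the scheduler is defined, step() is typically called once per epoch so the learning rate follows the cosine curve between the initial lr and eta_min. a minimal sketch of that loop (the small nn.Linear model here is just a hypothetical stand-in for model_resnet18, and the loop body omits the actual training):

```python
import torch
from torch import nn, optim
from torch.optim.lr_scheduler import CosineAnnealingLR

# hypothetical stand-in model; any nn.Module with parameters works
model = nn.Linear(10, 2)
opt = optim.Adam(model.parameters(), lr=1e-4)
lr_scheduler = CosineAnnealingLR(opt, T_max=2, eta_min=1e-5)

def get_lr(opt):
    # each param_group stores its own lr; return the first one
    for param_group in opt.param_groups:
        return param_group['lr']

lrs = []
for epoch in range(4):
    # ... forward pass, loss, backward pass, opt.step() go here ...
    lrs.append(get_lr(opt))
    lr_scheduler.step()  # update the learning rate once per epoch

# with T_max=2, the lr anneals from 1e-4 down to eta_min=1e-5 over
# 2 epochs, then rises back: lrs ≈ [1e-4, 5.5e-5, 1e-5, 5.5e-5]
```

with T_max=2 the cosine cycle is short, which makes the oscillation easy to see; in practice T_max is often set to the total number of training epochs so the lr decays once to eta_min.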

※ reference: PyTorch Computer Vision Cookbook
