Define an Adam optimizer object with a learning rate of 1e-4:
from torch import optim
opt = optim.Adam(model_resnet18.parameters(), lr=1e-4)
We can read the current value of the learning rate using the following helper function:
def get_lr(opt):
    # each param_group holds its own 'lr'; return the first one
    for param_group in opt.param_groups:
        return param_group['lr']

current_lr = get_lr(opt)
print('current lr = {}'.format(current_lr))
Define a learning rate scheduler using the CosineAnnealingLR method:
from torch.optim.lr_scheduler import CosineAnnealingLR
lr_scheduler = CosineAnnealingLR(opt, T_max=2, eta_min=1e-5)
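To see the cosine schedule in action, we can call `scheduler.step()` once per epoch and read the learning rate back with `get_lr`. With `T_max=2`, the learning rate anneals from 1e-4 down to `eta_min=1e-5` over two epochs, then rises back along the cosine curve. A minimal sketch, using a small `nn.Linear` model as a stand-in for `model_resnet18`:

```python
from torch import nn, optim
from torch.optim.lr_scheduler import CosineAnnealingLR

# stand-in model; in the post this would be model_resnet18
model = nn.Linear(10, 2)
opt = optim.Adam(model.parameters(), lr=1e-4)
lr_scheduler = CosineAnnealingLR(opt, T_max=2, eta_min=1e-5)

def get_lr(opt):
    for param_group in opt.param_groups:
        return param_group['lr']

lrs = []
for epoch in range(4):
    # ... training and validation for one epoch would go here ...
    opt.step()            # normally called per batch inside the epoch
    lr_scheduler.step()   # advance the cosine schedule by one epoch
    lrs.append(get_lr(opt))

print(lrs)  # lr dips to eta_min at epoch 2, then anneals back up
```

Note that `scheduler.step()` is called after the optimizer updates for the epoch, not before; PyTorch emits a warning if the order is reversed.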
※ Reference: PyTorch Computer Vision Cookbook