ExponentialLR

class torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma, last_epoch=-1, verbose=False)

Decays the learning rate of each parameter group by gamma every epoch. When last_epoch=-1, the initial learning rate is set to lr. See the usage sketch after the parameter list.
- Parameters
  - optimizer (Optimizer) – Wrapped optimizer.
  - gamma (float) – Multiplicative factor of learning rate decay.
  - last_epoch (int) – The index of the last epoch. Default: -1.
  - verbose (bool) – If True, prints a message to stdout for each update. Default: False.
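A minimal usage sketch: the model, data-free training loop, and gamma=0.9 below are illustrative assumptions, not part of the API. Each call to scheduler.step() multiplies every parameter group's learning rate by gamma, so the rate follows lr * gamma ** epoch.

```python
import torch
from torch import nn, optim
from torch.optim.lr_scheduler import ExponentialLR

# Illustrative model and optimizer; any torch.optim.Optimizer works.
model = nn.Linear(10, 1)
optimizer = optim.SGD(model.parameters(), lr=0.1)

# After each epoch, lr becomes lr * gamma: 0.1, 0.09, 0.081, ...
scheduler = ExponentialLR(optimizer, gamma=0.9)

for epoch in range(5):
    # ... forward pass, loss.backward(), etc. would go here ...
    optimizer.step()      # placeholder optimizer update
    scheduler.step()      # decay the learning rate once per epoch
    print(epoch, scheduler.get_last_lr())  # e.g. [0.09], [0.081], ...
```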
get_last_lr()

Return the last learning rate computed by the current scheduler.
load_state_dict(state_dict)

Load the scheduler's state.

- Parameters
  - state_dict (dict) – scheduler state. Should be an object returned from a call to state_dict().
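A brief checkpointing sketch using state_dict() and load_state_dict(); the model, optimizer, and file path are illustrative assumptions.

```python
import torch
from torch import nn, optim
from torch.optim.lr_scheduler import ExponentialLR

model = nn.Linear(10, 1)                      # illustrative model
optimizer = optim.SGD(model.parameters(), lr=0.1)
scheduler = ExponentialLR(optimizer, gamma=0.9)

# Save optimizer and scheduler state together (path is illustrative).
torch.save({"optimizer": optimizer.state_dict(),
            "scheduler": scheduler.state_dict()}, "checkpoint.pt")

# Later: rebuild the objects, then restore their state.
checkpoint = torch.load("checkpoint.pt")
optimizer.load_state_dict(checkpoint["optimizer"])
scheduler.load_state_dict(checkpoint["scheduler"])
```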
print_lr(is_verbose, group, lr, epoch=None)

Display the current learning rate.