stable_learning_control.algos.pytorch.common
Contains several functions that are used across all the RL algorithms.
Functions
get_lr_scheduler(optimizer, decaying_lr_type, lr_start, lr_final, steps) – Creates a learning rate scheduler.
Package Contents
- stable_learning_control.algos.pytorch.common.get_lr_scheduler(optimizer, decaying_lr_type, lr_start, lr_final, steps)[source]
Creates a learning rate scheduler.
- Parameters:
optimizer (torch.optim.Adam) – Wrapped optimizer.
decaying_lr_type (str) – The learning rate decay type that is used (options are: linear, exponential and constant).
lr_start (float) – Initial learning rate.
lr_final (float) – Final learning rate.
steps (int, optional) – Number of steps/epochs used in the training. This includes the starting step/epoch.
- Returns:
A learning rate scheduler object.
See also
See the PyTorch documentation on how to implement other decay options.
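Since the package source is not shown on this page, the following is a minimal sketch of how such a factory could be built on top of torch.optim.lr_scheduler.LambdaLR. The function name and signature mirror the documented API above, but the exact decay curves are an assumption, not the package's actual implementation.

```python
from torch import nn, optim


def get_lr_scheduler(optimizer, decaying_lr_type, lr_start, lr_final, steps):
    """Illustrative sketch only; the real package implementation may differ."""
    last = max(steps - 1, 1)  # `steps` includes the starting step/epoch
    decaying_lr_type = decaying_lr_type.lower()
    if decaying_lr_type == "constant":
        # A multiplicative factor of 1.0 keeps the learning rate fixed.
        lr_lambda = lambda step: 1.0
    elif decaying_lr_type == "linear":
        # Interpolate linearly from lr_start down to lr_final.
        lr_lambda = lambda step: 1.0 - min(step, last) / last * (1.0 - lr_final / lr_start)
    elif decaying_lr_type == "exponential":
        # Geometric decay that reaches lr_final exactly at the last step.
        gamma = (lr_final / lr_start) ** (1.0 / last)
        lr_lambda = lambda step: gamma ** min(step, last)
    else:
        raise ValueError(f"Unknown decaying_lr_type: {decaying_lr_type!r}")
    return optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=lr_lambda)


# Example usage: decay Adam's learning rate from 1e-3 to 1e-4 over 100 epochs.
model = nn.Linear(4, 2)
optimizer = optim.Adam(model.parameters(), lr=1e-3)  # lr here equals lr_start
scheduler = get_lr_scheduler(optimizer, "exponential", 1e-3, 1e-4, 100)
for epoch in range(100):
    optimizer.step()   # ...training step would go here...
    scheduler.step()   # update the learning rate once per epoch
```

LambdaLR is a natural fit here because it multiplies the optimizer's initial learning rate by a user-supplied factor each step, so all three documented decay types reduce to choosing a different factor function.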