stable_learning_control.algos.pytorch.common.get_lr_scheduler
Contains functions used for creating PyTorch learning rate schedulers.
Classes
- ConstantLRScheduler – A learning rate scheduler that keeps the learning rate constant.
Functions
- get_exponential_decay_rate – Calculates the exponential decay rate needed to go from an initial learning rate to a final learning rate in N steps.
- get_linear_decay_rate – Returns a linear decay factor (G) that enables a learning rate to transition from an initial value at step 0 to a final value at step N.
- get_lr_scheduler – Creates a learning rate scheduler.
- estimate_step_learning_rate – Estimates the learning rate at a given step.
Module Contents
- class stable_learning_control.algos.pytorch.common.get_lr_scheduler.ConstantLRScheduler(optimizer)[source]
Bases: torch.optim.lr_scheduler.LambdaLR
A learning rate scheduler that keeps the learning rate constant.
Initialize the constant learning rate scheduler.
- Parameters:
optimizer (torch.optim.Optimizer) – The wrapped optimizer.
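A minimal usage sketch (not part of the package documentation) showing how the scheduler might be attached to an optimizer; the model and learning rate below are illustrative placeholders:

```python
import torch

from stable_learning_control.algos.pytorch.common.get_lr_scheduler import (
    ConstantLRScheduler,
)

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=3e-4)
scheduler = ConstantLRScheduler(optimizer)

for epoch in range(5):
    # ... one training epoch would go here ...
    optimizer.step()
    scheduler.step()  # The learning rate stays at 3e-4.

print(scheduler.get_last_lr())  # [0.0003]
```

Because the class derives from torch.optim.lr_scheduler.LambdaLR, calling step() is harmless but leaves the learning rate unchanged.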
- stable_learning_control.algos.pytorch.common.get_lr_scheduler.get_exponential_decay_rate(lr_start, lr_final, steps)[source]
Calculates the exponential decay rate needed to go from an initial learning rate to a final learning rate in N steps.
- Parameters:
lr_start (float) – The initial learning rate.
lr_final (float) – The final learning rate.
steps (int) – The number of steps (N) over which the learning rate decays.
- Returns:
The exponential decay rate (high precision).
- Return type:
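A hedged sketch of how the returned rate could be combined with PyTorch's ExponentialLR; it assumes the returned value is the per-step multiplicative factor (gamma) expected by that scheduler, and the hyperparameters are illustrative:

```python
import torch

from stable_learning_control.algos.pytorch.common.get_lr_scheduler import (
    get_exponential_decay_rate,
)

lr_start, lr_final, steps = 1e-3, 1e-5, 100

# Per-step factor gamma such that lr_start * gamma**steps == lr_final.
gamma = get_exponential_decay_rate(lr_start, lr_final, steps)

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=lr_start)
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=float(gamma))

for step in range(steps):
    optimizer.step()
    scheduler.step()

print(scheduler.get_last_lr())  # Approximately [1e-05].
```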
- stable_learning_control.algos.pytorch.common.get_lr_scheduler.get_linear_decay_rate(lr_init, lr_final, steps)[source]
Returns a linear decay factor (G) that enables a learning rate to transition from an initial value (lr_init) at step 0 to a final value (lr_final) at a specified step (N). This decay factor is compatible with the torch.optim.lr_scheduler.LambdaLR scheduler. The decay factor is calculated using the following formula:

lr_final = lr_init * (1.0 - G * N), which gives G = (1.0 - lr_final / lr_init) / N

- Parameters:
lr_init (float) – The initial learning rate.
lr_final (float) – The final learning rate.
steps (int) – The step (N) at which the learning rate should reach lr_final.
- Returns:
Linear learning rate decay factor (G).
- Return type:
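A sketch of how the factor might be plugged into LambdaLR; the key assumption is that G is the per-step slope, so the LambdaLR multiplier is 1.0 - G * step (1.0 at step 0 and lr_final / lr_init at step N):

```python
import torch

from stable_learning_control.algos.pytorch.common.get_lr_scheduler import (
    get_linear_decay_rate,
)

lr_init, lr_final, steps = 1e-3, 1e-5, 100
decay_factor = float(get_linear_decay_rate(lr_init, lr_final, steps))

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=lr_init)
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer, lr_lambda=lambda step: 1.0 - decay_factor * step
)

for step in range(steps):
    optimizer.step()
    scheduler.step()

print(scheduler.get_last_lr())  # Approximately [1e-05].
```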
- stable_learning_control.algos.pytorch.common.get_lr_scheduler.get_lr_scheduler(optimizer, decaying_lr_type, lr_start, lr_final, steps)[source]
Creates a learning rate scheduler.
- Parameters:
optimizer (torch.optim.Adam) – Wrapped optimizer.
decaying_lr_type (str) – The learning rate decay type that is used (options are: linear, exponential and constant).
lr_start (float) – Initial learning rate.
lr_final (float) – Final learning rate.
steps (int, optional) – Number of steps/epochs used during training. This includes the starting step/epoch.
- Returns:
A learning rate scheduler object.
- Return type:
See also
See the PyTorch documentation on how to implement other decay options.
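A hedged sketch of how the factory might be called; the hyperparameter values are illustrative and the example assumes the returned scheduler follows the usual PyTorch step() interface:

```python
import torch

from stable_learning_control.algos.pytorch.common.get_lr_scheduler import (
    get_lr_scheduler,
)

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = get_lr_scheduler(
    optimizer, decaying_lr_type="exponential", lr_start=1e-3, lr_final=1e-5, steps=100
)

for epoch in range(100):
    # ... one training epoch would go here ...
    optimizer.step()
    scheduler.step()

print(scheduler.get_last_lr())  # Close to lr_final (1e-05).
```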
- stable_learning_control.algos.pytorch.common.get_lr_scheduler.estimate_step_learning_rate(lr_scheduler, lr_start, lr_final, update_after, total_steps, step)[source]
Estimates the learning rate at a given step.
This function estimates the learning rate for a specific training step. It differs from the get_last_lr method of the learning rate scheduler, which returns the learning rate at the last scheduler step, not necessarily the current training step.
- Parameters:
lr_scheduler (torch.optim.lr_scheduler) – The learning rate scheduler.
lr_start (float) – The initial learning rate.
lr_final (float) – The final learning rate.
update_after (int) – The step number after which the learning rate should start decreasing.
total_steps (int) – The total number of steps/epochs in the training process. Excludes the initial step.
step (int) – The current step number. Excludes the initial step.
- Returns:
The learning rate at the given step.
- Return type:
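A sketch of querying the expected learning rate halfway through training; the scheduler, hyperparameters, and update_after value are illustrative assumptions:

```python
import torch

from stable_learning_control.algos.pytorch.common.get_lr_scheduler import (
    estimate_step_learning_rate,
    get_lr_scheduler,
)

lr_start, lr_final, total_steps = 1e-3, 1e-5, 100

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=lr_start)
scheduler = get_lr_scheduler(optimizer, "exponential", lr_start, lr_final, total_steps)

# Estimate the learning rate the scheduler would yield at training step 50,
# assuming decay starts right away (update_after=0).
lr_at_step_50 = estimate_step_learning_rate(
    scheduler,
    lr_start,
    lr_final,
    update_after=0,
    total_steps=total_steps,
    step=50,
)
print(lr_at_step_50)
```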