stable_learning_control.algos.pytorch.common

Contains several functions that are used across all the PyTorch RL algorithms.

Submodules

get_lr_scheduler
    Contains functions used for creating PyTorch learning rate schedulers.

Package Contents

Functions

get_lr_scheduler(optimizer, decaying_lr_type, lr_start, lr_final, steps)
    Creates a learning rate scheduler.

stable_learning_control.algos.pytorch.common.get_lr_scheduler(optimizer, decaying_lr_type, lr_start, lr_final, steps)

Creates a learning rate scheduler.

Parameters:
  • optimizer (torch.optim.Optimizer) – Wrapped optimizer.

  • decaying_lr_type (str) – The type of learning rate decay to use (options: linear, exponential and constant).

  • lr_start (float) – Initial learning rate.

  • lr_final (float) – Final learning rate.

  • steps (int, optional) – Number of steps/epochs used during training, including the starting step/epoch.

Returns:

A learning rate scheduler object.

Return type:

A scheduler instance from the torch.optim.lr_scheduler module.
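
A minimal usage sketch is given below. The network, optimizer, schedule values, and training loop are illustrative assumptions, not part of the package:

    import torch

    from stable_learning_control.algos.pytorch.common import get_lr_scheduler

    # Illustrative network and optimizer; any torch.optim optimizer can be wrapped.
    policy = torch.nn.Linear(8, 2)
    optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)

    # Decay the learning rate linearly from 1e-3 to 1e-4 over 100 epochs.
    lr_scheduler = get_lr_scheduler(
        optimizer, decaying_lr_type="linear", lr_start=1e-3, lr_final=1e-4, steps=100
    )

    for epoch in range(100):
        # ... run one epoch of training, calling optimizer.step() as usual ...
        lr_scheduler.step()  # Advance the learning rate along the decay schedule.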

See also

See the PyTorch documentation (https://pytorch.org/docs/stable/optim.html) on how to implement other decay options.
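
For example, a decay option not covered by get_lr_scheduler, such as cosine annealing, can be built directly from PyTorch's built-in schedulers. A sketch (the network, optimizer, and schedule values are illustrative):

    import torch

    model = torch.nn.Linear(8, 2)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    # Anneal the learning rate along a cosine curve over 100 epochs,
    # bottoming out at 1e-4 rather than decaying linearly or exponentially.
    scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(
        optimizer, T_max=100, eta_min=1e-4
    )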