stable_learning_control.algos.tf2.common.get_lr_scheduler

Module used for creating TensorFlow learning rate schedulers.

Module Contents

Functions

get_lr_scheduler(decaying_lr_type, lr_start, lr_final, ...)

Creates a learning rate scheduler.

Attributes

tf

stable_learning_control.algos.tf2.common.get_lr_scheduler.tf[source]
stable_learning_control.algos.tf2.common.get_lr_scheduler.get_lr_scheduler(decaying_lr_type, lr_start, lr_final, steps)[source]

Creates a learning rate scheduler.

Parameters:
  • decaying_lr_type (str) – The learning rate decay type that is used (options are: linear, exponential, and constant).

  • lr_start (float) – Initial learning rate.

  • lr_final (float) – Final learning rate.

  • steps (int, optional) – Number of steps/epochs used during training. This includes the starting step/epoch.

Returns:

A learning rate scheduler object.

Return type:

tf.keras.optimizers.schedules.LearningRateSchedule
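
Example (a minimal usage sketch, not part of the module source; it assumes the function is imported from this module and that the returned schedule is accepted by a Keras optimizer, as is standard for tf.keras.optimizers.schedules.LearningRateSchedule objects):

    import tensorflow as tf

    from stable_learning_control.algos.tf2.common.get_lr_scheduler import (
        get_lr_scheduler,
    )

    # Decay the learning rate exponentially from 1e-3 to 1e-5 over 100 epochs.
    lr_scheduler = get_lr_scheduler(
        decaying_lr_type="exponential",
        lr_start=1e-3,
        lr_final=1e-5,
        steps=100,
    )

    # Pass the schedule directly to a Keras optimizer.
    optimizer = tf.keras.optimizers.Adam(learning_rate=lr_scheduler)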

See also

See the TensorFlow documentation on how to implement other decay options.
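
For instance, a polynomial decay schedule can be built directly with TensorFlow's built-in schedule classes (a sketch using the standard tf.keras API, not a feature of this module):

    import tensorflow as tf

    # Polynomial (here quadratic) decay from 1e-3 to 1e-5 over 100 steps.
    lr_scheduler = tf.keras.optimizers.schedules.PolynomialDecay(
        initial_learning_rate=1e-3,
        decay_steps=100,
        end_learning_rate=1e-5,
        power=2.0,
    )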