stable_learning_control.utils.mpi_utils.mpi_tf2
Helper methods for managing TF2 MPI processes.
Note
This module is not yet translated to TF2. It is not used by any of the current algorithms, but is kept here for future reference.
Classes
MpiAdamOptimizer — Adam optimizer that averages gradients across MPI processes.
Functions
sync_all_params() — Sync all tf variables across MPI processes.
Module Contents
- stable_learning_control.utils.mpi_utils.mpi_tf2.sync_all_params()[source]
Sync all tf variables across MPI processes.
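Syncing parameters across MPI processes typically means broadcasting the root rank's variable values so every worker starts from identical weights. The sketch below illustrates that idea with plain NumPy standing in for an MPI `Bcast` (the function name and list-of-lists layout are illustrative, not part of this module's API):

```python
import numpy as np


def sync_params_across_processes(params_per_process):
    """Simulate parameter syncing: every rank adopts rank 0's values,
    as an MPI broadcast from the root process would.

    ``params_per_process`` is a list with one entry per rank, each entry
    being a list of numpy arrays (one per variable).
    """
    root_params = params_per_process[0]
    # Every rank receives a copy of the root rank's parameters.
    return [[p.copy() for p in root_params] for _ in params_per_process]
```

In a real MPI setup each process would hold only its own parameter list and call something like `MPI.COMM_WORLD.Bcast(buf, root=0)` on a flattened buffer instead.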
- class stable_learning_control.utils.mpi_utils.mpi_tf2.MpiAdamOptimizer(**kwargs)[source]
Bases:
object
Adam optimizer that averages gradients across MPI processes.
The compute_gradients method is taken from Baselines MpiAdamOptimizer. For documentation on the method arguments, see the TensorFlow docs page for the base AdamOptimizer class.
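The core of an MPI-aware Adam optimizer is gradient averaging: each rank computes local gradients, they are flattened into one buffer, summed across ranks (an `Allreduce`), divided by the number of processes, and unflattened before the usual Adam update is applied. The sketch below shows that flatten/average/unflatten step with NumPy standing in for the `Allreduce`; the function name and input layout are illustrative assumptions, not this module's actual API:

```python
import numpy as np


def average_gradients(grads_per_process):
    """Average per-rank gradient lists, mimicking the flatten ->
    MPI Allreduce -> unflatten pattern used by Baselines' MpiAdamOptimizer.

    ``grads_per_process``: list with one entry per rank, each a list of
    numpy arrays with matching shapes across ranks.
    """
    shapes = [g.shape for g in grads_per_process[0]]
    sizes = [int(np.prod(s)) for s in shapes]
    # Flatten each rank's gradients into a single contiguous buffer.
    flat = np.stack(
        [np.concatenate([g.ravel() for g in grads]) for grads in grads_per_process]
    )
    # In real MPI this is an Allreduce (sum) followed by division by the
    # number of processes; here we simply average over the stacked rows.
    avg_flat = flat.mean(axis=0)
    # Unflatten back into the original per-variable shapes.
    out, idx = [], 0
    for shape, size in zip(shapes, sizes):
        out.append(avg_flat[idx:idx + size].reshape(shape))
        idx += size
    return out
```

The averaged gradients would then be passed to a standard Adam `apply_gradients` step on every rank, keeping the workers' weights in lockstep.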