sketchgraphs_models.distributed_utils
Utility functions for distributed (multi-gpu) training.
Classes

- Utility class which adapts a sampler into a distributed sampler, yielding only the subset of the underlying sampler assigned to the current rank.
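A minimal sketch of the idea behind this adapter, assuming it wraps an arbitrary base sampler and keeps only every world_size-th position offset by the process rank; the class name RankSubsetSampler and its constructor arguments are hypothetical illustrations, not this module's actual API.

    import torch.utils.data

    class RankSubsetSampler(torch.utils.data.Sampler):
        """Hypothetical adapter: yields only the base sampler's indices assigned to this rank."""

        def __init__(self, base_sampler, rank, world_size):
            self.base_sampler = base_sampler
            self.rank = rank
            self.world_size = world_size

        def __iter__(self):
            # Divide the base sampler's stream by rank: rank r keeps positions r, r + world_size, ...
            return (idx for position, idx in enumerate(self.base_sampler)
                    if position % self.world_size == self.rank)

        def __len__(self):
            # Ceiling division: each rank sees its share of the base sampler's length.
            return (len(self.base_sampler) - self.rank + self.world_size - 1) // self.world_size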
Functions

- sketchgraphs_models.distributed_utils.get_distributed_config(parameters, local_rank=None)
- sketchgraphs_models.distributed_utils.initialize_distributed(config: sketchgraphs_models.distributed_utils.DistributedTrainingInfo)
- sketchgraphs_models.distributed_utils.is_leader(config: sketchgraphs_models.distributed_utils.DistributedTrainingInfo)
  Tests whether the current process is the leader for distributed training.
- sketchgraphs_models.distributed_utils.train_boostrap_distributed(parameters, train)
- sketchgraphs_models.distributed_utils.train_distributed(local_rank, parameters, train)
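A minimal usage sketch combining the functions above, assuming parameters is a dictionary of distributed settings; the keys shown (world_size, dist_url) and the None check on the returned config are assumptions made for illustration, not documented behaviour of this module.

    from sketchgraphs_models import distributed_utils

    # Hypothetical parameters; the exact keys consulted by get_distributed_config
    # are assumptions, not part of this documentation.
    parameters = {'world_size': 2, 'dist_url': 'tcp://127.0.0.1:23456'}

    config = distributed_utils.get_distributed_config(parameters, local_rank=0)

    if config is not None:
        # Initialize the process group before any collective communication.
        distributed_utils.initialize_distributed(config)

        if distributed_utils.is_leader(config):
            # For example, only the leader process would write logs or checkpoints.
            print('This process is the leader for distributed training.')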