returnn.torch.distributed
torch.distributed utils
- class returnn.torch.distributed.DistributedContext(options: Dict[str, Any])
  This class sets up helper functions for PyTorch distributed training.
- returnn.torch.distributed.get_ctx(config=None) → DistributedContext | None
  - Parameters:
    config (Config|None)
  - Returns:
    the global DistributedContext if Torch distributed training is enabled, or None otherwise. If the context has not been created yet, it is created automatically on first use.
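The lazy-singleton behavior described above can be sketched as follows. This is an illustrative re-implementation, not RETURNN's actual code: the module-level `_ctx` variable and the `torch_distributed` config key are assumptions made for the sketch, and a plain dict stands in for RETURNN's `Config` object.

```python
from typing import Any, Dict, Optional


class DistributedContext:
    """Illustrative stand-in for returnn.torch.distributed.DistributedContext."""

    def __init__(self, options: Dict[str, Any]):
        self.options = options


_ctx: Optional[DistributedContext] = None  # global context, created lazily


def get_ctx(config: Optional[Dict[str, Any]] = None) -> Optional[DistributedContext]:
    """Return the global context if distributed training is enabled, else None.

    Creates the context on first use (lazy singleton), mirroring the
    documented behavior of returnn.torch.distributed.get_ctx.
    """
    global _ctx
    config = config or {}
    # "torch_distributed" is an assumed config key for this sketch.
    opts = config.get("torch_distributed")
    if opts is None:
        return None  # distributed training not enabled
    if _ctx is None:
        _ctx = DistributedContext(opts)
    return _ctx
```

With this pattern, repeated calls while distributed training is enabled return the same context object, while calls without the relevant config option return None.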