neodroidvision.utilities.torch_utilities.distributing.distributed.DistributedSampler
- class neodroidvision.utilities.torch_utilities.distributing.distributed.DistributedSampler(dataset: Sized, num_replicas: Optional[int] = None, rank=None, shuffle: bool = True)
Bases: Sampler
Sampler that restricts data loading to a subset of the dataset. It is especially useful in conjunction with torch.nn.parallel.DistributedDataParallel. In such a case, each process can pass a DistributedSampler instance as a DataLoader sampler and load a subset of the original dataset that is exclusive to it. A usage sketch is given after the parameter list below.

Note: the dataset is assumed to be of constant size.

- __init__(dataset: Sized, num_replicas: Optional[int] = None, rank=None, shuffle: bool = True)
- Parameters
dataset – Dataset used for sampling.
num_replicas – (optional) Number of processes participating in distributed training.
rank – (optional) Rank of the current process within num_replicas.
shuffle – Whether to shuffle the indices each epoch (see set_epoch).
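The following is a minimal usage sketch, not an example from the library itself. It assumes that, when num_replicas and rank are left as None, they are resolved from an already initialised torch.distributed process group (mirroring torch.utils.data.distributed.DistributedSampler), and TensorDataset merely stands in for any map-style dataset.

    # Minimal sketch: construct the sampler and hand it to a DataLoader.
    # Assumes torch.distributed.init_process_group(...) has already been called
    # in each worker process; otherwise pass num_replicas and rank explicitly.
    import torch
    from torch.utils.data import DataLoader, TensorDataset

    from neodroidvision.utilities.torch_utilities.distributing.distributed import (
        DistributedSampler,
    )

    # Placeholder dataset; substitute any Sized, map-style dataset.
    dataset = TensorDataset(torch.arange(100, dtype=torch.float32))

    # Each process receives a disjoint, roughly equal-sized shard of the indices.
    sampler = DistributedSampler(dataset, shuffle=True)

    # Do not also set shuffle=True on the DataLoader; the sampler owns the ordering.
    loader = DataLoader(dataset, batch_size=8, sampler=sampler)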
Methods
- __init__(dataset[, num_replicas, rank, shuffle]) – Initialise the sampler (see the parameters above).
- set_epoch(epoch) – Set the current epoch for the sampler; used to seed the per-epoch shuffling.
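A short follow-on sketch (continuing from the construction example above) shows the intended call pattern for set_epoch. The assumption, mirroring torch.utils.data.distributed.DistributedSampler, is that the epoch value seeds the shuffle so that every process draws a consistent but epoch-varying permutation.

    # Continuing the sketch above: call set_epoch at the start of every epoch
    # so that shuffle=True produces a different ordering each epoch.
    for epoch in range(10):
        sampler.set_epoch(epoch)  # assumed to seed the per-epoch shuffle
        for (batch,) in loader:
            pass  # forward / backward / optimiser step on this process's shard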