For distributed model parallel training where a model spans multiple
servers, please refer to
`Getting Started With Distributed RPC Framework <rpc_tutorial.html>`__
for examples and details.

Basic Usage
-----------
Previous tutorials,
`Getting Started With Distributed Data Parallel <ddp_tutorial.html>`__
and `Writing Distributed Applications With PyTorch <dist_tuto.html>`__,
described `DistributedDataParallel <https://pytorch.org/docs/stable/_modules/torch/nn/parallel/distributed.html>`__,
which supports a specific training paradigm where the model is replicated across
multiple processes and each process handles a split of the input data.
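The data-parallel paradigm described above can be sketched in a few lines. This is a minimal illustration, not code from the tutorials it references; the ``gloo`` backend, address, and port choices are assumptions for a single-machine CPU run, and the model and data are placeholders::

    import os
    import torch
    import torch.distributed as dist
    import torch.nn as nn
    from torch.nn.parallel import DistributedDataParallel as DDP

    def run(rank: int, world_size: int):
        # Illustrative rendezvous settings; real jobs would set these
        # via a launcher such as torchrun.
        os.environ.setdefault("MASTER_ADDR", "localhost")
        os.environ.setdefault("MASTER_PORT", "29500")
        dist.init_process_group("gloo", rank=rank, world_size=world_size)

        model = nn.Linear(10, 5)       # same model definition in every process
        ddp_model = DDP(model)         # wraps the local replica
        opt = torch.optim.SGD(ddp_model.parameters(), lr=0.01)

        inputs = torch.randn(20, 10)   # this process's split of the input data
        loss = ddp_model(inputs).sum()
        loss.backward()                # gradients are all-reduced across replicas
        opt.step()

        dist.destroy_process_group()

Each process constructs the same model, and ``DistributedDataParallel`` synchronizes gradients during ``backward()`` so every replica takes the same optimizer step while consuming a different shard of the data.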