The bump to PyTorch 1.13 broke some tests related to multiprocessing on CPU and GPU. We get the following error:
torch.multiprocessing.spawn.ProcessRaisedException:
AttributeError: 'LightningDistributedDataParallel' object has no attribute '_sync_params'
On these tests:
tests/callbacks/learning_rate_test.py:# TODO: fix test with num_processes=2
tests/callbacks/training_timer_test.py:# TODO: fix test with num_processes=2
tests/loggers/epoch_csv_logger_test.py:# TODO: fix test with num_processes=2
tests/scripts/htr/decode_ctc_test.py:# TODO: fix test with nprocs=2
tests/scripts/htr/netout_test.py:# TODO: fix test with nprocs=2
tests/scripts/htr/train_ctc_test.py:# TODO: fix "ddp_cpu" mode
tests/scripts/htr/train_ctc_test.py:# TODO: fix "ddp" mode
tests/scripts/htr/train_ctc_test.py:# TODO: fix first assertion
I have skipped the tests for now, but I still need to investigate why we are getting this error and how to fix it.
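My current suspicion is that the `LightningDistributedDataParallel` wrapper still calls the private `DistributedDataParallel._sync_params` method, which no longer exists in PyTorch 1.13. The sketch below is a minimal, hypothetical reproduction of that failure mode (the `LegacyDDP` class and the worker setup are illustrative, not our actual code): the `AttributeError` is raised inside the spawned worker and re-raised in the parent as `ProcessRaisedException`.

```python
import os

import torch
import torch.distributed as dist
import torch.multiprocessing as mp
from torch.nn.parallel import DistributedDataParallel


class LegacyDDP(DistributedDataParallel):
    """Hypothetical wrapper that, like older Lightning DDP wrappers,
    relies on the private _sync_params method in its forward pass."""

    def forward(self, *args, **kwargs):
        # Works on PyTorch <= 1.12; raises AttributeError on 1.13+,
        # where _sync_params was removed in the DDP forward refactor.
        self._sync_params()
        return super().forward(*args, **kwargs)


def worker(rank, world_size):
    os.environ["MASTER_ADDR"] = "127.0.0.1"
    os.environ["MASTER_PORT"] = "29500"
    dist.init_process_group("gloo", rank=rank, world_size=world_size)

    model = LegacyDDP(torch.nn.Linear(4, 2))
    model(torch.randn(8, 4))  # AttributeError here on PyTorch 1.13

    dist.destroy_process_group()


if __name__ == "__main__":
    # The parent process surfaces the worker's AttributeError as
    # torch.multiprocessing.spawn.ProcessRaisedException.
    mp.spawn(worker, args=(2,), nprocs=2)
```

If that is indeed the cause, the fix is likely to stop calling the removed private method (or to upgrade/replace the wrapper) rather than to pin PyTorch below 1.13, but this still needs to be confirmed against the actual wrapper code.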