Fix missing DDP in torch distributed #3185
Conversation
metric_meters.update(metrics, n=metrics.pop(NUM_SAMPLES, 1))
self.global_step += 1
with self.model.join():
This can handle uneven amounts of data across workers, but it is only available since torch 1.7.0: https://pytorch.org/docs/stable/generated/torch.nn.parallel.DistributedDataParallel.html
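For context, a minimal sketch of how DDP's `join()` context manager tolerates a different number of batches per worker. The function name, model, and optimizer settings here are illustrative, not taken from this PR; it assumes torch >= 1.7.0 with the gloo backend.

```python
# Illustrative sketch (not this PR's code): DDP training loop wrapped in
# model.join(), which lets a rank that exhausts its data early keep
# participating in collectives until all ranks finish.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def train_one_epoch(rank, world_size, num_batches):
    # Single-machine rendezvous; address/port are placeholder values.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("gloo", rank=rank, world_size=world_size)
    model = DDP(torch.nn.Linear(4, 1))
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    # num_batches may differ per rank; join() handles the imbalance.
    with model.join():
        for _ in range(num_batches):
            opt.zero_grad()
            loss = model(torch.randn(8, 4)).sum()
            loss.backward()  # gradients are all-reduced across ranks here
            opt.step()
    weight = model.module.weight.detach().clone()
    dist.destroy_process_group()
    return weight
```

Without `join()`, a rank with fewer batches would stop calling `backward()` while the others block in all-reduce, hanging the job.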
I will remove this patch for now since our test environment doesn't have torch 1.7.0.
We can add it later if we target support for torch 1.7.0.
Is it possible to write a unit test? E.g. checking whether different workers have the same weights after training.
Added. Take a look.
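One way such a check can be written (a hypothetical helper, not necessarily the test added in this PR) is to all-gather each rank's flattened parameters and compare them against rank 0's copy:

```python
# Hypothetical helper: verify all DDP workers hold identical weights.
# Must be called while a process group is initialized, on every rank.
import torch
import torch.distributed as dist

def workers_have_same_weights(model):
    # Flatten every parameter into one vector for a single all_gather.
    flat = torch.cat([p.detach().reshape(-1) for p in model.parameters()])
    gathered = [torch.zeros_like(flat) for _ in range(dist.get_world_size())]
    dist.all_gather(gathered, flat)
    # Every rank compares all gathered copies against rank 0's.
    return all(torch.allclose(g, gathered[0]) for g in gathered)
```

Because every rank executes the same comparison, the assertion fails on all ranks if any single worker drifted out of sync.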
http://10.239.47.210:18888/view/ZOO-PR/job/ZOO-PR-Validation/4715/ One example test crashed; all other tests passed. Merging it first.
* fix ddp
* add model join
* fix
* add ut
* remove join
* remove debug msg
intel-analytics/analytics-zoo#528
Fix for torch_distributed backend.