Only import torch.distributed if it is available #35133
Conversation
cc @kwen2501 @ArthurZucker - I believe this is caused by the imports added in #34184
Yes, indeed. We vendored a simple patch in nixpkgs: https://github.com/NixOS/nixpkgs/pull/362768/files.
LGTM, I'll add it to the patch!
Looks good, thanks!
This is a continuation of 217c47e, but for another module. This issue was spotted in nixpkgs (again) when building the lm-eval package, which used a different path in the transformers library to reach the same failure. Related: huggingface#35133
What does this PR do?
`torch.distributed` is sometimes unavailable (on some platforms, such as darwin). This causes `transformers.models.auto.modeling_auto` to fail to import. I propose to implement the same logic as in huggingface/accelerate#2121 to load `torch.distributed` conditionally on its availability. I was able to test the patch in nixpkgs successfully.
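The guarded-import pattern described above can be sketched as follows. This is an illustrative sketch, not the exact code from the PR; the helper name `distributed_is_available` is hypothetical, but `torch.distributed.is_available()` is the real PyTorch runtime check for distributed support:

```python
import importlib.util


def distributed_is_available() -> bool:
    """Illustrative helper: return True only when torch.distributed
    can safely be imported.

    torch may be missing entirely, or may be built without distributed
    support (e.g. on some darwin builds), in which case
    torch.distributed.is_available() returns False.
    """
    if importlib.util.find_spec("torch") is None:
        return False
    import torch

    return torch.distributed.is_available()


# Guard distributed-only imports instead of importing unconditionally,
# so the module still imports on platforms without distributed support:
if distributed_is_available():
    import torch.distributed as dist
else:
    dist = None
```

With this guard, code that needs distributed features checks `dist is not None` at the point of use rather than failing at import time.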
Fixes #35129
Before submitting
- Did you read the contributor guidelines, Pull Request section?
- Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
- Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.