
fix torch compile with FSDP #1919

Merged — 9 commits, Sep 14, 2023

Conversation

pacman100 (Contributor)

HuggingFaceDocBuilderDev commented Sep 1, 2023

The documentation is not available anymore as the PR was closed or merged.

muellerzr (Collaborator) left a comment

Thanks! What this PR is doing makes sense to me. One general question, though: is there an existing tool we could use that checks whether a module was compiled via torch.compile?
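The check asked about above can be sketched without importing FSDP at all: torch.compile wraps the eager module in an optimized wrapper that exposes the original module as `_orig_mod` — the same attribute named in the linked ValueError. Below is a minimal, PyTorch-free sketch of that idea; `DummyModule` and `FakeCompiled` are hypothetical stand-ins (not torch or accelerate APIs), and the `_orig_mod` attribute check is an assumption about the wrapper's shape rather than a documented public API.

```python
def is_compiled_module(module) -> bool:
    """Heuristic: a torch.compile-d module carries an `_orig_mod` attribute
    pointing at the original eager module."""
    return hasattr(module, "_orig_mod")


def unwrap_compiled(module):
    """Return the underlying eager module if `module` was compiled,
    otherwise return `module` unchanged."""
    return module._orig_mod if is_compiled_module(module) else module


class DummyModule:
    """Stand-in for a plain nn.Module."""


class FakeCompiled:
    """Stand-in for the wrapper torch.compile returns."""

    def __init__(self, orig):
        self._orig_mod = orig


plain = DummyModule()
compiled = FakeCompiled(plain)

print(is_compiled_module(plain))           # False
print(is_compiled_module(compiled))        # True
print(unwrap_compiled(compiled) is plain)  # True
```

In a real codebase an `isinstance` check against the wrapper class torch exposes would be stricter than the attribute heuristic, at the cost of importing private torch internals.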

3 review threads on src/accelerate/accelerator.py (outdated, resolved)
muellerzr (Collaborator) left a comment

Thanks! LGTM. cc @BenjaminBossan so he can get a little familiar with this side of the codebase.

pacman100 marked this pull request as ready for review September 7, 2023 02:31
BenjaminBossan (Member) left a comment

Thanks for fixing this, LGTM. I have a few comments, but no blockers for merging.

3 review threads on src/accelerate/accelerator.py (2 outdated; resolved)
muellerzr requested review from BenjaminBossan and removed request for BenjaminBossan September 7, 2023 12:37
BenjaminBossan (Member) left a comment

LGTM, thanks Sourab.

2 review threads on src/accelerate/accelerator.py (outdated, resolved)
pacman100 merged commit e5452a6 into huggingface:main Sep 14, 2023
24 checks passed
pacman100 deleted the fix_fsdp_torch_compile_issue branch October 6, 2023 08:55

Successfully merging this pull request may close these issues.

ValueError: Expected _orig_mod to NOT be FullyShardedDataParallel if using an auto_wrap_policy
4 participants