
Fix LCM distillation bug when creating the guidance scale embeddings using multiple GPUs. #6279

Merged

Conversation

Contributor

@dg845 dg845 commented Dec 21, 2023

What does this PR do?

This PR fixes a bug in the LCM full model distillation scripts: when training on multiple GPUs, attempting to create the guidance scale embedding w_embedding raises an error, because unet is wrapped by accelerate for distributed training and unet.config.time_cond_proj_dim can no longer be accessed on the wrapper.

Fixes #6278. Bug discovered by @akameswa.
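
For context, here is a minimal sketch of the failure mode and a typical workaround. The tiny UNet configuration and the unwrap-based fix below are illustrative only; they are not necessarily the exact change made in this PR:

```python
from accelerate import Accelerator
from diffusers import UNet2DConditionModel

accelerator = Accelerator()

# Tiny illustrative UNet; the real distillation script loads its UNets from a
# pretrained Stable Diffusion checkpoint instead.
unet = UNet2DConditionModel(
    block_out_channels=(32, 64),
    down_block_types=("DownBlock2D", "CrossAttnDownBlock2D"),
    up_block_types=("CrossAttnUpBlock2D", "UpBlock2D"),
    cross_attention_dim=32,
    time_cond_proj_dim=256,  # enables the guidance-scale conditioning used by LCM
)
unet = accelerator.prepare(unet)

# When launched on multiple GPUs (e.g. via `accelerate launch`), `unet` is now a
# DistributedDataParallel wrapper that has no `config` attribute, so
#     embedding_dim = unet.config.time_cond_proj_dim
# raises an AttributeError. Reading the config from the unwrapped model works in
# both single- and multi-GPU runs:
embedding_dim = accelerator.unwrap_model(unet).config.time_cond_proj_dim
print(embedding_dim)  # 256
```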

Before submitting

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@patrickvonplaten
@patil-suraj
@sayakpaul

@dg845 dg845 changed the title from "Fix bug when creating the guidance embeddings using multiple GPUs." to "Fix LCM distillation bug when creating the guidance scale embeddings using multiple GPUs." on Dec 21, 2023
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@patrickvonplaten
Contributor

@patil-suraj can you take a look here?

Contributor

@patil-suraj patil-suraj left a comment


LGTM!

@patil-suraj patil-suraj merged commit 9df3d84 into huggingface:main Dec 27, 2023
14 checks passed
@dg845 dg845 deleted the lcm-distill-fix-unet-time-cond-multi-gpu branch December 27, 2023 20:24
donhardman pushed a commit to donhardman/diffusers that referenced this pull request Dec 29, 2023
…using multiple GPUs. (huggingface#6279)

Fix bug when creating the guidance embeddings using multiple GPUs.

Co-authored-by: Sayak Paul <[email protected]>
antoine-scenario pushed a commit to antoine-scenario/diffusers that referenced this pull request Jan 2, 2024
…using multiple GPUs. (huggingface#6279)

Fix bug when creating the guidance embeddings using multiple GPUs.

Co-authored-by: Sayak Paul <[email protected]>
AmericanPresidentJimmyCarter pushed a commit to AmericanPresidentJimmyCarter/diffusers that referenced this pull request Apr 26, 2024
…using multiple GPUs. (huggingface#6279)

Fix bug when creating the guidance embeddings using multiple GPUs.

Co-authored-by: Sayak Paul <[email protected]>
Development

Successfully merging this pull request may close these issues.

Could not run train_lcm_distill_sd_wds with multi-GPU setting
5 participants