Make sure DDPM and diffusers can be used without Transformers #5668

Conversation
The documentation is not available anymore as the PR was closed or merged.

@sayakpaul

But I am doing

Hmm, that should have been caught then. Checking the container.

Do we even need to import

I think it's for type hints.

Hmm, so the tests pass because torch isn't installed either, so just dummy objects are being created.

Wonder if there is a more robust method to test the dependencies here. @DN6, what would you suggest?
I think there are a couple of issues:

* Models are all imported if torch is available, otherwise we get dummies, but the Mixins can introduce additional dependencies on transformers or other soft dependencies. (Technically, torch is optional too.) So we should be careful about how we add Mixins into model classes if they are only supposed to depend on a single dependency (torch or jax).
* Adds dedicated workflows for the soft dependencies.
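The dummy-object mechanism discussed above can be sketched roughly as follows. This is a simplified illustration, not the actual diffusers implementation: the class name in the fallback branch is only a hypothetical stand-in, and the real library generates its dummy classes from templates.

```python
import importlib.util


def is_available(pkg_name: str) -> bool:
    # True if the package can be imported in this environment.
    return importlib.util.find_spec(pkg_name) is not None


class DummyObject:
    # Placeholder exposed when a soft dependency is missing: importing the
    # library still succeeds, but instantiating the class raises a clear error.
    _required_backend = "torch"

    def __init__(self, *args, **kwargs):
        raise ImportError(
            f"{type(self).__name__} requires `{self._required_backend}`, "
            "which is not installed."
        )


if is_available("torch"):
    pass  # the real model classes would be imported here
else:
    class UNet2DModel(DummyObject):  # hypothetical stand-in class
        _required_backend = "torch"
```

The catch raised in this thread is exactly that such checks are per-backend: if a model mixin silently pulls in transformers as well, the torch-only dummy machinery never notices.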
src/diffusers/models/lora.py (Outdated)

@@ -18,14 +18,15 @@
import torch.nn.functional as F
from torch import nn

from ..loaders import PatchedLoraProjection, text_encoder_attn_modules, text_encoder_mlp_modules
Why do we need to do this change? I don't think it's necessary, no?
@@ -2390,7 +2390,7 @@ def unfuse_text_encoder_lora(text_encoder):
def set_adapters_for_text_encoder(
Nice! Going forward, let's make sure we always use strings for Transformers type hints.
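That convention can be sketched with a minimal example (`CLIPTextModel` is used purely for illustration): reference optional Transformers types only under `TYPE_CHECKING`, and quote the annotation so nothing from transformers is imported at runtime.

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Only evaluated by static type checkers; never executed at runtime,
    # so this block is safe even when transformers is not installed.
    from transformers import CLIPTextModel


def set_adapters_for_text_encoder(text_encoder: "CLIPTextModel") -> None:
    # The quoted annotation stays a plain string at runtime; accessing it
    # does not trigger a transformers import.
    ...
```

Type checkers still resolve `"CLIPTextModel"` to the real class, so nothing is lost on the tooling side.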
@@ -1,19 +1,40 @@
from typing import TYPE_CHECKING

from ...utils import (
    DIFFUSERS_SLOW_IMPORT,
Very nice clean-up!
@@ -1 +1,48 @@
from .pipeline_pixart_alpha import PixArtAlphaPipeline
from typing import TYPE_CHECKING
Very nice clean-up!
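The lazy-init pattern these `__init__.py` hunks move to can be sketched with a simplified stand-in for diffusers' `_LazyModule` helper. This is assumed behavior for illustration only; the real helper additionally handles module specs, extra objects, and the dummy-object fallbacks discussed above.

```python
import importlib
import types


class LazyModule(types.ModuleType):
    # Simplified stand-in for diffusers' _LazyModule: the import structure
    # maps submodule names to the attributes they provide, and a submodule
    # is imported only when one of its attributes is first accessed.

    def __init__(self, name, import_structure):
        super().__init__(name)
        self._attr_to_module = {
            attr: module
            for module, attrs in import_structure.items()
            for attr in attrs
        }
        self.__all__ = sorted(self._attr_to_module)

    def __getattr__(self, attr):
        # Called only when normal attribute lookup fails.
        if attr not in self._attr_to_module:
            raise AttributeError(f"module {self.__name__!r} has no attribute {attr!r}")
        value = getattr(importlib.import_module(self._attr_to_module[attr]), attr)
        setattr(self, attr, value)  # cache so __getattr__ is not hit again
        return value
```

With something like this in place, a pipeline `__init__.py` can declare `{"pipeline_pixart_alpha": ["PixArtAlphaPipeline"]}` and defer the heavy import (and its soft dependencies) until the pipeline class is actually requested.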
PR is good to go for me, just wondering whether we really need to do this change: #5668 (comment)
* fix: import bug
* fix
* fix
* fix import utils for lcm
* fix: pixart alpha init
* Fix

Co-authored-by: Patrick von Platen <[email protected]>
Fixes #5661.
I think we need to make a patch release for this one.
While reviewing, could y'all please go over this comment too and verify things on your ends?