
Support schnell lora training #1827

Open
phyllispeng123 opened this issue Dec 9, 2024 · 2 comments

Comments

@phyllispeng123

Hi @kohya-ss, I just found that the current training script cannot be applied to FLUX schnell directly. According to @sdbds, it seems to be a shift problem. Many users prefer schnell, so could you add schnell support for LoRA training? That would be very helpful!
I changed the script to the following, but it still gives bad results.

```
--timestep_sampling uniform \
--discrete_flow_shift 1.0 \
--guidance_scale 0.0
```

[image attachment: sample result]


kohya-ss commented Dec 9, 2024

Since schnell is a model distilled for fast generation, I don't think it is suitable for fine-tuning or LoRA training. One option is to train the LoRA on dev and then apply it to schnell.
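For reference, a dev-based LoRA training run with sd-scripts looks roughly like the following. This is a minimal sketch, not an official recipe: all file paths, the dataset config, and the hyperparameters are placeholders, and the flag set reflects the FLUX.1 support in the sd3 branch of sd-scripts at the time of writing.

```shell
# Hedged sketch: train a LoRA against FLUX.1 dev (not schnell).
# The resulting LoRA file can then be loaded at inference time with schnell.
# All paths (flux1-dev.safetensors, clip_l.safetensors, t5xxl_fp16.safetensors,
# ae.safetensors, dataset.toml, ./output) are placeholders.
accelerate launch --mixed_precision bf16 --num_cpu_threads_per_process 1 \
  flux_train_network.py \
  --pretrained_model_name_or_path flux1-dev.safetensors \
  --clip_l clip_l.safetensors \
  --t5xxl t5xxl_fp16.safetensors \
  --ae ae.safetensors \
  --network_module networks.lora_flux \
  --network_dim 4 \
  --optimizer_type adamw8bit \
  --learning_rate 1e-4 \
  --timestep_sampling shift \
  --discrete_flow_shift 3.1582 \
  --model_prediction_type raw \
  --guidance_scale 1.0 \
  --dataset_config dataset.toml \
  --output_dir ./output --output_name flux-dev-lora \
  --save_model_as safetensors --save_precision bf16 \
  --gradient_checkpointing --sdpa --fp8_base \
  --cache_latents_to_disk --cache_text_encoder_outputs_to_disk \
  --max_train_epochs 4 --save_every_n_epochs 1
```

A LoRA trained this way against dev can be applied when sampling with schnell, though quality may vary, since dev and schnell were distilled with different objectives.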

@phyllispeng123 (Author)

> Since schnell is a model distilled for fast generation, I don't think it is suitable for fine tuning or LoRA training. So, it seems like one way is to train LoRA on dev and apply it to schnell.

Thanks for your advice! I'll try training on dev first.
