
Prodigy + ScheduleFree #1796

Closed
hablaba opened this issue Nov 19, 2024 · 7 comments

Comments

@hablaba

hablaba commented Nov 19, 2024

Hello! How might I use this optimizer? https://github.com/LoganBooker/prodigy-plus-schedule-free

Would I use the schedulefree wrapper with it somehow?

@rockerBOO
Contributor

rockerBOO commented Nov 19, 2024

Probably need the schedule-free-opt branch. Then add the optimizer module via pip: pip install git+http://github.com/LoganBooker/prodigy-plus-schedule-free. Then

--optimizer_type "prodigyplus.ProdigyPlusScheduleFree"

https://github.com/LoganBooker/prodigy-plus-schedule-free/blob/main/prodigyplus/prodigy_plus_schedulefree.py#L47C1-L99C28

And the options can be passed in optimizer_args

--optimizer_args "prodigy_steps=1000"

It should output the optimizer it's using in the logs to confirm.

Edit: Updated pip install
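
Putting those pieces together, a full invocation might look like the sketch below. Only the pip URL, the --optimizer_type value, and prodigy_steps come from this thread; the script name (train_network.py), the accelerate launch wrapper, and learning_rate=1.0 are illustrative assumptions, and any further optimizer_args should be checked against the docstring linked above.

```bash
# Install the optimizer package (with sd-scripts on the schedule-free-opt branch)
pip install git+http://github.com/LoganBooker/prodigy-plus-schedule-free

# Illustrative launch; the usual dataset/model arguments are omitted
accelerate launch train_network.py \
  --optimizer_type "prodigyplus.ProdigyPlusScheduleFree" \
  --optimizer_args "prodigy_steps=1000" \
  --learning_rate 1.0
```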

@gesen2egee
Contributor

Did it work successfully?

@rockerBOO
Contributor

rockerBOO commented Nov 20, 2024

I tested it and it needed some fixes; I made #1799 for that. It seems to work, at least.

[Screenshot: Weights & Biases run "confused-dragon-301" (pov-kohya-lora), 2024-11-20, showing the training curves]

@rockerBOO
Contributor

This version also supports split parameter groups, so you can set the LR differently for the text encoder(s) and the UNet (the LR effectively acts as a multiplier on the dynamic LR). It may also work better since the text encoder is handled separately from the UNet, so you won't hit that overtraining issue as much. It does this split by default, and there are no constraints forcing the group LRs to be equal.
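
For example, a minimal sketch of setting the two groups differently, assuming sd-scripts' usual --unet_lr / --text_encoder_lr flags are what feed these groups (the 1.0 and 0.5 values are purely illustrative multipliers, not absolute learning rates):

```bash
accelerate launch train_network.py \
  --optimizer_type "prodigyplus.ProdigyPlusScheduleFree" \
  --unet_lr 1.0 \
  --text_encoder_lr 0.5
```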

@hablaba
Author

hablaba commented Nov 25, 2024

Oh awesome, appreciate your work here. Yeah, when I had tried it I got that error that it wasn't supported, and then I just had other things I was working on. I do think there's a ton of potential in using schedule-free Prodigy, though.

@bghira

bghira commented Dec 1, 2024

that learning rate doesn't look very "dynamic", it never goes back down?

@kohya-ss
Owner

kohya-ss commented Dec 2, 2024

Supported by #1811.

@kohya-ss kohya-ss closed this as completed Dec 2, 2024