Prodigy + ScheduleFree #1796
You'll probably need the `schedule-free-opt` branch. Then install the optimizer module from pip and pass `--optimizer_type "prodigyplus.ProdigyPlusScheduleFree"`. Options can be passed via `--optimizer_args`, e.g. `--optimizer_args "prodigy_steps=1000"`. The logs should show which optimizer is in use, so you can confirm it took effect.

Edit: Updated pip install
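As a rough sketch of what happens with those flags (not the exact sd-scripts code): `--optimizer_args` is typically a list of `key=value` strings that get parsed into keyword arguments and forwarded to the optimizer's constructor. Something along these lines:

```python
# Illustrative only: parse "key=value" optimizer_args strings into kwargs.
# ast.literal_eval lets numbers and booleans come through with proper types.
import ast

def parse_optimizer_args(args):
    kwargs = {}
    for arg in args:
        key, value = arg.split("=", 1)
        try:
            value = ast.literal_eval(value)
        except (ValueError, SyntaxError):
            pass  # keep as a plain string if it isn't a Python literal
        kwargs[key] = value
    return kwargs

print(parse_optimizer_args(["prodigy_steps=1000", "use_bias_correction=True"]))
# {'prodigy_steps': 1000, 'use_bias_correction': True}
```

The resulting dict would then be splatted into the optimizer, e.g. `ProdigyPlusScheduleFree(params, **kwargs)`. Parameter names like `prodigy_steps` are taken from the comment above; check the optimizer's own README for the full list it accepts.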
Did it work successfully?
I tested it and it needed some fixes; I made #1799 for that. It seems to work, at least.
This version also supports split groups, so you can set the LR (effectively a multiplier of the dynamic LR) differently for the text encoder(s) and the UNet. It may also train better, since the text encoder is handled separately from the UNet, so you won't hit that overtraining issue as much. It does this split by default, and there are no constraints forcing the rates to be equal.
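A hypothetical illustration of the split-groups idea (the names and helper here are made up for the example, not from the library): the text encoder(s) and the UNet land in separate parameter groups, each with its own `lr` acting as a multiplier on the optimizer's dynamic step size, so Prodigy can settle at a different effective rate per group.

```python
# Sketch only: build separate param groups with independent lr multipliers.
# With split groups, the dynamic step-size estimate is tracked per group,
# so the text encoder and UNet are no longer forced to share one rate.
def build_param_groups(te_params, unet_params, te_lr=0.5, unet_lr=1.0):
    return [
        {"name": "text_encoder", "params": te_params, "lr": te_lr},
        {"name": "unet", "params": unet_params, "lr": unet_lr},
    ]

groups = build_param_groups(["te.weight"], ["unet.weight"])
print([(g["name"], g["lr"]) for g in groups])
# [('text_encoder', 0.5), ('unet', 1.0)]
```

In practice you would pass real `torch` parameters rather than strings; the point is only that each group carries its own `lr` multiplier.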
Oh awesome, I appreciate your work here. Yeah, when I tried it I got the error that it wasn't supported, and then I had other things I was working on. I do think there's a ton of potential in schedule-free Prodigy, though.
That learning rate doesn't look very "dynamic", though; it never goes back down?
Supported by #1811.
Hello! How might I use this optimizer? https://github.com/LoganBooker/prodigy-plus-schedule-free
Would I use the schedulefree wrapper with it somehow?