use Blocks to swap error #3015
There are a few problems with your config. Second: your CUDA and xformers versions are throwing warnings, so it is best to reinstall CUDA and xformers (or use SDPA).
Thank you for your reply. I tried multi-GPU training before, but it raised an error, so I turned off the multi-GPU training parameters; that shouldn't be the problem. My setup can only support cu118, so maybe xformers doesn't match my torch build, which is causing the problem.
I want to do full checkpoint training for FLUX,
using flux-dev-de-distill as the base model, with clip_l.safetensors and t5xxl_fp8_e4m3fn.safetensors as the text encoders,
on a single 3090.
(I tried to use two 3090s, but it always fails even when I set the correct parameters in the accelerate launch tab.)
When I use Blocks to swap, it shows errors:
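For context, block swapping in kohya's FLUX full fine-tuning is enabled with the `--blocks_to_swap` flag on `flux_train.py`. A minimal single-GPU invocation sketch is below; the paths and the swap count are placeholders, and flag names should be checked against the installed sd-scripts version:

```shell
# Sketch only: flags follow kohya sd-scripts' flux_train.py conventions.
# All file paths and the blocks_to_swap value are placeholders.
accelerate launch --num_processes 1 flux_train.py \
  --pretrained_model_name_or_path flux-dev-de-distill.safetensors \
  --clip_l clip_l.safetensors \
  --t5xxl t5xxl_fp8_e4m3fn.safetensors \
  --ae ae.safetensors \
  --blocks_to_swap 16 \
  --fused_backward_pass \
  --mixed_precision bf16 \
  --sdpa
```

Note that clip_l and t5xxl are passed as text encoders, not as the VAE; the autoencoder is a separate file given via `--ae`.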