What is causing this warning during flux training? I have Torch 2.4.0+cu124 and am running the SD3 branch. Otherwise everything seems to work.
```
F:\sd-scripts\library\flux_models.py:446: UserWarning: 1Torch was not compiled with flash attention. (Triggered internally at C:\actions-runner\_work\pytorch\pytorch\builder\windows\pytorch\aten\src\ATen\native\transformers\cuda\sdp_utils.cpp:555.)
  x = torch.nn.functional.scaled_dot_product_attention(q, k, v, attn_mask=attn_mask)
```
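For context: this warning usually just means the PyTorch build in use (the official Windows CUDA wheels are a common case) was compiled without the flash-attention kernels, so `scaled_dot_product_attention` silently falls back to another backend such as memory-efficient or math attention. Training is unaffected; only the warning is noisy, and the "1Torch" typo is part of PyTorch's own warning string. Below is a minimal sketch (assuming PyTorch 2.3+ with a CUDA device; the tensor shapes are purely illustrative) of how to check which SDPA backends the running build allows and how to pin the call to a specific backend so the flash-attention path is never probed:

```python
import torch
from torch.nn.attention import SDPBackend, sdpa_kernel

# Report which SDPA backends this build currently allows.
# Note: these reflect runtime toggles, not only compile-time support.
print("flash SDP enabled:        ", torch.backends.cuda.flash_sdp_enabled())
print("mem-efficient SDP enabled:", torch.backends.cuda.mem_efficient_sdp_enabled())
print("math SDP enabled:         ", torch.backends.cuda.math_sdp_enabled())

# Illustrative (batch, heads, seq_len, head_dim) tensors; assumes a CUDA GPU.
q = torch.randn(1, 8, 128, 64, device="cuda", dtype=torch.float16)
k = torch.randn(1, 8, 128, 64, device="cuda", dtype=torch.float16)
v = torch.randn(1, 8, 128, 64, device="cuda", dtype=torch.float16)

# Restrict SDPA to the memory-efficient backend for this call, so a build
# without flash attention never attempts (and never warns about) that path.
with sdpa_kernel(SDPBackend.EFFICIENT_ATTENTION):
    out = torch.nn.functional.scaled_dot_product_attention(q, k, v)
```

With the `sdpa_kernel` context manager restricting the backend, PyTorch skips probing the flash path entirely, so the warning should not fire. Without it, the fallback still happens automatically; the warning is cosmetic.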