-
In fact, in China it is possible to run the optimized configuration for FLUX training on a $15 P104.
-
I may be barking up the wrong tree, since these GPUs are old, but they were cheap. 😅
We have 2 P40s for training and typically get about 35 s/it, which is OK but not super speedy.
Any ideas on how to speed things up further with the settings we are using, or are we at the limit?
We thought full fp32 would speed things up, but we couldn't get it working without OOM.
Here are the training settings; any ideas are appreciated if there is something we're not thinking of.
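For reference, here is a minimal, generic PyTorch sketch (not our actual trainer config, and the layer sizes are made up) of the usual memory-saving alternative to full fp32: autocast plus gradient checkpointing. On Pascal cards like the P40, fp16 math generally isn't faster than fp32, so the gain is memory headroom rather than raw speed.

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint_sequential

device = "cuda"
# Stand-in model; sizes are arbitrary and just for illustration.
model = nn.Sequential(*[nn.Linear(4096, 4096) for _ in range(8)]).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
scaler = torch.cuda.amp.GradScaler()  # needed for fp16 autocast (not for bf16)

x = torch.randn(16, 4096, device=device)
target = torch.randn(16, 4096, device=device)

optimizer.zero_grad(set_to_none=True)
with torch.autocast(device_type="cuda", dtype=torch.float16):
    # Recompute activations during backward instead of storing them all,
    # trading a bit of extra compute for a much smaller activation footprint.
    out = checkpoint_sequential(model, 4, x, use_reentrant=False)
    loss = nn.functional.mse_loss(out, target)
scaler.scale(loss).backward()
scaler.step(optimizer)
scaler.update()
```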