Add MPS support #1264
Conversation
This is awesome! @maximegmd
I'm not sure what's up with …
I can always "parse" version numbers with a quick regex but it's not ideal (thanks ChatGPT)
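(For reference, a minimal sketch of the regex-based version parsing mentioned above; the helper name and the example version string are illustrative, not taken from this PR.)

```python
import re

# Hypothetical helper: pull the numeric major/minor components out of a
# version string such as "2.3.0.dev20240201" so they can be compared.
def parse_major_minor(version: str) -> tuple[int, int]:
    match = re.match(r"^(\d+)\.(\d+)", version)
    if match is None:
        raise ValueError(f"unrecognized version string: {version}")
    return int(match.group(1)), int(match.group(2))

print(parse_major_minor("2.3.0.dev20240201"))  # (2, 3)
```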
Thanks @maximegmd 🚀
Thanks @maximegmd!! @winglian is MLX support still planned?
The issue with supporting MLX is that we would pretty much have to rewrite axolotl entirely, replacing all torch-specific code (all of it) with MLX, so no bitsandbytes, no xformers, no flash-attn, no transformers... At this point I think we are better off waiting for transformers' MPS support to reach an acceptable state.
Thanks for this, @maximegmd!!
I'll run it again tomorrow, because I was today years old when I discovered the performance power option 😅 (M3 Max 16/40, 64 GB)
@maximegmd For installation, since …
Right, just install the base without any options. I use accelerate:
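(A sketch of what that might look like; the exact commands and config path are assumptions, not quoted from the thread.)

```bash
# Assumed commands: install the base package without optional extras,
# then launch training through accelerate with the MPS example config.
pip install -e .
accelerate launch -m axolotl.cli.train path/to/lora-mps.yml  # hypothetical config path
```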
This training takes around 56 minutes on my machine.
Nice, thx! I'll try it out.
* add mps support
* linter stuff
* CI fixes
* install packaging for various tests
* Update setup.py
* Revert "install packaging for various tests" (reverts commit 980e7aa)
* Revert "CI fixes" (reverts commit 4609e3b)

Co-authored-by: Wing Lian <[email protected]>
Description
Adds basic training support on Mac M-series (Apple Silicon) machines via the MPS backend.
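For context, whether the MPS backend is usable can be checked with the standard PyTorch API; this snippet is illustrative and not taken from the PR itself.

```python
import torch

# Standard PyTorch check for the Metal Performance Shaders (MPS) backend
# on Apple Silicon; fall back to CPU when MPS is unavailable.
device = torch.device("mps") if torch.backends.mps.is_available() else torch.device("cpu")
print(f"training device: {device}")
```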
Motivation and Context
It partially addresses Mac support.
How has this been tested?
Ran a training job with lora-mps.yml from start to finish.