Do we have support for Vulkan? #219
Comments
No vulkan support at the moment. I think the best bet is going to be using cmake to build. I will not be putting any effort in this direction; all we use internally is Linux + CUDA, so justifying work elsewhere is a tough sell. Contributions for vulkan are welcome though.
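For anyone who wants to experiment, here is a rough sketch of what the cmake route could look like against upstream llama.cpp. This is not something wired into this crate; note also that the Vulkan option has been renamed between llama.cpp revisions (older trees use LLAMA_VULKAN, newer ones GGML_VULKAN), so check your checkout:

```sh
# Sketch: build upstream llama.cpp with its Vulkan backend enabled.
# Requires the Vulkan SDK (headers + loader) to be installed.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build -DGGML_VULKAN=ON   # older revisions: -DLLAMA_VULKAN=ON
cmake --build build --config Release
```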
Does this mean llama-cpp-rs won't have Metal support going forward? What would be the last build with Metal? (It seems I can't compile 0.1.42 anymore - the last working version before the ggml-common.h problem.) Metal would be sorely missed, as llama-cpp-rs seems to be the fastest Rust LLM at the mo... Candle isn't even close :) No worries, I completely understand, since your daily need is CUDA and not Metal / Vulkan / other. If I had the skills I'd help out, but I'm only 3 months into my Rustventures :)
I believe #202 broke metal support, which was in 0.1.42 (0.1.41 should be fine). I expect metal support to be working once again upon replacing
Feel free to give #221 a try. No clue what the defaults for mac are (but a cursory glance makes me think it might just work™).
I just tried release 0.1.45 and it compiles perfectly... in fact, it broke a new record of 115 t/s with the Gemma 2B model. Many thanks!
Is vulkan really still not supported? I see code to build for vulkan and some other backends in the build.rs of llama-cpp-sys. Wondering if that's usable at all.
It might work! Feel free to give it a test and report back. The build script just got changed.
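For reference, one quick way to test would be something like the following, assuming the published crate is llama-cpp-2 and that its build script exposes a `vulkan` cargo feature (the feature name here is a guess; check llama-cpp-sys's Cargo.toml and build.rs for the real flag):

```sh
# Hypothetical: enable the (assumed) vulkan feature when adding the crate,
# then build with verbose output to see whether the Vulkan backend gets compiled.
cargo add llama-cpp-2 --features vulkan
cargo build --release -vv
```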
I'm asking as I've just compiled my first Rust app for Android... and now I'm curious to understand what it would take to compile it with llama-cpp-rs support for Android Vulkan :)
https://www.reddit.com/r/LocalLLaMA/comments/1adbzx8/as_of_about_4_minutes_ago_llamacpp_has_been/
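Nobody here has verified this, but one plausible starting point for Android would be cross-compiling with cargo-ndk; whether the Vulkan backend then builds and links correctly against the NDK is an open question:

```sh
# Untested sketch: cross-compile for Android via cargo-ndk.
# Assumes the Android NDK is installed and that a 'vulkan' feature exists (see above).
cargo install cargo-ndk
rustup target add aarch64-linux-android
cargo ndk -t arm64-v8a build --release --features vulkan
```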