GGUF can no longer use GPU? #5957
Unanswered
TiagoTiago asked this question in Q&A
Replies: 1 comment 8 replies
-
I've noticed similar behavior after a recent `git pull` and `pip --update`. I've checked, and the installed versions of the llama_cpp_python libraries did not update to 0.2.64 despite `requirements.txt`.
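Since the versions silently stayed behind what `requirements.txt` pins, a quick way to confirm what pip actually installed is to query the package metadata directly. This is a minimal sketch, assuming the PyPI distribution name `llama-cpp-python`; the CUDA-variant wheels that OTGW's requirements files reference would need the same check with their own distribution names:

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(pkg: str) -> str:
    """Return the installed version of pkg, or 'not installed'."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return "not installed"

# The reply above expects this to report 0.2.64 after a correct update.
print(installed_version("llama-cpp-python"))
```

If this prints an older version than the one pinned in `requirements.txt`, rerunning `pip install -r requirements.txt --upgrade` (or a clean reinstall of the environment) is the usual fix.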
-
I'm not sure if it was something I did (I've been doing way too much at the same time lately, and my attention and memory haven't been firing on all cylinders), or some update to OTGW, or a driver thing, but I've noticed that GGUF files no longer seem to use any VRAM at all, and the "Use Tensor Cores" checkbox (or whatever it was called) no longer even appears when loading GGUF files. Does anyone know what could be going on?
PS: CUDA on the base system still seems to work: Blender sees it just fine and renders with no noticeable artifacts, and GPTQ and AWQ models still seem to use the GPU.
PPS: This is on Linux, and I've been starting OTGW the same way for a long while: `conda activate oobabooga` followed by `./start_linux.sh`.
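Given that CUDA works for Blender and for GPTQ/AWQ models, one way to narrow the symptom down to the GGUF loader is to ask the installed llama-cpp-python build whether it was compiled with GPU offload at all. A small sketch, assuming the low-level binding `llama_supports_gpu_offload` is available in the installed version (it exists in recent 0.2.x releases):

```python
# Probe whether the installed llama-cpp-python wheel was built with GPU support.
# If this prints False, the wheel is CPU-only and GGUF models will never use
# VRAM, no matter what the loader settings in the UI say.
try:
    import llama_cpp
    supported = bool(llama_cpp.llama_supports_gpu_offload())
except (ImportError, AttributeError):
    supported = None  # library missing, or an older build without this binding
print(supported)
```

Run this inside the same conda environment that `./start_linux.sh` uses, otherwise you may probe a different installation than the one OTGW loads.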