RuntimeError: CUDA Setup failed despite GPU being available. #1434
Comments
Ran into the same problem.
OS: Ubuntu 22.04.2
!python -m bitsandbytes
nvidia-smi
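As a sanity check, here is a minimal sketch (assuming PyTorch is installed) that confirms the GPU and CUDA runtime are visible to Python before bitsandbytes is imported:

    # Sketch: quick CUDA visibility check before importing bitsandbytes.
    import torch

    print("CUDA available:", torch.cuda.is_available())
    print("PyTorch built with CUDA:", torch.version.cuda)
    if torch.cuda.is_available():
        print("Device:", torch.cuda.get_device_name(0))

    # If the checks above look right but bitsandbytes still fails, the problem
    # is usually that its CUDA binary cannot be located, not the GPU itself.
    try:
        import bitsandbytes as bnb
        print("bitsandbytes version:", bnb.__version__)
    except Exception as exc:  # e.g. RuntimeError: CUDA Setup failed ...
        print("bitsandbytes import failed:", exc)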
Reproduction
from transformers import WhisperForConditionalGeneration, BitsAndBytesConfig

quantization_config = BitsAndBytesConfig(load_in_8bit=True)
model = WhisperForConditionalGeneration.from_pretrained(
    "openai/whisper-small",
    quantization_config=quantization_config,
    device_map="auto",
)
I was previously using an older version. I ran the following code to install a newer one:
I downloaded the Linux version from the following link. It resolved the error.
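For anyone hitting the same thing, a small sketch (standard library only, and only a rough check since file names vary between releases) to list which native libraries the installed bitsandbytes package actually ships; a CUDA build should include a libbitsandbytes_cuda*.so alongside the CPU stub:

    # Sketch: list the shared libraries bundled with the installed bitsandbytes
    # package, to see whether a CUDA build (not just the CPU stub) is present.
    import importlib.util
    import pathlib

    spec = importlib.util.find_spec("bitsandbytes")
    if spec is None or spec.origin is None:
        raise SystemExit("bitsandbytes is not installed in this environment")

    pkg_dir = pathlib.Path(spec.origin).parent
    for lib in sorted(pkg_dir.glob("libbitsandbytes*")):
        print(lib.name)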
Resolved, thanks for the help @kaijun123
System Info
OS: Ubuntu 24.04.1 LTS
Python: Python 3.10.15
nvcc:
NVIDIA (R) Cuda compiler driver
Built on Thu_Sep_12_02:18:05_PDT_2024
Cuda compilation tools, release 12.6, V12.6.77
Build cuda_12.6.r12.6/compiler.34841621_0
Packages in environment at:
Reproduction
Error message:
Reproduction:
I am trying to run the LLaVA-Med model on my device. The repo is provided here:
https://github.com/microsoft/LLaVA-Med
I followed the instructions and installed the necessary dependencies:
To verify that the locations of the files are correct:
The same error occurs when I set LD_LIBRARY_PATH=/usr/local/cuda/lib64
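As a further check (a sketch, not the exact commands I ran), the following confirms whether the CUDA runtime library can actually be resolved from the current environment after exporting LD_LIBRARY_PATH:

    # Sketch: verify that libcudart can be resolved from the current process,
    # e.g. in a shell where LD_LIBRARY_PATH=/usr/local/cuda/lib64 was exported
    # before starting Python.
    import ctypes
    import ctypes.util
    import os

    print("LD_LIBRARY_PATH:", os.environ.get("LD_LIBRARY_PATH", "<not set>"))

    name = ctypes.util.find_library("cudart")
    print("find_library('cudart') ->", name)

    try:
        # The unversioned .so may not exist; a versioned name such as
        # libcudart.so.12 is also worth trying if this fails.
        ctypes.CDLL(name or "libcudart.so")
        print("libcudart loaded successfully")
    except OSError as exc:
        print("failed to load libcudart:", exc)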
Expected behavior
I should be able to download and load the model with the pre-trained weights without any errors. However, I am getting errors from bitsandbytes.