
"OSError: CUDA_HOME environment variable is not set. Please set it to your CUDA install root." when loading a model #6529

Open
Robtles opened this issue Nov 15, 2024 · 0 comments
Labels
bug Something isn't working

Comments


Robtles commented Nov 15, 2024

Describe the bug

Hello,

I'm running Text-Generation-WebUI with Pinokio. The installation goes fine, and I then added this model in the Model > Download section. The download completes successfully, but when I try to load the model, I get this error:

OSError: CUDA_HOME environment variable is not set. Please set it to your CUDA install root.

Note: I got the same error with other models as well.
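As a quick check from a terminal (a hypothetical session; the output below assumes no CUDA toolkit is installed, as on any Mac), the variable is indeed unset:

```shell
# Prints the CUDA install root if CUDA_HOME is set, or "unset" otherwise
echo "${CUDA_HOME:-unset}"
# prints "unset"
```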

Is there an existing issue for this?

  • I have searched the existing issues

Reproduction

  • Install Pinokio
  • Search for text-generation-webui, then download and install it
  • Once installed, try to install and load this model

Screenshot

No response

Logs

File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/modules/ui_model_menu.py", line 232, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(selected_model, loader)
                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/modules/models.py", line 93, in load_model
    output = load_func_map[loader](model_name)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/modules/models.py", line 313, in ExLlamav2_HF_loader
    from modules.exllamav2_hf import Exllamav2HF
  File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/modules/exllamav2_hf.py", line 7, in <module>
    from exllamav2 import (
  File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/installer_files/env/lib/python3.11/site-packages/exllamav2/__init__.py", line 3, in <module>
    from exllamav2.model import ExLlamaV2
  File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/installer_files/env/lib/python3.11/site-packages/exllamav2/model.py", line 35, in <module>
    from exllamav2.config import ExLlamaV2Config
  File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/installer_files/env/lib/python3.11/site-packages/exllamav2/config.py", line 5, in <module>
    from exllamav2.stloader import STFile, cleanup_stfiles
  File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/installer_files/env/lib/python3.11/site-packages/exllamav2/stloader.py", line 5, in <module>
    from exllamav2.ext import none_tensor, exllamav2_ext as ext_c
  File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/installer_files/env/lib/python3.11/site-packages/exllamav2/ext.py", line 276, in <module>
    exllamav2_ext = load \
                    ^^^^^^
  File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/installer_files/env/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 1312, in load
    return _jit_compile(
           ^^^^^^^^^^^^^
  File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/installer_files/env/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 1722, in _jit_compile
    _write_ninja_file_and_build_library(
  File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/installer_files/env/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 1811, in _write_ninja_file_and_build_library
    extra_ldflags = _prepare_ldflags(
                    ^^^^^^^^^^^^^^^^^
  File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/installer_files/env/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 1900, in _prepare_ldflags
    if (not os.path.exists(_join_cuda_home(extra_lib_dir)) and
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/installer_files/env/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 2416, in _join_cuda_home
    raise OSError('CUDA_HOME environment variable is not set. '
OSError: CUDA_HOME environment variable is not set. Please set it to your CUDA install root.
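For context, the failing frame is torch's JIT extension loader: exllamav2 tries to compile its CUDA extension at import time, and `torch.utils.cpp_extension` first has to locate a CUDA toolkit. A simplified sketch of that lookup (not the actual torch source; names and fallbacks simplified) shows why it fails on a machine with no CUDA at all:

```python
import os
import shutil

def find_cuda_home():
    """Simplified sketch of how torch.utils.cpp_extension locates CUDA.

    torch checks the CUDA_HOME and CUDA_PATH environment variables, then
    falls back to probing for nvcc and a default install path. When every
    step fails (as on an Apple Silicon Mac, which has no CUDA toolkit),
    _join_cuda_home raises the OSError shown in the log above.
    """
    cuda_home = os.environ.get("CUDA_HOME") or os.environ.get("CUDA_PATH")
    if cuda_home is None:
        nvcc = shutil.which("nvcc")
        if nvcc is not None:
            # nvcc lives in <cuda_root>/bin/nvcc, so go up two levels
            cuda_home = os.path.dirname(os.path.dirname(nvcc))
        elif os.path.exists("/usr/local/cuda"):
            cuda_home = "/usr/local/cuda"
    return cuda_home  # None means no CUDA was found anywhere
```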

System Info

MacBook Pro (Apple M2), macOS Sequoia 15.1