
Requirements.txt Needs Some Quick Fixes #6552

Open
1 task done
VegaStarlake opened this issue Dec 1, 2024 · 0 comments
Labels
bug Something isn't working

Comments

@VegaStarlake

Describe the bug

Currently, the requirements.txt file in the repo (https://github.com/oobabooga/text-generation-webui/blob/main/requirements.txt) links to

"https://github.com/oobabooga/exllamav2/releases/download/v0.2.3/exllamav2-0.2.3+cu121.torch2.4.1-cp311-cp311-win_amd64.whl"

which is broken, because the current exllamav2 release assets now live under "https://github.com/oobabooga/exllamav2/releases/v0.2.4".

Because of this, existing environments that try to update (and, I imagine, fresh installs as well) silently skip this download. In my case, that caused not only the ExLlama loaders but also the AutoGPTQ loaders to fail.

Running `pip install --upgrade` against the missing wheels fixed it, and models now load. Everything else appears to have updated properly.

Is there an existing issue for this?

  • I have searched the existing issues

Reproduction

Swap the v0.2.3 wheel URLs for their v0.2.4 equivalents:

https://github.com/oobabooga/exllamav2/releases/download/v0.2.4/exllamav2-0.2.4+cu121.torch2.4.1-cp311-cp311-win_amd64.whl; platform_system == "Windows" and python_version == "3.11"
https://github.com/oobabooga/exllamav2/releases/download/v0.2.4/exllamav2-0.2.4+cu121.torch2.4.1-cp310-cp310-win_amd64.whl; platform_system == "Windows" and python_version == "3.10"

and so on for the remaining entries.
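The only thing that changes between the broken and the fixed lines is the release tag and version string, which appear twice in each URL. A small sketch illustrates the pattern the corrected entries follow (the `wheel_url` helper and its parameters are my own illustration, not part of the repo):

```python
def wheel_url(version: str, py_tag: str, cuda: str = "cu121", torch: str = "2.4.1") -> str:
    """Build the exllamav2 Windows wheel URL for a given release and CPython tag.

    The version appears twice: once in the release tag (v{version}) and once
    in the wheel filename itself, so both must be bumped together.
    """
    base = "https://github.com/oobabooga/exllamav2/releases/download"
    wheel = f"exllamav2-{version}+{cuda}.torch{torch}-{py_tag}-{py_tag}-win_amd64.whl"
    return f"{base}/v{version}/{wheel}"

# The broken requirements.txt entry points at the v0.2.3 release:
print(wheel_url("0.2.3", "cp311"))
# The proposed fix bumps the version in both places:
print(wheel_url("0.2.4", "cp311"))
print(wheel_url("0.2.4", "cp310"))
```

This also makes it clear why a partial edit (changing only the tag but not the filename, or vice versa) would still 404.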

Screenshot

No response

Logs

  File "C:\text-generation-webui-main\modules\ui_model_menu.py", line 232, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(selected_model, loader)
                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\text-generation-webui-main\modules\models.py", line 93, in load_model
    output = load_func_map[loader](model_name)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\text-generation-webui-main\modules\models.py", line 313, in ExLlamav2_HF_loader
    from modules.exllamav2_hf import Exllamav2HF
  File "C:\text-generation-webui-main\modules\exllamav2_hf.py", line 7, in <module>
    from exllamav2 import (
  File "C:\text-generation-webui-main\installer_files\env\Lib\site-packages\exllamav2\__init__.py", line 3, in <module>
    from exllamav2.model import ExLlamaV2
  File "C:\text-generation-webui-main\installer_files\env\Lib\site-packages\exllamav2\model.py", line 35, in <module>
    from exllamav2.config import ExLlamaV2Config
  File "C:\text-generation-webui-main\installer_files\env\Lib\site-packages\exllamav2\config.py", line 5, in <module>
    from exllamav2.fasttensors import STFile
  File "C:\text-generation-webui-main\installer_files\env\Lib\site-packages\exllamav2\fasttensors.py", line 6, in <module>
    from exllamav2.ext import exllamav2_ext as ext_c
  File "C:\text-generation-webui-main\installer_files\env\Lib\site-packages\exllamav2\ext.py", line 286, in <module>
    ext_c = exllamav2_ext
            ^^^^^^^^^^^^^
NameError: name 'exllamav2_ext' is not defined


System Info

```shell
Windows, NVIDIA 30xx GPU, AMD CPU, Python 3.11
```

@VegaStarlake added the bug (Something isn't working) label on Dec 1, 2024