```
  File "C:\text-generation-webui-main\modules\ui_model_menu.py", line 232, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(selected_model, loader)
                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\text-generation-webui-main\modules\models.py", line 93, in load_model
    output = load_func_map[loader](model_name)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\text-generation-webui-main\modules\models.py", line 313, in ExLlamav2_HF_loader
    from modules.exllamav2_hf import Exllamav2HF
  File "C:\text-generation-webui-main\modules\exllamav2_hf.py", line 7, in <module>
    from exllamav2 import (
  File "C:\text-generation-webui-main\installer_files\env\Lib\site-packages\exllamav2\__init__.py", line 3, in <module>
    from exllamav2.model import ExLlamaV2
  File "C:\text-generation-webui-main\installer_files\env\Lib\site-packages\exllamav2\model.py", line 35, in <module>
    from exllamav2.config import ExLlamaV2Config
  File "C:\text-generation-webui-main\installer_files\env\Lib\site-packages\exllamav2\config.py", line 5, in <module>
    from exllamav2.fasttensors import STFile
  File "C:\text-generation-webui-main\installer_files\env\Lib\site-packages\exllamav2\fasttensors.py", line 6, in <module>
    from exllamav2.ext import exllamav2_ext as ext_c
  File "C:\text-generation-webui-main\installer_files\env\Lib\site-packages\exllamav2\ext.py", line 286, in <module>
    ext_c = exllamav2_ext
            ^^^^^^^^^^^^^
NameError: name 'exllamav2_ext' is not defined
```
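The final `NameError` (rather than an `ImportError`) is the typical symptom of the compiled extension wheel never having been installed: judging from the traceback, `ext.py` swallows the failed import and then references the bare name anyway. A minimal sketch of that failure pattern; this is my reconstruction of the control flow, not exllamav2's actual source:

```python
# Reconstruction (assumed) of the pattern behind the traceback: the prebuilt
# extension module's import failure is swallowed, and the undefined name is
# referenced later, producing a NameError instead of a clear ImportError.
error = ""
try:
    import exllamav2_ext  # prebuilt C++/CUDA wheel; absent when the download was skipped
except ImportError:
    pass  # falls through without ever defining the name

try:
    ext_c = exllamav2_ext  # corresponds to ext.py line 286 in the traceback
except NameError as exc:
    error = str(exc)

print(error)  # name 'exllamav2_ext' is not defined
```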
### System Info
```shell
Windows, NVIDIA 30xx GPU, AMD CPU, Python 3.11
```
### Describe the bug
Currently, the requirements.txt file in the repo (https://github.com/oobabooga/text-generation-webui/blob/main/requirements.txt) links to
`https://github.com/oobabooga/exllamav2/releases/download/v0.2.3/exllamav2-0.2.3+cu121.torch2.4.1-cp311-cp311-win_amd64.whl`,
which is broken, because the current exllamav2 wheels appear to live under "https://github.com/oobabooga/exllamav2/releases/v0.2.4".
Because of this, old environments that try to update (and, I imagine, fresh installs too) silently skip this download. In my case that caused not only the ExLlama loaders to fail, but the AutoGPTQ loader as well.
Running `pip install --upgrade ...` for the missing packages fixed it, and models now load. Everything else seems to have updated properly.
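For reference, the wheel filename in that link encodes exactly which environments it applies to (the PEP 427 naming convention: name, version, Python tag, ABI tag, platform tag), which is why each Python version needs its own URL line. A quick stdlib-only sketch decoding the filename from the broken link:

```python
# Wheel filenames follow PEP 427: {name}-{version}-{python tag}-{abi tag}-{platform tag}.whl
# Decode the filename from the stale requirements.txt link (stdlib only):
wheel = "exllamav2-0.2.3+cu121.torch2.4.1-cp311-cp311-win_amd64.whl"
name, version, py_tag, abi_tag, plat_tag = wheel[:-len(".whl")].split("-")

print(name)      # exllamav2
print(version)   # 0.2.3+cu121.torch2.4.1  (local version segment: CUDA 12.1, torch 2.4.1 build)
print(py_tag)    # cp311 -> CPython 3.11 only
print(plat_tag)  # win_amd64 -> 64-bit Windows only
```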
### Is there an existing issue for this?
### Reproduction
In requirements.txt, swap the v0.2.3 wheel links to their v0.2.4 equivalents:

```
https://github.com/oobabooga/exllamav2/releases/download/v0.2.4/exllamav2-0.2.4+cu121.torch2.4.1-cp311-cp311-win_amd64.whl; platform_system == "Windows" and python_version == "3.11"
https://github.com/oobabooga/exllamav2/releases/download/v0.2.4/exllamav2-0.2.4+cu121.torch2.4.1-cp310-cp310-win_amd64.whl; platform_system == "Windows" and python_version == "3.10"
```

and so on for the remaining wheel lines.
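The trailing environment marker on each line is what restricts a wheel to one platform/Python combination; pip evaluates the marker and only downloads the URL whose marker is true for the current interpreter. A tiny illustrative re-implementation of the first marker (my own helper for clarity, not a pip API):

```python
import platform
import sys

def wheel_applies(platform_system: str, python_version: str) -> bool:
    """Mirrors the PEP 508 marker:
    platform_system == "Windows" and python_version == "3.11"
    """
    return platform_system == "Windows" and python_version == "3.11"

# What pip would effectively evaluate for the current interpreter:
current = wheel_applies(platform.system(), "{}.{}".format(*sys.version_info[:2]))

print(wheel_applies("Windows", "3.11"))  # True  -> wheel is installed
print(wheel_applies("Linux", "3.11"))    # False -> line is skipped
```

This also explains the failure mode in the bug: when the URL itself is dead, pip has no matching wheel to fall back to, so the package is simply missing from the environment.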
### Screenshot
No response
### Logs

See the traceback above.