I've been trying to install the text-generation-webui on Linux Mint, but I keep getting circular import errors. I looked for an existing question or answer but couldn't find one, so I'm creating this post.
Note that I'm trying to run the HFv2 LLaMA weights, but I get the same issue regardless of which command I use to launch the webui. The errors are below.
GPU: RX6750XT
Steps to reproduce:
1.) Activate the conda environment and install all required dependencies using installation option 1 (Conda) from the repo homepage, replacing the third line with the AMD dependencies.
2.) Launch the webui with "python server.py" or "python server.py --model LLaMA-7B --load-in-8bit --no-stream".
3.) The following error is shown:
Traceback (most recent call last):
  File "/home/mylinux/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1124, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/usr/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 992, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/home/mylinux/.local/lib/python3.10/site-packages/transformers/models/__init__.py", line 15, in <module>
    from . import (
ImportError: cannot import name 'llama' from partially initialized module 'transformers.models' (most likely due to a circular import) (/home/mylinux/.local/lib/python3.10/site-packages/transformers/models/__init__.py)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/mylinux/Anallama/text-generation-webui/server.py", line 13, in <module>
    import modules.chat as chat
  File "/home/mylinux/Anallama/text-generation-webui/modules/chat.py", line 15, in <module>
    from modules.text_generation import encode, generate_reply, get_max_prompt_length
  File "/home/mylinux/Anallama/text-generation-webui/modules/text_generation.py", line 14, in <module>
    from modules.models import local_rank
  File "/home/mylinux/Anallama/text-generation-webui/modules/models.py", line 10, in <module>
    from transformers import AutoModelForCausalLM, AutoTokenizer
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "/home/mylinux/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1114, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/mylinux/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1126, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.auto because of the following error (look up to see its traceback):
cannot import name 'llama' from partially initialized module 'transformers.models' (most likely due to a circular import) (/home/mylinux/.local/lib/python3.10/site-packages/transformers/models/__init__.py)
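For reference, the reproduction steps above amount to roughly the following shell session. The environment name textgen and the ROCm wheel index are assumptions on my part, not exact commands from the repo's install guide:

```shell
# Assumed environment name; substitute whatever the repo's install guide creates.
conda activate textgen

# Step 1: the third install line is replaced with the AMD (ROCm) PyTorch wheels.
pip install torch --index-url https://download.pytorch.org/whl/rocm5.4.2
pip install -r requirements.txt

# Step 2: either launch command produces the same traceback.
python server.py
python server.py --model LLaMA-7B --load-in-8bit --no-stream
```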
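For anyone debugging the same error: a stdlib-only check like the one below (check_module is a hypothetical helper, not part of the webui) can show whether transformers.models.llama actually exists in the installed package, or whether the import is being shadowed by a stray local file. A missing llama submodule usually means the installed transformers version predates LLaMA support, while an unexpected origin path suggests a shadowing file.

```python
import importlib.util


def check_module(name: str) -> str:
    """Report where a module would be imported from, or note that it is missing."""
    spec = importlib.util.find_spec(name)
    if spec is None:
        return f"{name}: not found"
    return f"{name}: {spec.origin}"


# The traceback suggests transformers.models.llama is missing or only
# partially importable; these lines show which files Python would load.
for mod in ("transformers", "transformers.models.llama"):
    try:
        print(check_module(mod))
    except ModuleNotFoundError:
        print(f"{mod}: parent package not importable")
```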