Describe the bug
Exception: Cannot import 'llama_cpp_cuda' because 'llama_cpp' is already imported. See issue #1575 in llama-cpp-python. Please restart the server before attempting to use a different version of llama-cpp-python.
Is there an existing issue for this?
I have searched the existing issues
Reproduction
Fresh install, start the server, then attempt to load a model.
Screenshot
No response
Logs
start_macos.sh
15:32:05-639275 INFO Starting Text generation web UI
Running on local URL: http://127.0.0.1:7860
15:32:14-027478 INFO Loading "Codestral-22B-v0.1-Q5_K_M.gguf"
15:32:14-057944 INFO llama.cpp weights detected:
"models/Codestral-22B-v0.1-Q5_K_M.gguf"
15:32:14-086935 ERROR Failed to load the model.
Traceback (most recent call last):
File "/Users/CM/Downloads/text-generation-webui-main-2/modules/ui_model_menu.py", line 246, in load_model_wrapper
shared.model, shared.tokenizer = load_model(selected_model, loader)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/CM/Downloads/text-generation-webui-main-2/modules/models.py", line 94, in load_model
output = load_func_map[loader](model_name)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/CM/Downloads/text-generation-webui-main-2/modules/models.py", line 275, in llamacpp_loader
model, tokenizer = LlamaCppModel.from_pretrained(model_file)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/CM/Downloads/text-generation-webui-main-2/modules/llamacpp_model.py", line 39, in from_pretrained
LlamaCache = llama_cpp_lib().LlamaCache
^^^^^^^^^^^^^^^
File "/Users/CM/Downloads/text-generation-webui-main-2/modules/llama_cpp_python_hijack.py", line 38, in llama_cpp_lib
raise Exception(f"Cannot import 'llama_cpp_cuda' because '{imported_module}' is already imported. See issue #1575 in llama-cpp-python. Please restart the server before attempting to use a different version of llama-cpp-python.")
Exception: Cannot import 'llama_cpp_cuda' because 'llama_cpp' is already imported. See issue #1575 in llama-cpp-python. Please restart the server before attempting to use a different version of llama-cpp-python.
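For context, the guard that raises this exception lives in modules/llama_cpp_python_hijack.py: only one build of llama-cpp-python (e.g. llama_cpp or llama_cpp_cuda) may be imported per process, because the builds share internal state (llama-cpp-python issue #1575). A minimal sketch of that guard pattern, with names assumed from the traceback rather than copied from the actual source, looks like this:

```python
import importlib
import sys


def load_llama_cpp_variant(name: str):
    """Sketch of the import guard: refuse to import a second
    llama-cpp-python build once another one is already loaded.

    The variant module names below are assumptions based on the
    traceback, not the exact list used by the web UI.
    """
    variants = ("llama_cpp", "llama_cpp_cuda")
    for other in variants:
        if other != name and other in sys.modules:
            # Mixing builds in one process corrupts shared state,
            # so the only safe recovery is a server restart.
            raise Exception(
                f"Cannot import '{name}' because '{other}' is already "
                "imported. See issue #1575 in llama-cpp-python. Please "
                "restart the server before attempting to use a "
                "different version of llama-cpp-python."
            )
    # Re-importing the same variant (or importing the first one) is fine.
    return importlib.import_module(name)
```

This is why the error appears even on a fresh install: if any code path imports llama_cpp before the loader asks for llama_cpp_cuda (which should not normally happen on Apple Silicon, where there is no CUDA build), the guard fires and only a restart clears it.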
System Info
Apple M1 Max, 32 GB RAM, macOS Sonoma 14.6