Describe the bug
When I run "python setup_cuda.py install" from this guide https://github.com/oobabooga/text-generation-webui/wiki/LLaMA-model#4-bit-mode I get this error.
Is there an existing issue for this?
Reproduction
I followed this guide to the letter: https://github.com/oobabooga/text-generation-webui/wiki/LLaMA-model#4-bit-mode
The issue comes with the last command from this part:
mkdir repositories
cd repositories
git clone https://github.com/qwopqwop200/GPTQ-for-LLaMa
cd GPTQ-for-LLaMa
git reset --hard 468c47c01b4fe370616747b6d69a2d3f48bab5e4
python setup_cuda.py install <------- THIS
Screenshot
No response
Logs
In file included from /home/christopher/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/include/c10/util/Half.h:15:
/home/christopher/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/include/c10/util/complex.h:8:10: fatal error: 'thrust/complex.h' file not found
#include <thrust/complex.h>
^~~~~~~~~~~~~~~~~~
29 warnings and 1 error generated when compiling for gfx1030.
error: command '/opt/rocm-5.4.3/bin/hipcc' failed with exit code 1
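For context on the log above: `thrust/complex.h` is a header from NVIDIA's Thrust library, which PyTorch's headers include; under ROCm the equivalent headers are supplied by the rocThrust package. A minimal diagnostic sketch, assuming a default `/opt/rocm*` install layout (the path is an assumption, not taken from this report), to check whether hipcc has any Thrust headers available:

```shell
# Hypothetical check: look for Thrust headers under a default ROCm layout.
# If this prints "Thrust headers not found", installing rocThrust (the HIP
# port of Thrust) is the usual way to provide them; adjust the path to match
# your ROCm version.
ls /opt/rocm*/include/thrust/complex.h 2>/dev/null || echo "Thrust headers not found"
```

This only confirms whether the header exists on disk; it does not verify that hipcc's include search path actually picks it up.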
System Info
OS: Ubuntu 22.10 64 bit
CPU: Intel(R) Xeon(R) CPU E5-2620 v2 @ 2.10GHz
GPU: AMD Radeon RX 6600