I've tried running this project on a Titan RTX on both Windows and Linux, and it fails to run on either operating system. The model llama3-ChatQA-1.5-8B should work on pre-Ampere GPUs like the Titan RTX because the checkpoint's torch_dtype is float16.
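In case it helps narrow this down, here is a minimal sketch of how I'd expect the model to load with an explicit float16 dtype. The Hugging Face model ID and the plain transformers API below are my assumptions, not the project's actual loading code:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nvidia/Llama3-ChatQA-1.5-8B"  # assumed Hugging Face ID; substitute whatever the project actually loads

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # force fp16 so nothing gets cast to bfloat16, which the Titan RTX lacks
    device_map="auto",          # requires `accelerate`; places the weights on the GPU
)
```

Loading this way works for me in a standalone script, which is why I expect the project itself should also run in fp16 on this card.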
System
Titan RTX
Linux Linux hp-z820 6.5.0-44-generic #44~22.04.1-Ubuntu SMP PREEMPT_DYNAMIC Tue Jun 18 14:36:16 UTC 2 x86_64 x86_64 x86_64 GNU/Linux
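If it's useful, this is the quick check I'd run to confirm what the card reports (plain PyTorch calls, not project code); the expected outputs in the comments are assumptions based on the Titan RTX being a Turing, compute-capability-7.5 GPU:

```python
import torch

# Quick capability check on the Titan RTX (Turing, compute capability 7.5)
print(torch.cuda.get_device_name(0))        # e.g. "NVIDIA TITAN RTX"
print(torch.cuda.get_device_capability(0))  # (7, 5) -> pre-Ampere, so no native bfloat16
print(torch.cuda.is_bf16_supported())       # typically False here, depending on PyTorch version
```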
Steps
Logs