Releases · 0cc4m/GPTQ-for-LLaMa
GPTQ-KoboldAI 0.0.6
GPTQ-KoboldAI 0.0.5
Merge pull request #14 from TehVenomm/latestmerge: incorrect bit shift applied to 8-bit models
GPTQ-KoboldAI 0.0.4
2023-05-19-2: Bump version
GPTQ-KoboldAI 0.0.3
2023-05-18-2: Fix setup.py
GPTQ-KoboldAI 0.0.2
Add support for upstream GPTQ CUDA version. Co-authored-by: qwopqwop200 <[email protected]>
GPTQ Python module
2023-05-06-2: Add MPT support
quant_cuda for CUDA 11.8/ROCm 5.4.2
Revert "Add wheel links file for pip". This reverts commit 539af97b13e2b8ab0da6b26733159c22d3fe0963.
quant_cuda for CUDA 11.7
Revert "Add wheel links file for pip". This reverts commit 539af97b13e2b8ab0da6b26733159c22d3fe0963.
2023-04-10
First wheel release