
Compression fails #5

Open
jb-apps opened this issue Mar 7, 2024 · 1 comment

jb-apps commented Mar 7, 2024

Hello, amazing work with Guernika, I love using it.

I have a MacBook Air M1 with 8 GB of RAM, so to run XL models in Guernika I need to compress them. Unfortunately, the conversion fails. After further research, it seems that Guernika Model Converter needs to be updated; see this fix: Fix quantize-nbits flag.
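For reference, assuming the converter follows the same approach as Apple's ml-stable-diffusion scripts, the `--quantize-nbits` option applies coremltools weight palettization to an already-converted Core ML model. Below is a minimal sketch of what such a pass looks like; this is not Guernika's actual converter code, and the model path and the 6-bit setting are placeholders:

```python
import coremltools as ct
from coremltools.optimize.coreml import (
    OpPalettizerConfig,
    OptimizationConfig,
    palettize_weights,
)

# Load an already-converted Core ML model (hypothetical path).
mlmodel = ct.models.MLModel("Unet.mlpackage")

# Cluster each weight tensor to 2**nbits centroids (a 6-bit look-up table here),
# which is the kind of compression a quantize-nbits option exposes.
config = OptimizationConfig(
    global_config=OpPalettizerConfig(mode="kmeans", nbits=6)
)

compressed = palettize_weights(mlmodel, config=config)
compressed.save("Unet_6bit.mlpackage")
```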

I'm currently using version 7.4.1 (1).

Would you provide a new update soon?

Thanks

[Screenshot attached: Screenshot 2024-03-07 at 10 17 15]

jb-apps commented Mar 22, 2024

Hello @GuiyeC, could you at least push the latest code so I can submit a PR that fixes the issue? Cheers
