does not load a model if context size is "too big" #111

Open
dynamiccreator opened this issue Nov 28, 2024 · 0 comments

Comments

@dynamiccreator
I'm trying to get various models to load on Linux with LM Studio version 3.5. They do load, but only if I keep the context size below ~8192.
I'm using CPU-only mode and I have 128 GB of RAM, so memory cannot be the issue for a 32B Q4 model.
7B models can be used at that context size, but as I said, I have plenty of RAM, so that cannot be the issue here either.
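As a sanity check on the RAM claim, here is a rough back-of-envelope estimate (a sketch only; the hyperparameters are assumptions modeled on a typical 32B model such as Qwen2.5-32B, not values reported by LM Studio):

```python
# Rough RAM estimate for a 32B Q4 model at a 32768-token context.
# All hyperparameters below are assumptions for a typical 32B model
# with grouped-query attention; adjust them for the actual model.

n_layers = 64        # transformer layers (assumption)
n_kv_heads = 8       # KV heads under grouped-query attention (assumption)
head_dim = 128       # dimension per head (assumption)
n_ctx = 32768        # requested context size
bytes_per_elem = 2   # fp16 KV cache

# K and V caches: two tensors of shape (n_layers, n_ctx, n_kv_heads, head_dim)
kv_bytes = 2 * n_layers * n_ctx * n_kv_heads * head_dim * bytes_per_elem
kv_gib = kv_bytes / 2**30

weights_gib = 19     # ballpark for ~Q4 weights of a 32B model (assumption)
total_gib = weights_gib + kv_gib

print(f"KV cache: {kv_gib:.1f} GiB, total: {total_gib:.1f} GiB")
# → KV cache: 8.0 GiB, total: 27.0 GiB
```

Even with generous assumptions, the total stays far below 128 GiB of system RAM, which is consistent with the report that memory is not the limiting factor.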

Loading a 32B or even larger model with a context size of 32768 or more works with no issues in vanilla llama.cpp (built from source).
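For reference, this is the kind of llama.cpp invocation that works (a sketch: the model path and thread count are placeholders, and the binary name may differ depending on the build):

```shell
# Hypothetical example: load a 32B Q4 GGUF with a 32768-token context,
# CPU only. Model path and thread count are placeholders.
./llama-cli -m ./models/32b-q4.gguf -c 32768 -t 16 -p "Hello"
```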
