Feature: Ollama integration #256
Comments
This would be a massively beneficial feature. I would gladly offer my help with implementing it.
Echoing the support here. This would have a material impact on uptake as well.
If it's of interest, I have used it locally with LM Studio (it supports any open-source LLM). One caveat: most open-source models have a short context window (32k at best), so large diffs won't work. The process I used:
That should be it. You can set any random OpenAI key just to get it going (for example …). It works for me, but only for small commits. I hope it gets proper support with flags, and hopefully we'll get bigger context windows on open-source LLMs soon. Enjoy.
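For anyone trying the LM Studio route above, here is a minimal sketch of the two settings involved, assuming the tool reads the standard OpenAI environment variables (LM Studio serves an OpenAI-compatible API on port 1234 by default; the variable names and key value below are assumptions, not confirmed flags of this tool):

```shell
# Sketch, assuming the tool honors the standard OpenAI environment variables.
# LM Studio's local server exposes an OpenAI-compatible API, by default at
# http://localhost:1234/v1; the key can be any non-empty placeholder string.
export OPENAI_API_BASE="http://localhost:1234/v1"
export OPENAI_API_KEY="sk-local-placeholder"
```

If the tool takes the base URL via a CLI flag instead of the environment, the same two values apply there.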
I also made this work. Note that …
I have to say, GPT-3.5 works a lot better than a local LLM.
This works for me. Thanks a lot 👍🏼👍🏼
Feature request
I think it would be nice to have an option to run this with a local model through Ollama.
I could implement it, but there seems to have been no recent activity on pull requests, and I'm worried my PR would get rejected. Please let me know if you greenlight this feature.
Why?
GPT is paid; it would be nice to have a free option.
Alternatives
No response
Additional context
No response
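To make the request concrete, here is a hedged sketch of the payload an Ollama integration could send to the local server's `/api/chat` endpoint (served at `http://localhost:11434` by default). The model name, prompt, and helper function name are placeholders for illustration; the actual config surface would be up to the maintainers:

```python
# Sketch of a chat request in the shape Ollama's /api/chat endpoint expects.
# The model name and prompt here are placeholder assumptions.
import json


def build_ollama_request(model: str, prompt: str) -> str:
    """Serialize a single-turn chat request for Ollama's /api/chat."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for one complete response instead of chunks
    }
    return json.dumps(payload)


body = build_ollama_request("llama3", "Summarize this diff: ...")
```

Posting `body` to `http://localhost:11434/api/chat` returns a JSON object whose `message.content` field holds the model's reply.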