Added support for local LLM models via Ollama #30
base: main
Conversation
Merging PyPI packages to vs code extension branch
Merging the current state of vs code UI to vs code extension
Open vsx support in workflow
main.py (Outdated)
```diff
@@ -258,6 +270,7 @@ def perform_non_gui_tasks():


 if __name__ == "__main__":
     empty_workspace()
+    load_dotenv()
```
loaded dotenv
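For context, a minimal sketch of what the added line does at the entry point, assuming the standard python-dotenv API (the key name below is only a placeholder, not necessarily one Sirji uses):

```python
import os

from dotenv import load_dotenv

if __name__ == "__main__":
    # Reads key=value pairs from a .env file in the current working directory
    # into os.environ; by default it does not override variables that are
    # already set in the environment.
    load_dotenv()

    # Placeholder key name, for illustration only.
    print("OPENAI_API_KEY present:", bool(os.getenv("OPENAI_API_KEY")))
```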
requirements.txt (Outdated)
Added python-dotenv
Hello @devparanjay - please refer to the above-quoted message. I just now saw that you have pushed new commits. As your branch has become outdated, I suggest starting afresh and making only the necessary changes. As I mentioned earlier, the code has moved to a new location.
Hey, yes, sorry - the comments were for my benefit since I did this a while back and was just checking which changes I made where. I'll try to add this to the new code today or in the next couple of days. It does look like starting from scratch would be the way to go.
Enhancement: changes to make Sirji compatible with Windows/Linux
Hello @kedarchandrayan, Sirji should now ideally work with Ollama. I've updated the README and added the relevant settings to the Environment Variables section in views/chat.html and views/chat.js for the extension. Please review. Git seems to be counting all of your commits again for some reason (probably because of forked-branch merges from my repo), but here's a list of the relevant files for your convenience -
Hello @devparanjay, Currently the total number of file changes is showing as 214, which is not possible. As you mentioned, Git does not seem to be identifying the changes correctly. For ease of review, I would request that you fork again and make only the necessary changes to the specific files. This will help us avoid committing any unintentional changes. Please create a fresh PR; we can close the current one later by referencing it in the new PR.
Hello @kedarchandrayan, I've opened a new PR #110; please close this PR if everything looks fine for review in the new one. I had forced a merge of two branches in my forked repo.
Summary
Added support for Ollama so that local LLM models can be used.
FOSS LLMs are a necessity for the future of open-source artificial intelligence. This is an attempt to add that capability to Sirji (although the work was done on Windows, and Sirji currently only runs on macOS).
A .env file is used to set the OpenAI API key, the Ollama model, and whether Ollama should be used.
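For illustration only, the .env contents this implies might look like the comment block below; the key names are my guesses rather than the ones actually used in the PR, and the Python snippet simply loads and inspects them:

```python
import os

from dotenv import load_dotenv

# Example .env contents (key names assumed, not taken from the PR):
#   OPENAI_API_KEY=sk-...
#   SIRJI_USE_OLLAMA=true
#   SIRJI_OLLAMA_MODEL=llama3
load_dotenv()

use_ollama = os.getenv("SIRJI_USE_OLLAMA", "false").lower() == "true"
ollama_model = os.getenv("SIRJI_OLLAMA_MODEL", "llama3")
openai_api_key = os.getenv("OPENAI_API_KEY")

print(f"use_ollama={use_ollama}, ollama_model={ollama_model}, "
      f"openai_key_set={openai_api_key is not None}")
```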
What are the specific steps to test this change?
What kind of change does this PR introduce?
Make sure the PR fulfills these requirements:
Other information:
I wasn't able to test this since I use a Windows machine.
I use Ruff for formatting in my IDE, which explains the formatting changes in the code.
The important change-related files are the new .env, sirji/config/model.py, every place where the OpenAI API key was read, and everywhere OpenAI completions were called with a model set (see the sketch below).
(Unfortunately, I did not read the PR guidelines earlier and skipped the "feature request issue" step; apologies for not raising one before opening the PR.)
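To illustrate the point about the completion call sites, here is a rough sketch; the helper and variable names are hypothetical, and the real sirji/config/model.py may look quite different. It relies on the fact that Ollama exposes an OpenAI-compatible endpoint on localhost:11434/v1, so the same client library can be pointed at either backend:

```python
import os

from openai import OpenAI


def get_client_and_model() -> tuple[OpenAI, str]:
    """Hypothetical helper in the spirit of sirji/config/model.py."""
    if os.getenv("SIRJI_USE_OLLAMA", "false").lower() == "true":
        # Ollama's OpenAI-compatible API ignores the key, but the client requires one.
        client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")
        return client, os.getenv("SIRJI_OLLAMA_MODEL", "llama3")
    # Placeholder default model; the project may use a different one.
    return OpenAI(api_key=os.getenv("OPENAI_API_KEY")), "gpt-4-turbo"


def complete(prompt: str) -> str:
    # Every call site that previously hard-coded an OpenAI model would instead
    # route through the configured client and model name.
    client, model = get_client_and_model()
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```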