Support Ollama and other LLMs #84
Hi @iSevenDays, and thank you for raising the issue; that's a fair point. While motleycrew at its core supports Ollama and various cloud providers, the research agent project was built a while ago, before we concerned ourselves with custom LLMs. I'll get back to you today with a fix.
I merged the version that allows you to specify a custom LLM and embeddings model. If you have any further questions, feel free to comment :)
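For anyone reading along, a minimal sketch of what "specifying a custom LLM and embeddings model" could look like with a local Ollama model, using LangChain's community integrations. The objects would then be passed wherever the research agent example accepts an LLM and embeddings model; those exact parameter names aren't shown in this thread, so treat that part as an assumption.

```python
# Hedged sketch: building a local Ollama chat model and embeddings with
# LangChain's community integrations. Assumes `ollama serve` is running
# locally and langchain-community is installed.
from langchain_community.chat_models import ChatOllama
from langchain_community.embeddings import OllamaEmbeddings

llm = ChatOllama(model="llama3.1", temperature=0)  # local chat model
embeddings = OllamaEmbeddings(model="llama3.1")    # local embeddings

# These would then be handed to the research agent example wherever it
# accepts an LLM / embeddings argument (argument names not confirmed here).
```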
Hi @whimo, thanks for the update! However, I still get an error saying an OpenAI API key is required when executing the code below. Maybe you can help me figure out what I should do differently?
Hi @iSevenDays, thanks for the PR! We don't auto-guess the LLM provider in init_llm (yet), so you have to specify it explicitly (see https://motleycrew.readthedocs.io/en/latest/choosing_llms.html#providing-an-llm-to-an-agent); in your case, that would be the Ollama provider.
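A hedged sketch of what specifying the provider explicitly for init_llm might look like. The import paths, enum members, and keyword names below are assumptions based on the linked docs page, not verbatim from this thread, and may differ from the actual motleycrew API:

```python
# Hedged sketch: calling init_llm with an explicit provider instead of
# relying on auto-detection. Names are assumptions based on
# https://motleycrew.readthedocs.io/en/latest/choosing_llms.html.
from motleycrew.common import LLMFramework, LLMFamily
from motleycrew.common.llms import init_llm

llm = init_llm(
    llm_framework=LLMFramework.LANGCHAIN,  # build a LangChain-compatible model
    llm_family=LLMFamily.OLLAMA,           # assumed enum member for Ollama
    llm_name="llama3.1",                   # model served by a local Ollama instance
)
```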
I like the idea of your project, and I hope you add support for Ollama and other LLMs.
I've just checked the project to see if I could use it, and unfortunately I can't.
I followed your example https://motleycrew.readthedocs.io/en/latest/examples/research_agent.html and wanted to try it out with llama3.1 and other LLMs.
Here are examples of why it is not possible right now:
Related example
OpenAI is hard-coded all over the code.