Add support for LiteLLM within SWE-agent so people can use any open-source model they want for the agent, rather than being limited to the OpenAI and Anthropic APIs.

Edit: Switched the request from Ollama to LiteLLM, since LiteLLM is a more abstract layer than Ollama and supports a wider range of both closed-source and open-source LLMs, giving the community more freedom over which models they use.
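The abstraction LiteLLM offers comes largely from its "provider/model" naming convention: the same `litellm.completion(model=..., messages=...)` call routes to OpenAI, Anthropic, Ollama, and others based on the model string's prefix. The sketch below illustrates that convention with a hypothetical helper (it is not part of LiteLLM or SWE-agent; in real code you would pass the full string straight to `litellm.completion`):

```python
# Hypothetical helper illustrating LiteLLM's "provider/model" routing
# convention. In practice LiteLLM does this routing internally, e.g.:
#   litellm.completion(model="ollama/llama2", messages=[...])
#   litellm.completion(model="claude-3-opus-20240229", messages=[...])

def split_model_name(name: str) -> tuple[str, str]:
    """Split a LiteLLM-style model string into (provider, model).

    'ollama/llama2' -> ('ollama', 'llama2'); a bare name such as
    'gpt-4' is treated as the default 'openai' provider here, mirroring
    how bare model names are commonly interpreted.
    """
    provider, sep, model = name.partition("/")
    if not sep:
        return "openai", name
    return provider, model
```

Because the agent only ever hands LiteLLM a single model string, swapping a hosted API for a local Ollama model becomes a one-line configuration change rather than a new backend.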
Seconding this, since this project might come in handy for me in the near future. There are also ideas about LiteLLM being able to call multiple models from Ollama, which is worth anticipating. (AutoGen and other tools have similar support as well.)