openai compatible provider #1
Put the base API URL in a customization variable instead of hard-coding it. This should let users use alternate OpenAI-compatible providers. Fixes #1.
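Presumably the change looks something like the following. This is a minimal sketch: only the robby-api-url name is confirmed in this thread; the default value and the :group are assumptions, not robby's actual code.

    ;; Sketch of an API-base customization variable; the default value
    ;; and customization group are guesses, not robby's actual source.
    (defcustom robby-api-url "https://api.openai.com"
      "Base URL of the OpenAI-compatible API to send requests to."
      :type 'string
      :group 'robby)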
Oh, interesting idea. It was hard-coded, but I just added a commit to make the API URL customizable: 34cc413. To use it, put this in your config:

    (setq robby-api-url "https://alternate-api-provider")

Let me know if that works. What provider are you using?
I'll try with together.ai and Ollama (which now supports some of the OpenAI API endpoints).
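For a local Ollama server, the setting would presumably look something like the sketch below. The URL is an assumption based on Ollama's default port and its OpenAI-compatible routes under /v1, not something quoted from this thread:

    ;; Assumed URL: Ollama listens on 11434 and exposes OpenAI-style
    ;; endpoints under /v1. Whether the /v1 suffix belongs here depends
    ;; on how robby builds its request paths.
    (setq robby-api-url "http://localhost:11434/v1")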
I'm trying a simple setup with litellm, but I get a '405' response. Otherwise this setup works out of the box with the other packages I've been trying.
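A hedged guess at what such a setup looks like, assuming a litellm proxy running locally on its default port (the URL is illustrative, not quoted from the thread):

    ;; litellm's proxy listens on port 4000 by default;
    ;; adjust the host and port to match your deployment.
    (setq robby-api-url "http://localhost:4000")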
Thanks! What other providers did you try? I'm noticing slight variations between providers in auxiliary things like fetching and parsing the list of available models, and in error response formats. I started working on something that allows plugging in small variations like that for specific providers. It will be a week or two before I get back to this, however. I'll sort out the '405' response from litellm then.
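One way such per-provider variation points could be expressed is sketched below. Every name here is hypothetical and none of it is robby code; the idea is just that provider quirks live in data rather than scattered conditionals:

    ;; Hypothetical sketch: map each provider to its model-listing path
    ;; and an error-parsing function (the parser symbols are placeholders).
    (defvar my-provider-quirks
      '((openai  . (:models-path "/v1/models" :parse-error my-parse-openai-error))
        (litellm . (:models-path "/v1/models" :parse-error my-parse-litellm-error)))
      "Per-provider API variations, keyed by provider symbol.")

    (defun my-provider-quirk (provider key)
      "Look up KEY in the quirks entry for PROVIDER."
      (plist-get (alist-get provider my-provider-quirks) key))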
I have only one Cohere model plugged in so far via litellm.
Hi there! Wondering if robby can be set up to work with OpenAI-compatible providers, by being able to set a base_url, for example...