Is it possible to have OpenRouter as the LLM API source? #46
Comments
OpenRouter appears to support the OpenAI client (https://openrouter.ai/docs/quick-start). This means you should be able to enter the OpenRouter base URL and API key in the OpenAI section of Cannoli's settings and have it work. Let me know how it goes!
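For reference, this is roughly what that OpenAI-compatible setup looks like outside of Cannoli. A minimal sketch using the Python openai client; the API key placeholder and model id are just illustrative examples, not values taken from this thread:

```python
# Minimal sketch: talking to OpenRouter through the standard OpenAI client.
# The same base URL and API key are what you would paste into Cannoli's OpenAI settings.
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # OpenRouter's OpenAI-compatible endpoint
    api_key="sk-or-...",                      # your OpenRouter API key (placeholder)
)

response = client.chat.completions.create(
    model="openai/gpt-4o-mini",  # example model id; OpenRouter uses provider/model names
    messages=[{"role": "user", "content": "Hello from Cannoli!"}],
)
print(response.choices[0].message.content)
```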
Well, it turns out that if I use OpenAI models, Cannoli works fine in most cases, but if I use other models (e.g. Gemini 1.5 Flash, my favorite model for its long output window and low price), small issues keep occurring. I assume this is because I am using OpenRouter via the "OpenAI config" in the settings, but Gemini models have different API input/output formats...? The color thing (for both nodes and arrows, it never really works as expected from the Cannoli College examples) is especially frustrating.
Can you give more details on the small issues you are experiencing? An OpenAI-compatible client means that OpenRouter should be handling all conversions from an incoming OpenAI-format request to the outgoing model that you choose...
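To illustrate that conversion point (a sketch only, not Cannoli's actual code): the request keeps the same OpenAI shape regardless of the underlying model, and OpenRouter is expected to translate it for each provider. The model ids below are illustrative examples:

```python
# Same OpenAI-format request sent to two different underlying models.
# OpenRouter should translate to/from each provider's native API, so the
# response shape stays the same on the client side.
from openai import OpenAI

client = OpenAI(base_url="https://openrouter.ai/api/v1", api_key="sk-or-...")

for model in ("openai/gpt-4o-mini", "google/gemini-flash-1.5"):  # example model ids
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "Reply with one word."}],
    )
    print(model, "->", reply.choices[0].message.content)
```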
For example, a note node cannot be directly referenced, and all references have to be in {{[[note]]}} format (unlike what is shown in the Cannoli College examples), and sometimes it is color dependent... which I have never really figured out.
Thanks for those examples, I'll try to replicate these. You said that these are only happening when using non-OpenAI providers through OpenRouter? Or are these errors happening in other cases too?
Choice arrows may be unreliable depending on the model chosen in OpenRouter, but everything else should be unrelated to LLMs at all and may just be real bugs.
I am a heavy OpenRouter user and I really hope this can come true.