Is it possible to have openrouter as the LLM API source? #46

Open · chenyuz3 opened this issue Aug 6, 2024 · 6 comments

chenyuz3 commented Aug 6, 2024

I am a heavy OpenRouter user and I really hope this can happen.

cephalization (Member) commented

OpenRouter appears to support the OpenAI client: https://openrouter.ai/docs/quick-start

This means you should be able to enter the OpenRouter base URL and API key within the OpenAI section of Cannoli's settings and have it work.

Let me know how it goes!
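
For anyone trying this, here is a minimal sketch of what that configuration amounts to under the hood, using the official openai SDK pointed at OpenRouter's OpenAI-compatible endpoint. The model slug and the OPENROUTER_API_KEY environment variable are placeholders; any model listed on openrouter.ai should work:

```ts
import OpenAI from "openai";

// Point the standard OpenAI client at OpenRouter's OpenAI-compatible
// endpoint; this is effectively what Cannoli does when you set a
// custom base URL and API key in the OpenAI settings section.
const client = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: process.env.OPENROUTER_API_KEY,
});

async function main() {
  const completion = await client.chat.completions.create({
    // Example OpenRouter model slug; swap in any model you like
    model: "google/gemini-flash-1.5",
    messages: [{ role: "user", content: "Say hello" }],
  });
  console.log(completion.choices[0].message.content);
}

main();
```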

chenyuz3 (Author) commented

Well, it turns out that if I use OpenAI models, Cannoli works fine in most cases, but if I use other models (e.g. Gemini 1.5 Flash, my favorite model for its long output window and low price), small issues keep occurring. I assume this is because I am using OpenRouter via the "OpenAI config" in the settings, but Gemini models have different API input/output configs...? The color behavior (for both nodes and arrows, it never really works as expected from the Cannoli College examples) is especially frustrating.

cephalization (Member) commented

Can you give more details on the small issues you are experiencing? An OpenAI-compatible client means that OpenRouter should be handling all conversions from an incoming OpenAI configuration to the outgoing model that you choose...

chenyuz3 (Author) commented

E.g. a note node cannot be directly referenced:

(screenshot)

and all references have to be in {{[[note]]}} format (unlike what is shown in the Cannoli College examples):

(screenshot)

And sometimes it is color dependent... which I have never really figured out.

Functions like choice arrows are not working either:

(screenshot)

blindmansion (Member) commented

Thanks for those examples, I'll try to replicate these.

You said these are only happening when using non-OpenAI providers through OpenRouter? Or are these errors happening in other cases too?

cephalization (Member) commented

Choice arrows may be unreliable depending on the model chosen in OpenRouter, but everything else should be unrelated to LLMs entirely and may just be real bugs.
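
For context on why choice arrows can be model-dependent: if Cannoli implements them via OpenAI-style function calling (an assumption here; the actual internals are not shown in this thread), the request looks roughly like the sketch below. Some models routed through OpenRouter ignore tool_choice or reply with plain text instead of a structured tool call, which would break branch selection. The choose function and its schema are hypothetical illustrations, not Cannoli's real code:

```ts
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: process.env.OPENROUTER_API_KEY,
});

async function pickBranch() {
  const response = await client.chat.completions.create({
    model: "google/gemini-flash-1.5", // example slug only
    messages: [
      { role: "user", content: "Should we summarize or expand the note?" },
    ],
    // Hypothetical schema: ask the model to pick exactly one branch
    tools: [{
      type: "function",
      function: {
        name: "choose",
        description: "Select one outgoing branch",
        parameters: {
          type: "object",
          properties: {
            choice: { type: "string", enum: ["summarize", "expand"] },
          },
          required: ["choice"],
        },
      },
    }],
    tool_choice: { type: "function", function: { name: "choose" } },
  });

  // Models with weak or missing tool support may return plain text
  // here instead of a tool call, which is one way a choice arrow
  // could silently fail.
  const call = response.choices[0].message.tool_calls?.[0];
  if (!call) throw new Error("Model did not return a tool call");
  console.log(JSON.parse(call.function.arguments).choice);
}

pickBranch();
```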
