allow json values as function call arguments #274
base: main
Conversation
Changed from an enum to a function, which always deserializes into a string.
Thank you for the PR, it is always welcome! I'd recommend adding this to community crates dedicated to LLAMA instead; unfortunately, for async-openai this is out of scope. For more information, please see https://github.com/64bit/async-openai?tab=readme-ov-file#contributing
@64bit, thank you for your kind reply.
There's an idea on how to support it while keeping the scope limited to OpenAI: #280 (comment). Please consider it for contributions.
Currently, only strings (which may themselves contain JSON) are accepted as function call arguments.
Models like LLAMA, when deployed with the Hugging Face Text Generation API, are accessible through the same API as OpenAI, but they return a JSON object instead of a string.
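A minimal sketch of what such a deserializer could look like, assuming serde/serde_json and an `arguments` field on the function call struct; the names and struct shape here are illustrative, not the actual PR code. The idea is to accept either a JSON string or any other JSON value and normalize it to a `String`:

```rust
use serde::{Deserialize, Deserializer};
use serde_json::Value;

/// Accept either a JSON string or any other JSON value for the
/// `arguments` field; non-string values are re-serialized to a string.
fn deserialize_arguments<'de, D>(deserializer: D) -> Result<String, D::Error>
where
    D: Deserializer<'de>,
{
    let value = Value::deserialize(deserializer)?;
    match value {
        Value::String(s) => Ok(s),
        other => serde_json::to_string(&other).map_err(serde::de::Error::custom),
    }
}

/// Hypothetical function call struct for illustration only.
#[derive(Debug, Deserialize)]
struct FunctionCall {
    name: String,
    #[serde(deserialize_with = "deserialize_arguments")]
    arguments: String,
}
```

With this approach, both `{"arguments": "{\"x\": 1}"}` and `{"arguments": {"x": 1}}` deserialize to the same `arguments: String`, so existing callers that expect a string keep working.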