Unhandled exception (JSON::ParseException) in AI Integration #518
I had a short look at the source code (first time with Crystal :D). My first guess would be that the prompt forcing it to produce JSON is not sufficient. I had a quick look at the Ollama docs, and it seems there is a `format` option to force JSON output as well. So my first wild guess would be, in src/llm/ollama/ollama.cr, add:
to
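For reference, Ollama's HTTP API does document a `format` parameter on its generate endpoint that constrains the model to emit valid JSON. A minimal sketch of the request body in Python (the project itself is Crystal; the model name and prompt here are just examples, not taken from the noir code):

```python
import json

# Shape of the body Ollama's /api/generate endpoint accepts.
# "format": "json" forces the model to return JSON only, which is the
# behavior the prompt alone apparently fails to guarantee.
payload = {
    "model": "llama3.2",                    # example: any locally pulled model
    "prompt": "List the endpoints as JSON.",  # example prompt
    "format": "json",                        # constrain output to valid JSON
    "stream": False,
}

body = json.dumps(payload)
print(body)
```

The same `format` field would need to be added wherever the Crystal code builds its request body for Ollama.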
Ok, did a short test with the suggested changes from above. It feels like this gets further, but it still fails with a new casting exception.
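A common cause of both the parse and the casting failures is that the model wraps its JSON in markdown fences or surrounds it with prose, so the raw response is not directly parseable. A tolerant extraction step can be sketched as follows (illustrative Python, not the project's Crystal code; the helper name and regex are assumptions):

```python
import json
import re

def extract_json(text: str):
    """Strip markdown code fences and surrounding prose, then parse.

    Illustrative only: this helper and its regex are assumptions,
    not code from the noir project.
    """
    # Pull the body out of a ``` / ```json fence if one is present.
    fenced = re.search(r"`{3}(?:json)?\s*(.*?)`{3}", text, re.DOTALL)
    candidate = fenced.group(1) if fenced else text
    # Otherwise, start at the first brace/bracket to skip leading prose.
    starts = [i for i in (candidate.find("{"), candidate.find("[")) if i != -1]
    return json.loads(candidate[min(starts):] if starts else candidate)

print(extract_json('LLM said: {"status": "ok"}'))
```

Combined with the `format` option above, this kind of defensive parsing usually makes LLM JSON handling robust enough for downstream casting.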
Hi @schniggie, thank you for pointing out this issue. I'm currently testing the AI integration code. If you have any PRs to suggest or improvements to propose, please feel free to share them; we'd be happy to review them positively. Thank you again for bringing this issue to our attention!
Describe the bug
Using the AI Integration results in an Unhandled exception (JSON::ParseException).
To Reproduce
Steps to reproduce the behavior:
Details
Here also compiled with all debug symbols:
Versions
Additional context
I tested all kinds of different models: llama3.2, llama3.1, phi4, deepseek-chat.