
fix(openai): parsing of streaming completions #1844

Merged 1 commit into master from nicolas/fix-openai-stream on Oct 17, 2023

Conversation

Eisfunke
Contributor

streamCompletion previously failed often (though not always), because parseResponseChunk assumed that a chunk of the input stream from the OpenAI API would only ever contain a single message. However, at the start of a completion the API often sends two data messages in one chunk, separated by two newlines (which the format allows), and such a chunk fails to parse as JSON. Because the unparsed chunk then stayed in the buffer, it was prepended to every subsequent chunk, so those chunks failed to parse as well, and in the end no messages were parsed at all.
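
Schematically (payload contents abbreviated, not an exact transcript of the API output), such a chunk looks like this; treated as a single JSON document it is invalid:

```
data: {"choices":[{"delta":{"content":"Hel"}, ...}]}

data: {"choices":[{"delta":{"content":"lo"}, ...}]}
```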

I fixed this by re-chunking the stream into lines with Streams.lines, and added parsing logic for comments, multi-line messages, and messages separated by double newlines, as specified in the Server-Sent Events spec the API uses here (see the sketch after the links below). The API currently doesn't seem to send comments and only ever sends single-line messages, but should that change, IHP can now handle it.

https://platform.openai.com/docs/api-reference/chat/create#chat/create-stream
https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#event_stream_format
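
For illustration, here is a minimal sketch of the line-based approach, assuming the lines have already been produced by something like Streams.lines and decoded to Text. The module and function names are made up for this example; this is not the actual IHP code. Per the SSE format, data: lines are collected into the current event, : comment lines are ignored, and a blank line marks the end of an event:

```haskell
{-# LANGUAGE OverloadedStrings #-}

-- Hypothetical sketch, not the IHP implementation: group SSE lines into events.
module SSESketch where

import Data.Text (Text)
import qualified Data.Text as Text

-- Turn a list of lines (e.g. produced by Streams.lines and decoded to Text)
-- into the data payloads of complete SSE events.
parseEvents :: [Text] -> [Text]
parseEvents = go []
  where
    -- 'pending' collects the data lines of the event currently being built.
    go _pending [] = []                               -- end of stream: an unterminated event is dropped
    go pending (line : rest)
      | Text.null line             = emit pending (go [] rest)  -- blank line ends the current event
      | ":" `Text.isPrefixOf` line = go pending rest            -- comment line, ignore
      | otherwise =
          case Text.stripPrefix "data:" line of
            Just payload -> go (pending ++ [Text.stripStart payload]) rest
            Nothing      -> go pending rest                     -- other SSE fields are ignored here

    -- Multi-line data fields are joined with newlines, as the SSE spec requires.
    emit []      more = more
    emit pending more = Text.intercalate "\n" pending : more
```

Applied to the lines of the chunk shown above, this yields two separate payloads that each parse as JSON, instead of one unparsable blob.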

mpscholten merged commit e66f966 into master on Oct 17, 2023
2 checks passed
mpscholten deleted the nicolas/fix-openai-stream branch on October 17, 2023 at 17:42