
fix: handle multiple embedding events for llama-index #1166

Merged
merged 1 commit into main from multiple-embedding-events
Dec 12, 2024

Conversation

RogerHYang (Contributor)

resolves #1139
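The linked issue describes data loss when a single span produces multiple embedding events. A likely failure mode (an illustration only, not the actual patch — the helper names and attribute keys below are hypothetical) is that each event flattens its embeddings into span attributes with indices restarting at zero, so a later event overwrites an earlier one. Carrying a running offset across events avoids the collision:

```python
# Hypothetical sketch: flattening embedding events into indexed span
# attributes. Names like "embedding.embeddings.{i}.embedding.text" are
# illustrative, not the instrumentation's real keys.

def flatten_event(texts: list[str], start_index: int = 0) -> dict[str, str]:
    """Flatten one embedding event, offsetting indices by start_index
    so events within the same span do not overwrite each other."""
    return {
        f"embedding.embeddings.{i}.embedding.text": text
        for i, text in enumerate(texts, start=start_index)
    }

span_attributes: dict[str, str] = {}
next_index = 0
# Two embedding events arriving on the same span:
for event_texts in [["doc a", "doc b"], ["doc c"]]:
    span_attributes.update(flatten_event(event_texts, start_index=next_index))
    next_index += len(event_texts)

# All three embeddings survive under distinct indices 0, 1, 2.
```

Without the `start_index` offset, the second event would reuse index 0 and clobber the first event's attributes, which matches the "data loss in batch embedding processing" symptom reported in #1139.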

@RogerHYang RogerHYang requested a review from a team as a code owner December 12, 2024 15:43
@dosubot dosubot bot added the size:M This PR changes 30-99 lines, ignoring generated files. label Dec 12, 2024
@RogerHYang RogerHYang force-pushed the multiple-embedding-events branch from 583e77c to 02910ec on December 12, 2024 15:44
@RogerHYang RogerHYang merged commit 0cef233 into main Dec 12, 2024
3 checks passed
@RogerHYang RogerHYang deleted the multiple-embedding-events branch December 12, 2024 16:06
Labels
size:M This PR changes 30-99 lines, ignoring generated files.
Projects
Status: Done
Development

Successfully merging this pull request may close these issues.

[bug] Data Loss in Batch Embedding Processing for llamindex
2 participants