@YeonghyeonKO, your fix for this was released in v2.18.0 of the client.
Is your feature request related to a problem?
Since OpenSearch 2.16, ingest processors that inherit from AbstractBatchingProcessor support batch inference (opensearch-project/neural-search#820). However, the OpenSearch Java client currently does not support the optional batch_size parameter when defining a text_embedding processor.
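For reference, the REST API already accepts the parameter directly in the processor configuration; this is what cannot currently be expressed through the Java client's typed builders. A minimal sketch, where the pipeline name, model ID, and field names are placeholders:

```json
PUT /_ingest/pipeline/my-embedding-pipeline
{
  "description": "Embeds the text field in batches",
  "processors": [
    {
      "text_embedding": {
        "model_id": "<your-model-id>",
        "field_map": {
          "text": "text_embedding"
        },
        "batch_size": 5
      }
    }
  ]
}
```

With batch_size set, documents ingested via _bulk are grouped into batches of that size for inference, instead of one model invocation per document.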
What solution would you like?