feat: retrieve conversation metadata when loading conversation #27
Conversation
Signed-off-by: SuZhou-Joe <[email protected]>
7c39d6a to 5ca502e
const [interactionsResp, conversation] = await Promise.all([
  this.client.transport.request({
    method: 'GET',
    path: `/_plugins/_ml/memory/conversation/${sessionId}/_list`,
NIT: #22 (comment)
Done for this.
})) as ApiResponse<{
  interactions: Interaction[];
}>;
const [interactionsResp, conversation] = await Promise.all([
OpenSearch Dashboards supports batching concurrent requests when courier:batchSearches is enabled, which makes the application use _msearch.
Does the ML Commons plugin support batch requests? Batching these two requests together, instead of making two requests and waiting for two responses, could potentially reduce the total round-trip time.
Thanks kavilla. I do not think the current opensearchClient has a mechanism for batching concurrent requests to a backend plugin, and ML Commons only exposes APIs to retrieve the data we need. It is also better that Node.js has no knowledge of how the backend stores these data.
As for the round trip, I think it will be fine: we send the two requests in parallel, so we only pay for one round-trip time.
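The parallel pattern discussed here can be sketched as follows. Note that `mockClient`, its simulated responses, and `loadConversation` are hypothetical stand-ins for the plugin's real transport client, used only to illustrate that `Promise.all` issues both requests at once:

```typescript
interface Interaction {
  input: string;
  response: string;
}

// Hypothetical mock of the OpenSearch transport client; the real code
// calls this.client.transport.request.
const mockClient = {
  transport: {
    request: async (opts: { method: string; path: string }): Promise<{ body: any }> => {
      // Simulated backend: interactions for the _list path,
      // conversation metadata otherwise.
      if (opts.path.endsWith('/_list')) {
        return { body: { interactions: [{ input: 'hi', response: 'hello' }] } };
      }
      return { body: { name: 'My conversation', create_time: '2024-01-01T00:00:00Z' } };
    },
  },
};

async function loadConversation(sessionId: string) {
  // Both requests are issued immediately; Promise.all resolves once both
  // complete, so the total wait is roughly one round trip, not two.
  const [interactionsResp, conversation] = await Promise.all([
    mockClient.transport.request({
      method: 'GET',
      path: `/_plugins/_ml/memory/conversation/${sessionId}/_list`,
    }),
    mockClient.transport.request({
      method: 'GET',
      path: `/_plugins/_ml/memory/conversation/${sessionId}`,
    }),
  ]);
  return { interactionsResp, conversation };
}
```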
createdTimeMs: Date.now(),
updatedTimeMs: Date.now(),
title: conversation.body.name,
createdTimeMs: +new Date(conversation.body.create_time),
is it still Ms?
+new Date(xxx) is milliseconds, but do we need to convert to an epoch, or should we use a string? The UI doesn't display an epoch number. I originally wrote this because I thought the backend would return an epoch.
An epoch is easy to convert into a string or a Date in both the browser and Node.js. With a string type, it is hard to tell which format is being used from the type declaration alone.
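A minimal sketch of the conversion being discussed; the `createTime` value is an invented example, not actual backend output:

```typescript
// `+new Date(x)` coerces a Date to its epoch value in milliseconds,
// which converts back to a Date or an ISO string identically in the
// browser and in Node.js.
const createTime = '2024-01-01T00:00:00Z'; // example ISO string
const epochMs: number = +new Date(createTime);
const backToIso: string = new Date(epochMs).toISOString();
```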
@@ -4,6 +4,7 @@
 */

import { ApiResponse } from '@opensearch-project/opensearch/.';
import { TransportRequestPromise } from '@opensearch-project/opensearch/lib/Transport';
nit: I think ApiResponse can go in this import statement as well.
Yes, done for that.
const messageParserRunner = new MessageParserRunner(this.messageParsers);
const finalInteractions: Interaction[] = [...session.body.interactions];
const finalInteractions: Interaction[] = [...interactionsResp.body.interactions];
Is the spread needed, or is this just const finalInteractions: Interaction[] = interactionsResp.body.interactions;?
No, thanks for catching.
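A small sketch of the distinction behind this nit; all names here are illustrative, not from the plugin:

```typescript
interface Interaction {
  input: string;
  response: string;
}

// Spreading makes a shallow copy, while plain assignment aliases the
// same array. The copy only matters if the result is later mutated
// independently of the source.
const source: Interaction[] = [{ input: 'q', response: 'a' }];
const copied: Interaction[] = [...source]; // new array object
const aliased: Interaction[] = source;     // same array object

copied.push({ input: 'q2', response: 'a2' });
// `source` still has one element; a push through `aliased` would be
// visible on `source`.
```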
Signed-off-by: SuZhou-Joe <[email protected]>
Welcome to Codecov 🎉
Once merged to your default branch, Codecov will compare your coverage reports and display the results in this comment. Thanks for integrating Codecov - We've got you covered ☂️
Signed-off-by: SuZhou-Joe <[email protected]>
Signed-off-by: SuZhou-Joe <[email protected]>
* feat: retrieve conversation metadata when loading conversation
* feat: change /_plugins/_ml to a constant
* feat: change import
* feat: optimize code

Signed-off-by: SuZhou-Joe <[email protected]>
Description
Issues Resolved
List any issues this PR will resolve, e.g. Closes [...].
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
For more information on following Developer Certificate of Origin and signing off your commits, please check here.