So I have been thinking: Google Chat has a feature that summarizes past conversations. Could we use the same idea? If the bot has explained or said something long, we could feed the conversation back to the model as a separate session, get a summary, and replace parts of our context with the smaller version, either in long-term memory (LTM) or in the current prompt.
Even if current models do not summarize well, future ones will. And if the maximum context size grows, this feature will effectively make it larger still.
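A rough sketch of what this could look like, assuming a generic `summarize` callable that wraps whatever model call the bot already makes (the function name, message format, and thresholds below are placeholders, not anything in the codebase):

```python
from typing import Callable, Dict, List

Message = Dict[str, str]  # e.g. {"role": "user", "content": "..."}

def compress_history(
    messages: List[Message],
    summarize: Callable[[str], str],  # any LLM call: prompt in, summary out
    max_messages: int = 20,
    keep_recent: int = 6,
) -> List[Message]:
    """Once the history grows past a budget, replace the older messages
    with a single summary message produced in a separate summarization call."""
    if len(messages) <= max_messages:
        return messages
    older, recent = messages[:-keep_recent], messages[-keep_recent:]
    transcript = "\n".join(f"{m['role']}: {m['content']}" for m in older)
    summary = summarize(
        "Summarize the conversation below as briefly as possible, "
        "keeping every fact, decision, and open question:\n\n" + transcript
    )
    return [
        {"role": "system", "content": f"Summary of earlier conversation: {summary}"}
    ] + recent
```

The same compressed block could be written to LTM instead of (or as well as) being spliced back into the prompt.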
I believe that's something LangChain supports. It's a very good approach, not just for the reason you mentioned (managing our context budget) but because it'll also limit the filler text that enters the system.
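For reference, the classic LangChain API (pre-1.0 releases) exposed this as `ConversationSummaryMemory`; newer releases may have moved or renamed these classes, so treat this as an illustrative sketch rather than the exact import paths we'd use:

```python
from langchain.chat_models import ChatOpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationSummaryMemory

llm = ChatOpenAI(temperature=0)

# Each turn is folded into a running summary instead of being kept verbatim,
# so the prompt stays small no matter how long the conversation runs.
memory = ConversationSummaryMemory(llm=llm)
chain = ConversationChain(llm=llm, memory=memory)

chain.predict(input="Explain how our plugin system loads modules.")
chain.predict(input="Now give me that in one sentence.")
```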