Interesting Idea to increase usable context. #10

Open

USBhost opened this issue Apr 5, 2023 · 1 comment

Comments


USBhost commented Apr 5, 2023

So I have been thinking: Google's Gchat has a feature that summarizes past conversations. Could we use this idea here? For example, if the bot explained or said something long, could we feed that conversation back to the model as a separate session to get a summary, and then replace some of our context with these smaller versions, either for LTM or for the current prompt?

Even if our current models do not summarize well, future ones will. And if the maximum context size grows, this feature will effectively make it larger still.

wawawario2 (Owner) commented Apr 8, 2023

I believe that's something langchain supports. It's a very good approach, not just for the reasons you mentioned (managing our context budget), but also because it'll limit the filler text that enters the system.
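
For anyone picking this up, here is a minimal, framework-agnostic sketch of the summarize-and-replace loop being discussed. The `llm` callable, `TOKEN_BUDGET`, `KEEP_RECENT`, and the helper names are placeholders for illustration, not anything that exists in this repo or in langchain:

```python
from typing import Callable, List

TOKEN_BUDGET = 1500   # rough budget for the history portion of the prompt (assumed)
KEEP_RECENT = 6       # always keep the last few turns verbatim (assumed)

def count_tokens(text: str) -> int:
    # Crude whitespace estimate; a real version would use the model's tokenizer.
    return len(text.split())

def summarize(turns: List[str], llm: Callable[[str], str]) -> str:
    # Feed the older turns back to the model as a separate summarization call.
    prompt = (
        "Summarize the following conversation in a few sentences, "
        "keeping names, facts, and decisions:\n\n"
        + "\n".join(turns)
        + "\n\nSummary:"
    )
    return llm(prompt).strip()

def compress_history(turns: List[str], llm: Callable[[str], str]) -> List[str]:
    """Replace older turns with a short summary once the budget is exceeded."""
    if count_tokens("\n".join(turns)) <= TOKEN_BUDGET or len(turns) <= KEEP_RECENT:
        return turns
    old, recent = turns[:-KEEP_RECENT], turns[-KEEP_RECENT:]
    summary = summarize(old, llm)
    return [f"[Summary of earlier conversation: {summary}]"] + recent
```

The compressed list would go back into the prompt in place of the raw turns; the same summary text could instead be written to the LTM store, as suggested above.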
