Replies: 2 comments 2 replies
-
To add a little more context to this idea for people who are too busy to read the paper: in their work, they implemented NPCs in an RPG game using an LLM. They use a form of memory stream to let the model keep the context of the conversation and remember events, locations, other characters, etc. beyond the current 4096-token context limit. Needless to say, this enables much more human-like interaction, because the LLM can keep track of things instead of starting to output nonsense after a while.
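To make the idea concrete, here is a minimal sketch of how such a memory stream could work. This is a simplified illustration, not the paper's actual implementation: the class name and scoring weights are made up, and relevance is approximated with keyword overlap, whereas the paper scores memories by a combination of recency, importance, and embedding-based relevance.

```python
class MemoryStream:
    """Toy memory stream: store observations with a timestamp and an
    importance score, then retrieve the top-k memories for a query by
    a combined recency + importance + relevance score."""

    def __init__(self, decay=0.995):
        self.decay = decay    # exponential recency decay per second elapsed
        self.memories = []    # list of (timestamp, importance, text)

    def add(self, text, importance=1.0, timestamp=0.0):
        self.memories.append((timestamp, importance, text))

    def _relevance(self, query, text):
        # Toy relevance: fraction of query words found in the memory.
        # The paper uses embedding similarity instead.
        q, t = set(query.lower().split()), set(text.lower().split())
        return len(q & t) / max(len(q), 1)

    def retrieve(self, query, k=3, now=0.0):
        scored = []
        for ts, imp, text in self.memories:
            recency = self.decay ** (now - ts)       # newer memories score higher
            score = recency + imp + self._relevance(query, text)
            scored.append((score, text))
        scored.sort(reverse=True)
        return [text for _, text in scored[:k]]


# Example: only the top-k retrieved memories would be injected into the
# prompt, so the context stays within the model's token limit.
stream = MemoryStream()
stream.add("met the blacksmith at the forge", importance=2.0, timestamp=0.0)
stream.add("saw a cat in the alley", importance=0.5, timestamp=0.0)
print(stream.retrieve("where is the blacksmith", k=1, now=0.0))
```

The point is that retrieval selects only the few memories most worth re-injecting into the prompt each turn, which is what lets the character "remember" far more than fits in the context window.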
-
@henk717 What do you think of this? Is this something that could be useful?
-
It would be very interesting if the memory stream functionality from this academic paper were implemented for chat / adventure mode.
https://arxiv.org/abs/2304.03442