I've been using my own models for a few years now (hosted on my A100 racks in a colo), and I created something I call a protected prompt: it stays persistent while the chat uses whatever tokens remain. I'm trying to recreate that with oobabooga; I think that's the idea behind a softprompt. Either way, if anyone can point me in the direction of how to do this, I'd be very grateful. For clarity: the protected prompt is always present at the beginning of the tokens submitted, and rolling tokens drop off as the full token budget fills up, roughly like the sketch below.
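To illustrate the budgeting logic I mean, here is a minimal sketch; `build_context` is a made-up name and the `tokenize` callable stands in for whatever tokenizer your model uses:

```python
def build_context(protected_prompt, history, max_tokens, tokenize):
    """Keep the protected prompt fixed at the start of the context;
    drop the oldest history entries until the rest fits the budget."""
    budget = max_tokens - len(tokenize(protected_prompt))
    kept, used = [], 0
    # Walk the history newest-first, keeping as much as fits.
    for message in reversed(history):
        n = len(tokenize(message))
        if used + n > budget:
            break  # everything older than this drops off
        kept.append(message)
        used += n
    return protected_prompt + "".join(reversed(kept))
```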
Replies: 1 comment
You can do it with an extension that defines an `input_modifier` function, like this:

```python
def input_modifier(string):
    """
    This function is applied to your text inputs before
    they are fed into the model.
    """
    return "This AI is a party assassin.\n\n" + string
```

https://github.com/oobabooga/text-generation-webui/wiki/Extensions#how-to-write-an-extension