@ai16z/eliza v0.1.5-alpha.3 / trimTokens
trimTokens(context, maxTokens, model): string
Truncates the context to at most `maxTokens` tokens, as counted by the given tokenizer model.
• context: string
The text to truncate
• maxTokens: number
Maximum number of tokens to keep
• model: TiktokenModel
The tokenizer model to use
Returns: string
The truncated text
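A minimal sketch of how such a truncation helper might work. This is an assumed implementation, not Eliza's actual source: a real version would obtain the tokenizer from the `TiktokenModel` name (e.g. via js-tiktoken), whereas here a simple whitespace tokenizer stands in so the example is self-contained. Which end of the text is kept is also an assumption; this sketch keeps the tail, which preserves the most recent conversation context.

```typescript
// A tokenizer abstraction standing in for a tiktoken encoding (assumption).
interface Tokenizer {
  encode(text: string): string[];
  decode(tokens: string[]): string;
}

// Whitespace tokenizer used only to keep this sketch runnable;
// real token counts come from the model's BPE encoding.
const whitespaceTokenizer: Tokenizer = {
  encode: (text) => text.split(/\s+/).filter(Boolean),
  decode: (tokens) => tokens.join(" "),
};

function trimTokens(
  context: string,
  maxTokens: number,
  tokenizer: Tokenizer
): string {
  const tokens = tokenizer.encode(context);
  if (tokens.length <= maxTokens) return context; // already within the limit
  // Keep the last maxTokens tokens (assumed direction) so the most
  // recent part of the context survives truncation.
  return tokenizer.decode(tokens.slice(-maxTokens));
}

console.log(trimTokens("one two three four five", 3, whitespaceTokenizer));
// → "three four five"
```

Note that decoding a token slice can differ slightly from slicing the raw string, since token boundaries do not always align with character boundaries; counting in tokens rather than characters is the point of the function.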