Commit
Respect sequence_len in config for `type: llama2_chat` (#926)
* Respect sequence_len in config for `type: llama2_chat`. It was hardcoded to `4096`; I am not sure why. This updates it to pull from the config instead. cc: @winglian
* Update llama2_chat.py
* apply black formatting
* fix tokenizer
* update test data
* lint fixtures
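
The change is small but easy to picture: a tokenizing strategy that used to hardcode a 4096-token context now reads the value from the loaded config. Below is a minimal sketch of that pattern; the class and function names are illustrative placeholders, not the exact llama2_chat.py API.

```python
# Sketch only: read sequence_len from the config instead of hardcoding 4096.
# Names here (TokenizingStrategy, load) are hypothetical, not the real axolotl API.
from dataclasses import dataclass


@dataclass
class TokenizingStrategy:
    tokenizer: object
    sequence_len: int  # previously a hardcoded 4096

    def truncate(self, input_ids: list) -> list:
        # Truncate tokenized samples to the configured context length.
        return input_ids[: self.sequence_len]


def load(tokenizer, cfg) -> TokenizingStrategy:
    # Pull sequence_len from the config, falling back to 4096 if it is unset.
    return TokenizingStrategy(
        tokenizer=tokenizer,
        sequence_len=getattr(cfg, "sequence_len", 4096),
    )
```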