Add support for Gemma chat template (#1530)
* Add support for Gemma chat template
* Update fschat version to include its newest support for Gemma chat style
* Pin fastchat to current HEAD

Co-authored-by: Wing Lian <[email protected]>
1 parent 7477a53 · commit 60f5ce0
Showing 2 changed files with 9 additions and 1 deletion.
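For context, the Gemma chat style this commit wires up wraps each message in `<start_of_turn>`/`<end_of_turn>` markers with `user` and `model` roles. The sketch below only illustrates that published format; the helper name `render_gemma_turns` is made up for illustration and is not part of axolotl's or fastchat's API.

```python
def render_gemma_turns(messages):
    """Render a conversation in the Gemma chat format.

    messages: list of (role, content) tuples with roles "user" / "model".
    """
    prompt = ""
    for role, content in messages:
        # Each turn is wrapped in <start_of_turn>role ... <end_of_turn>.
        prompt += f"<start_of_turn>{role}\n{content}<end_of_turn>\n"
    # Leave an open model turn so generation continues from here.
    prompt += "<start_of_turn>model\n"
    return prompt


if __name__ == "__main__":
    print(render_gemma_turns([
        ("user", "Hello!"),
        ("model", "Hi there."),
        ("user", "Tell me a joke."),
    ]))
```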
Comment on 60f5ce0:
This is broken somehow; when training llama-3 I get:
Comment on 60f5ce0:
I just edited fastchat_conversation_turns.py and removed that if statement.
Comment on 60f5ce0:
Got the same error when training llama-3; the line in question is here:
https://github.com/OpenAccess-AI-Collective/axolotl/blob/68601ec6ad1cc0e8cb855376586e6eef6a8aa270/src/axolotl/monkeypatch/fastchat_conversation_turns.py#L126
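To make the workaround concrete, the branch being removed is a Gemma-specific guard in the monkeypatched `get_turns`. Below is a hypothetical sketch of that kind of guard, not the actual axolotl source: `SeparatorStyle.GEMMA` only exists in the newer fschat releases this commit pins to, and the yield structure here is an assumption for illustration.

```python
# Hypothetical sketch of a style-specific branch like the one discussed above.
# The exact shape of axolotl's get_turns is not reproduced here.
from fastchat.conversation import SeparatorStyle


def get_turns(conv):
    if conv.sep_style == SeparatorStyle.GEMMA:
        # Gemma-specific formatting: emit <start_of_turn>/<end_of_turn> pairs
        # and skip the generic handling entirely.
        for role, message in conv.messages:
            if message:
                yield f"<start_of_turn>{role}\n", f"{message}<end_of_turn>\n"
            else:
                yield f"<start_of_turn>{role}\n", ""
        return
    # ...generic handling for the other chat styles continues below...
```

Deleting (or tightening) a guard like this lets non-Gemma conversations, such as llama-3 ones, fall through to the generic handling instead of hitting the Gemma branch.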