
tokenizer is never built when converting finetuning dataset (mosaicml…
eldarkurtic authored Aug 29, 2024
1 parent 8516181 commit 100f40f
Showing 1 changed file with 0 additions and 1 deletion.
```diff
@@ -165,7 +165,6 @@ def convert_finetuning_dataset(
         decoder_only_format=not encoder_decoder,
     )

-    tokenizer = None
     tokenizer_kwargs = tokenizer_kwargs
     tokenizer_kwargs.update({'model_max_length': max_seq_len})
     if tokenizer:
```
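The deleted line caused the bug in the commit title: the tokenizer variable was unconditionally reset to `None` just before the `if tokenizer:` check, so the branch that builds and uses the tokenizer could never run. A minimal sketch of the pattern (hypothetical names, standing in for the real tokenizer construction in `convert_finetuning_dataset`):

```python
def prepare_tokenizer_buggy(tokenizer_name):
    # Stand-in for real tokenizer construction earlier in the function.
    tokenizer = f"tokenizer:{tokenizer_name}" if tokenizer_name else None
    tokenizer = None  # the line the commit deletes: it discards the built tokenizer
    if tokenizer:  # always falsy here, so the tokenizer is "never built"
        return tokenizer
    return None


def prepare_tokenizer_fixed(tokenizer_name):
    tokenizer = f"tokenizer:{tokenizer_name}" if tokenizer_name else None
    if tokenizer:  # now reachable whenever a tokenizer name was given
        return tokenizer
    return None
```

With the stray assignment removed, passing a tokenizer name actually yields a tokenizer instead of silently falling through to the no-tokenizer path.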
