
Commit

fix max_seq_len
DesmonDay committed Dec 29, 2023
1 parent b055be6 commit 7eda0a4
Showing 2 changed files with 2 additions and 2 deletions.
llm/llama/pretrain-linly_llama2_7b-tp2sd4_stage2.json (1 addition, 1 deletion)
@@ -14,7 +14,7 @@
     "use_flash_attention": true,
     "use_fused_rms_norm": true,
     "use_fused_rope": true,
-    "max_seq_length": 4096,
+    "max_seq_length": 2048,
     "learning_rate": 3e-05,
     "min_learning_rate": 3e-06,
     "warmup_steps": 30,
Expand Down
llm/llama/pretrain-llama_7b-tp2sd4_stage2.json (1 addition, 1 deletion)
@@ -14,7 +14,7 @@
     "use_flash_attention": true,
     "use_fused_rms_norm": true,
     "use_fused_rope": true,
-    "max_seq_length": 4096,
+    "max_seq_length": 2048,
     "learning_rate": 3e-05,
     "min_learning_rate": 3e-06,
     "warmup_steps": 30,
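The change itself is a one-line config fix in each file: max_seq_length drops from 4096 to 2048, presumably so the pretraining sequence length matches the context window these 7B checkpoints support. A minimal sketch of a pre-launch guard that would catch such a mismatch (the config path and the 2048 limit reflect this commit; the guard itself is illustrative and not part of the repository):

import json

# Illustrative check (assumption: not part of the repo): load one of the
# pretraining configs touched by this commit and verify max_seq_length.
CONFIG_PATH = "llm/llama/pretrain-llama_7b-tp2sd4_stage2.json"
MODEL_CONTEXT_WINDOW = 2048  # context length assumed for this checkpoint

with open(CONFIG_PATH) as f:
    config = json.load(f)

seq_len = config["max_seq_length"]
if seq_len > MODEL_CONTEXT_WINDOW:
    raise ValueError(
        f"max_seq_length={seq_len} exceeds the assumed context window "
        f"({MODEL_CONTEXT_WINDOW}); this is the mismatch the commit fixes."
    )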
