Commit
* MLP: Memory saving
* Remove RMSNorm restrictions
* Map packed weights to original
* FusedAttention module
* Simplify code
* Move fused modules
* Fix critical typo
* Split inplace
* Add FFT config
* Add validation of fused arguments
* Add fused arguments to config
* Update docs
* Fix validation logic
* Add fused modules to flash attn
* Only fuse during training
* Remove timing
* Formatting
* Formatting
* Formatting
* chore: lint
* chore: lint
* add e2e tests for fused llama
* no lora for tests

Co-authored-by: Wing Lian <[email protected]>
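The fused-module options this commit adds (flash_attn_fuse_qkv, flash_attn_fuse_mlp) pack a layer's separate projection weights into a single linear layer so training runs fewer, larger matmuls, and the packed weights can later be mapped back to the original layout for saving. As a rough sketch of the QKV half of that idea, not the actual axolotl implementation, with class and attribute names invented for illustration:

import torch
import torch.nn as nn


class FusedQKVSketch(nn.Module):
    """Pack separate q/k/v projections into one Linear (illustrative only)."""

    def __init__(self, q_proj: nn.Linear, k_proj: nn.Linear, v_proj: nn.Linear):
        super().__init__()
        # Remember per-projection output sizes so the fused output can be split
        # back, and the packed weight mapped back to q/k/v, later on.
        self.split_sizes = [q_proj.out_features, k_proj.out_features, v_proj.out_features]
        self.qkv_proj = nn.Linear(q_proj.in_features, sum(self.split_sizes), bias=False)
        with torch.no_grad():
            # One packed weight holding [Wq; Wk; Wv] stacked along the output dim.
            self.qkv_proj.weight.copy_(
                torch.cat([q_proj.weight, k_proj.weight, v_proj.weight], dim=0)
            )

    def forward(self, hidden_states: torch.Tensor):
        # Single matmul, then split the result back into q, k, v.
        return self.qkv_proj(hidden_states).split(self.split_sizes, dim=-1)

Because the fusion is applied only during training ("Only fuse during training" above), keeping the split sizes around is enough to unpack the weight into separate q/k/v matrices again before checkpointing.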
1 parent a21935f · commit 15d3a65
Showing 10 changed files with 364 additions and 12 deletions.
@@ -0,0 +1,73 @@
base_model: NousResearch/Llama-2-7b-hf
base_model_config: NousResearch/Llama-2-7b-hf
model_type: LlamaForCausalLM
tokenizer_type: LlamaTokenizer
is_llama_derived_model: true

load_in_8bit: false
load_in_4bit: false
strict: false

datasets:
  - path: mhenrichsen/alpaca_2k_test
    type: alpaca
dataset_prepared_path: last_run_prepared
val_set_size: 0.01
output_dir: ./out

sequence_len: 4096
sample_packing: true
pad_to_sequence_len: true

adapter:
lora_model_dir:
lora_r:
lora_alpha:
lora_dropout:
lora_target_linear:
lora_fan_in_fan_out:

wandb_project:
wandb_entity:
wandb_watch:
wandb_run_id:
wandb_log_model:

gradient_accumulation_steps: 1
micro_batch_size: 1
num_epochs: 1
optimizer: adamw_bnb_8bit
lr_scheduler: cosine
learning_rate: 0.0002

train_on_inputs: false
group_by_length: false
bf16: true
fp16: false
tf32: false

gradient_checkpointing: true
early_stopping_patience:
resume_from_checkpoint:
local_rank:
logging_steps: 1
xformers_attention:
flash_attention: true
flash_attn_cross_entropy: false
flash_attn_rms_norm: true
flash_attn_fuse_qkv: false
flash_attn_fuse_mlp: true

warmup_steps: 100
eval_steps: 0.05
eval_table_size:
save_steps:
debug:
deepspeed: #deepspeed/zero2.json # multi-gpu only
weight_decay: 0.1
fsdp:
fsdp_config:
special_tokens:
  bos_token: "<s>"
  eos_token: "</s>"
  unk_token: "<unk>"
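This config enables flash_attn_fuse_mlp while leaving flash_attn_fuse_qkv off. For the MLP side, the same packing idea applies to LLaMA's SwiGLU block: gate_proj and up_proj can share one weight matrix so the block needs two matmuls instead of three. A minimal sketch of that concept, again with invented names and not the code from this commit:

import torch
import torch.nn as nn


class FusedSwiGLUSketch(nn.Module):
    """Pack LLaMA's gate_proj and up_proj into one Linear (illustrative only)."""

    def __init__(self, gate_proj: nn.Linear, up_proj: nn.Linear, down_proj: nn.Linear):
        super().__init__()
        self.intermediate_size = gate_proj.out_features
        self.gate_up_proj = nn.Linear(
            gate_proj.in_features, 2 * self.intermediate_size, bias=False
        )
        with torch.no_grad():
            # Packed weight holding [W_gate; W_up] stacked along the output dim.
            self.gate_up_proj.weight.copy_(
                torch.cat([gate_proj.weight, up_proj.weight], dim=0)
            )
        self.down_proj = down_proj
        self.act = nn.SiLU()

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # One matmul produces both halves of the SwiGLU input.
        gate, up = self.gate_up_proj(hidden_states).split(self.intermediate_size, dim=-1)
        return self.down_proj(self.act(gate) * up)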