Change LCM-LoRA README Script Example Learning Rates to 1e-4 (#6304)
Change README LCM-LoRA example learning rates to 1e-4.
dg845 authored Dec 25, 2023
1 parent 84c403a commit a3d31e3
Showing 2 changed files with 2 additions and 2 deletions.
examples/consistency_distillation/README.md (1 addition, 1 deletion)

@@ -94,7 +94,7 @@ accelerate launch train_lcm_distill_lora_sd_wds.py \
--mixed_precision=fp16 \
--resolution=512 \
--lora_rank=64 \
- --learning_rate=1e-6 --loss_type="huber" --adam_weight_decay=0.0 \
+ --learning_rate=1e-4 --loss_type="huber" --adam_weight_decay=0.0 \
--max_train_steps=1000 \
--max_train_samples=4000000 \
--dataloader_num_workers=8 \
examples/consistency_distillation/README_sdxl.md (1 addition, 1 deletion)

@@ -96,7 +96,7 @@ accelerate launch train_lcm_distill_lora_sdxl_wds.py \
--mixed_precision=fp16 \
--resolution=1024 \
--lora_rank=64 \
- --learning_rate=1e-6 --loss_type="huber" --use_fix_crop_and_size --adam_weight_decay=0.0 \
+ --learning_rate=1e-4 --loss_type="huber" --use_fix_crop_and_size --adam_weight_decay=0.0 \
--max_train_steps=1000 \
--max_train_samples=4000000 \
--dataloader_num_workers=8 \
