
Commit 796a085
make sure to save the lora adapter at the end of RL/dpo training (#1573)
winglian authored May 8, 2024
1 parent cb78a36 commit 796a085
Showing 1 changed file with 4 additions and 0 deletions.
4 changes: 4 additions & 0 deletions src/axolotl/train.py
@@ -212,6 +212,10 @@ def terminate_handler(_, __, model_weakref):
     if cfg.flash_optimum and BetterTransformer:
         model = BetterTransformer.reverse(model)
 
+    if cfg.rl and cfg.adapter and not cfg.rl_adapter_ref_model:
+        trainer.model.save_pretrained(
+            cfg.output_dir, safe_serialization=safe_serialization
+        )
     model.save_pretrained(cfg.output_dir, safe_serialization=safe_serialization)
 
     if not cfg.hub_model_id:
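
For context, a minimal sketch of what saving a PEFT-wrapped model does and why calling save_pretrained on trainer.model matters here: save_pretrained on a PeftModel writes only the LoRA adapter files (adapter_config.json plus the adapter weights) to the output directory. The sketch below is not part of the commit; the model id and output path are placeholders.

# Illustrative sketch only; not part of the axolotl commit above.
# The tiny model id and output directory are placeholders.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("sshleifer/tiny-gpt2")
peft_model = get_peft_model(
    base,
    LoraConfig(r=8, lora_alpha=16, task_type="CAUSAL_LM"),
)

# Writes only the adapter (adapter_config.json + adapter weights),
# analogous to the trainer.model.save_pretrained(...) line added above.
peft_model.save_pretrained("output/lora-adapter", safe_serialization=True)

The added branch calls save_pretrained on trainer.model (the PEFT-wrapped model) in addition to the existing model.save_pretrained(...) call, which is what the commit message describes as making sure the LoRA adapter is saved at the end of RL/DPO training.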
