fix: make sure to pass kwargs to enable checkpoint
NanoCode012 authored Mar 16, 2024
1 parent 34211ff commit a914cb3
Showing 1 changed file with 1 addition and 1 deletion.
src/axolotl/utils/models.py: 1 addition, 1 deletion
@@ -888,7 +888,7 @@ def load_and_quantize_parallel(name_param, model, **kwargs):

     if cfg.adapter in ["lora", "qlora"]:
         if cfg.gradient_checkpointing:
-            model.gradient_checkpointing_enable()
+            model.gradient_checkpointing_enable(gradient_checkpointing_kwargs=cfg.gradient_checkpointing_kwargs)
         if (
             cfg.load_in_8bit or cfg.load_in_4bit
         ) and not skip_prepare_model_for_kbit_training:
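
For context, a minimal sketch of what the forwarded kwargs do, assuming a Hugging Face transformers model (gradient_checkpointing_enable has accepted a gradient_checkpointing_kwargs dict since transformers 4.35). The config values shown are hypothetical examples, not taken from this commit:

# Sketch: how config-supplied kwargs reach torch's checkpoint machinery.
# Assumes transformers >= 4.35; "use_reentrant" is forwarded to
# torch.utils.checkpoint.checkpoint by the Transformers wrapper.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("gpt2")  # placeholder model

# Hypothetical axolotl YAML this mirrors:
#   gradient_checkpointing: true
#   gradient_checkpointing_kwargs:
#     use_reentrant: false
checkpoint_kwargs = {"use_reentrant": False}

# Before this commit the configured kwargs were silently dropped;
# after it, they are passed through to the model.
model.gradient_checkpointing_enable(gradient_checkpointing_kwargs=checkpoint_kwargs)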
