fix(model): apply gate fp32 only for mixtral (#1241)
* fix(model): apply gate fp32 only for mixtral

* Update src/axolotl/utils/models.py

* fix gate layer check

---------

Co-authored-by: Wing Lian <wing.lian@gmail.com>
NanoCode012 and winglian authored Feb 1, 2024
1 parent: dfd1885 · commit: 2d65f47
Showing 1 changed file with 1 addition and 1 deletion: src/axolotl/utils/models.py

```diff
@@ -676,7 +676,7 @@ def load_model(
     if not cfg.fsdp:
         # FSDP doesn't like mixed Float and BFloat16
         for name, module in model.named_modules():
-            if any(m in name for m in ["norm", "gate"]):
+            if "norm" in name or name.endswith(".gate"):
                 module.to(torch.float32)
     if model_config.model_type == "btlm":
         # don't upcast lm_head for btlm
```
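For context, a minimal sketch (not part of the commit) of why the old substring test over-matched: Llama-style MLPs name their projection modules `gate_proj`, so `"gate" in name` upcast those weights too, while Mixtral's MoE router is a module literally named `gate`. The module names below are illustrative, in the style of Hugging Face Llama/Mixtral checkpoints.

```python
# Sketch comparing the pre-fix substring check with the post-fix suffix
# check from this commit. Module names are assumed, for illustration only.

def old_check(name: str) -> bool:
    # Pre-fix: matches any module whose name merely contains "gate",
    # including Llama-style "gate_proj" MLP projections.
    return any(m in name for m in ["norm", "gate"])

def new_check(name: str) -> bool:
    # Post-fix: only norm layers and modules literally named "gate"
    # (e.g. Mixtral's MoE router) get upcast to float32.
    return "norm" in name or name.endswith(".gate")

names = [
    "model.layers.0.input_layernorm",        # old: True,  new: True
    "model.layers.0.mlp.gate_proj",          # old: True,  new: False
    "model.layers.0.block_sparse_moe.gate",  # old: True,  new: True
]
for n in names:
    print(f"{n}: old={old_check(n)}, new={new_check(n)}")
```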
