Commit

better init for parallelism_config
eitanturok committed Sep 25, 2024
1 parent 19f5477 commit 2b9664e
Showing 1 changed file with 1 addition and 3 deletions.
llmfoundry/command_utils/train.py
@@ -530,9 +530,7 @@ def train(cfg: DictConfig) -> Trainer:
            tp_config['layer_plan'] |= strategy_layer_plan

    # Parallelism config
-    tp = TPConfig(**tp_config)
-    fsdp = FSDPConfig(**fsdp_config)
-    parallelism_config = ParallelismConfig(fsdp=fsdp, tp=tp)
+    parallelism_config = dict(fsdp=fsdp_config, tp=tp_config)

    # Optimizer
    optimizer_name: str = train_cfg.optimizer.pop('name')
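The change above can be sketched in isolation: instead of eagerly instantiating the typed config objects (`TPConfig`, `FSDPConfig`, `ParallelismConfig`) in `train()`, the raw dicts are passed through and the downstream Trainer is left to build the typed configs itself. A minimal sketch follows; the config values are illustrative placeholders, not taken from the repository.

```python
# Assumed inputs: plain dicts parsed from the training YAML.
# The keys shown here are illustrative, not from the source.
fsdp_config = {'sharding_strategy': 'FULL_SHARD'}
tp_config = {'tensor_parallel_degree': 2, 'layer_plan': {}}

# Before this commit (sketch): typed config objects were built here,
# which raises immediately if either dict carries an unexpected key.
#   tp = TPConfig(**tp_config)
#   fsdp = FSDPConfig(**fsdp_config)
#   parallelism_config = ParallelismConfig(fsdp=fsdp, tp=tp)

# After this commit: forward the raw dicts and defer validation and
# construction of the typed configs to the Trainer.
parallelism_config = dict(fsdp=fsdp_config, tp=tp_config)
```

Deferring construction keeps `train()` agnostic to the exact fields the config dataclasses accept, so schema changes in the parallelism configs do not require touching this function.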
