Commit

formatting
YangFei1990 committed Nov 29, 2023
1 parent f0dd2a1 commit cc417a8
Showing 1 changed file with 3 additions and 1 deletion.
torch_xla/distributed/zero_redundancy_optimizer.py (4 changes: 3 additions & 1 deletion)
@@ -91,7 +91,9 @@ def init_zero(self):
         group = list(group)
         self.local_rank = group.index(self.global_rank)
     if self.local_rank is None:
-      raise ValueError(f"Current rank {self.global_rank} is missing from the sharding_groups {self.sharding_groups}")
+      raise ValueError(
+          f"Current rank {self.global_rank} is missing from the sharding_groups {self.sharding_groups}"
+      )
     # Shard parameters for use in optimizer
     sharded_param_groups = self._shard_parameters()
     # Optimizer initialization
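For context on the logic this hunk touches: each entry in sharding_groups is a group of global ranks, and the local rank is the position of this process's global rank inside the group that contains it; if no group contains it, the (now multiline) ValueError is raised. Below is a minimal standalone sketch of that lookup. The helper name find_local_rank and the surrounding loop are assumptions for illustration, not the module's actual implementation.

    from typing import Sequence

    def find_local_rank(global_rank: int,
                        sharding_groups: Sequence[Sequence[int]]) -> int:
      """Hypothetical helper mirroring the lookup around the changed lines."""
      local_rank = None
      for group in sharding_groups:
        if global_rank in group:
          # list.index returns the position of global_rank within its group.
          group = list(group)
          local_rank = group.index(global_rank)
      if local_rank is None:
        raise ValueError(
            f"Current rank {global_rank} is missing from the sharding_groups {sharding_groups}"
        )
      return local_rank

    # Usage: rank 5 sits at position 1 of its sharding group.
    assert find_local_rank(5, [[0, 1, 2, 3], [4, 5, 6, 7]]) == 1

Note that the commit itself only reflows the raise statement across multiple lines to keep the line length in check; the behavior is unchanged.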
