Commit
Allow FSDP to be used with torch.autocast for bfloat16 mixed precision (#2033)

* Ignore native_amp when FSDP is used
* Rollback condition
* Fix mixed precision of bfloat16 for FSDP
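The change lets FSDP-wrapped models run bfloat16 mixed precision through torch.autocast instead of the native AMP (GradScaler) path, which is skipped when FSDP is active. A minimal sketch of such a training step, assuming a CUDA setup and placeholder names for the model, optimizer, batch, and loss function:

```python
# Sketch (assumed names): forward/backward under torch.autocast with bfloat16
# for an FSDP-wrapped model. No GradScaler is involved, since bfloat16 keeps
# float32's exponent range and does not need loss scaling.
import torch
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

def training_step(model: FSDP, optimizer, batch, labels, loss_fn):
    optimizer.zero_grad()
    # Autocast casts the compute to bf16; FSDP keeps managing parameter sharding.
    with torch.autocast(device_type="cuda", dtype=torch.bfloat16):
        outputs = model(batch)
        loss = loss_fn(outputs, labels)
    loss.backward()   # no scaler.scale(loss) / scaler.step() needed for bf16
    optimizer.step()
    return loss.detach()
```

Skipping the gradient scaler is safe here because bfloat16 does not underflow the way float16 does, which is presumably what the "Ignore native_amp when FSDP is used" item refers to.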