
Precompute flash attention padding info #880

Merged
ShashankMosaicML merged 45 commits into mosaicml:main from ShashankMosaicML:shashank/update_flash_attn_cache_unpad_input on Jan 18, 2024
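The PR title and branch name (`update_flash_attn_cache_unpad_input`) suggest computing the unpadding metadata that flash-attn's variable-length kernels need once per batch, and reusing it across layers, instead of re-deriving it inside every attention layer. Below is a minimal sketch of that idea; the function name `precompute_padding_info` and its exact return values are assumptions for illustration, not the PR's actual code.

```python
import torch
import torch.nn.functional as F

def precompute_padding_info(attention_mask: torch.Tensor):
    """Derive unpadding metadata once per batch from a (batch, seq_len) 0/1 mask.

    Returns the flat indices of non-pad tokens, the cumulative sequence
    lengths (cu_seqlens), and the longest sequence length -- the metadata
    flash-attn's varlen kernels consume. Computing this once and threading it
    through every transformer block avoids redundant per-layer work.
    """
    seqlens = attention_mask.sum(dim=-1, dtype=torch.int32)  # tokens per sequence
    indices = torch.nonzero(attention_mask.flatten(), as_tuple=False).flatten()
    max_seqlen = int(seqlens.max())
    # cu_seqlens = [0, len_0, len_0 + len_1, ...], the prefix-sum layout
    # variable-length attention expects.
    cu_seqlens = F.pad(torch.cumsum(seqlens, dim=0, dtype=torch.int32), (1, 0))
    return indices, cu_seqlens, max_seqlen

# Example: a batch of 2 sequences with lengths 3 and 2, padded to length 4.
mask = torch.tensor([[1, 1, 1, 0],
                     [1, 1, 0, 0]])
indices, cu_seqlens, max_seqlen = precompute_padding_info(mask)
# indices -> [0, 1, 2, 4, 5]; cu_seqlens -> [0, 3, 5]; max_seqlen -> 3
```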

Commits

45 commits, spanning Oct 9, 2023 through Jan 18, 2024.