
Add support for Flex Attention #9422

Triggered via pull request (synchronize on #1675) by @ShashankMosaicML, December 5, 2024 09:59
Status: Success
Total duration: 16m 4s
Billable time: 27m

Workflow: pr-gpu.yaml
on: pull_request_target
Matrix: pytest-gpu-1
Matrix: pytest-gpu-2
Matrix: pytest-gpu-4
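
Only the workflow file name, the trigger, and the matrix job names are visible on this run page; the actual pr-gpu.yaml lives in the repository. Below is a minimal sketch of how a workflow with this trigger and these matrix jobs could be laid out; the runner label, GPU counts, and pytest invocation are assumptions, not details taken from the run.

# Hypothetical sketch of a pr-gpu.yaml-style workflow. The trigger and the
# idea of per-GPU-count matrix jobs match the run page above; everything
# else (runner label, env var name, test command) is assumed.
name: pr-gpu

on: pull_request_target

jobs:
  pytest-gpu:
    strategy:
      # One matrix entry per GPU count, mirroring pytest-gpu-1/2/4.
      matrix:
        gpus: [1, 2, 4]
    runs-on: gpu-runner  # assumed self-hosted GPU runner label
    env:
      NUM_GPUS: ${{ matrix.gpus }}  # assumed way of passing the GPU count to tests
    steps:
      - uses: actions/checkout@v4
      - name: Run GPU tests
        run: pytest -m gpu tests/  # assumed marker and test path
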