
Add support for Flex Attention #9398

Triggered via pull request #1675 (synchronize) by @ShashankMosaicML on December 4, 2024, 07:09
Status: Success
Total duration: 17m 3s
Billable time: 25m
Artifacts: none listed

Workflow: pr-gpu.yaml
on: pull_request_target
Matrix: pytest-gpu-1
Matrix: pytest-gpu-2
Matrix: pytest-gpu-4
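
The matrix entries above come from the pr-gpu.yaml workflow. As a rough illustration only, a GPU test matrix like this is typically declared along the following lines; the job name, runner label, and gpu_num values below are assumptions for the sketch, not the contents of the actual workflow file:

    name: PR GPU tests
    on:
      pull_request_target:
        branches: [main]
    jobs:
      pytest-gpu-1:
        runs-on: ubuntu-latest  # placeholder; real GPU jobs dispatch to GPU runners
        strategy:
          matrix:
            include:
              - name: gpu-latest-1
                gpu_num: 1
        steps:
          - uses: actions/checkout@v4
          - name: Run GPU tests
            run: echo "would run pytest -m gpu with ${{ matrix.gpu_num }} GPU(s)"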