
Add support for Flex Attention #9399

Triggered via pull request (synchronize) on December 4, 2024 07:48 by @ShashankMosaicML for #1675
Status: Success
Total duration: 16m 2s
Billable time: 25m

pr-gpu.yaml

on: pull_request_target
Matrix: pytest-gpu-1
Matrix: pytest-gpu-2
Matrix: pytest-gpu-4
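
The run fans out from pr-gpu.yaml into three matrix jobs (pytest-gpu-1, pytest-gpu-2, pytest-gpu-4) on the pull_request_target event. As a rough sketch only, not the contents of the actual pr-gpu.yaml, a workflow of this shape could look like the following; the workflow name, runner labels, branch filter, concurrency group, and pytest invocation are illustrative assumptions.

```yaml
# Minimal sketch of a pull_request_target-triggered GPU test workflow.
# Job and matrix names mirror the run graph above; everything else
# (runner labels, markers, checkout ref handling) is assumed.
name: PR GPU tests

on:
  pull_request_target:
    branches:
      - main

concurrency:
  # Cancel superseded runs for the same PR (assumed policy).
  group: ${{ github.workflow }}-${{ github.event.pull_request.number }}
  cancel-in-progress: true

jobs:
  pytest-gpu-1:
    runs-on: linux-gpu-1  # assumed self-hosted single-GPU runner label
    strategy:
      matrix:
        # Illustrative axis; the real workflow likely varies container
        # images or test subsets here.
        marker: [gpu]
    steps:
      - uses: actions/checkout@v4
        with:
          # pull_request_target checks out the base branch by default,
          # so the PR head commit is checked out explicitly to test it.
          ref: ${{ github.event.pull_request.head.sha }}
      - name: Run single-GPU tests
        run: pytest -m '${{ matrix.marker }}' tests/

  # pytest-gpu-2 and pytest-gpu-4 would be analogous jobs targeting
  # 2-GPU and 4-GPU runners for the distributed test subsets.
```

pull_request_target is typically used for workflows like this so that runs triggered by fork PRs can still access repository secrets and self-hosted GPU runners, which is also why checking out the PR head commit explicitly matters.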