
Add support for Flex Attention #9418

Triggered via pull request December 5, 2024 06:59
@ShashankMosaicML synchronize #1675
Status: Failure
Total duration: 15m 36s
Billable time: 26m

pr-gpu.yaml

on: pull_request_target
Matrix: pytest-gpu-1
Matrix: pytest-gpu-2
Matrix: pytest-gpu-4
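
The pr-gpu.yaml workflow above fans out into matrix jobs (pytest-gpu-1, pytest-gpu-2, pytest-gpu-4) on a pull_request_target trigger. As a rough illustration only, a minimal workflow of that shape might look like the sketch below; the runner label, container image, and pytest invocation are assumptions for illustration, not the repository's actual configuration.

    name: pr-gpu
    on:
      pull_request_target:
        branches: [main]
    jobs:
      pytest-gpu-2:
        runs-on: linux-gpu            # assumed self-hosted GPU runner label
        strategy:
          matrix:
            include:
              - name: gpu-2.5.1-1     # matrix entry matching the failing job below
                container: mosaicml/pytorch:latest   # assumed image; real tag unknown
        container: ${{ matrix.container }}
        steps:
          - uses: actions/checkout@v4
          - name: Run GPU tests
            run: pytest -m gpu tests/   # assumed invocation; the real workflow may differ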

Annotations

1 error
gpu-2.5.1-1: Process completed with exit code 1.