Adding support for alibi when using flash attention #3784

Triggered via pull request December 23, 2023 22:26
@ShashankMosaicML synchronize #820
Status Success
Total duration 13m 44s
pr-gpu.yaml

on: pull_request_target
Matrix: pytest-gpu