Adding support for alibi when using flash attention #3785


Triggered via pull request December 24, 2023 04:11
@ShashankMosaicML
synchronize #820
Status Success
Total duration 12m 1s
pr-gpu.yaml

on: pull_request_target
Matrix: pytest-gpu