Adding support for alibi when using flash attention #3786

Triggered via pull request December 25, 2023 07:20
@ShashankMosaicML
synchronize #820
Status: Success
Total duration: 12m 0s
pr-gpu.yaml

on: pull_request_target
Matrix: pytest-gpu