
Adding support for alibi when using flash attention #3595

Triggered via pull request on December 25, 2023 at 07:44
Status: Success
Total duration: 11m 17s
Artifacts: 1

Workflow: pr-cpu.yaml
on: pull_request
Matrix: pytest-cpu
Coverage Results / coverage (9s)

Artifacts (produced during runtime)
Name: coverage-a10823d6618fd9b936c69cedb6564bf1da975db8-cpu-2.1.0
Size: 300 KB
Status: Expired