Allow flash attention 2 and upgrade to transformers 4.34.1 #2437

Triggered via pull request October 14, 2023 02:41
Status Cancelled
Total duration 4m 46s
Artifacts
This run and associated checks have been archived and are scheduled for deletion.

pr-cpu.yaml

on: pull_request
Matrix: pytest-cpu
Coverage Results / coverage

Annotations

5 errors
- cpu-2.0.1 / pytest-cpu: Process completed with exit code 2.
- cpu-latest / pytest-cpu: FailFast: cancelling since parallel instance has failed
- cpu-latest / pytest-cpu: The operation was canceled.
- cpu-2.1.0 / pytest-cpu: FailFast: cancelling since parallel instance has failed
- cpu-2.1.0 / pytest-cpu: Process completed with exit code 2.
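The FailFast cancellations above are the default matrix behavior in GitHub Actions: when one matrix entry fails, the runner cancels the sibling entries. A minimal sketch of what a matrix strategy like this run's `pytest-cpu` job might look like (job names, versions, and steps are assumptions for illustration, not the actual contents of pr-cpu.yaml):

```yaml
# Hypothetical sketch of a fail-fast matrix job; not the real pr-cpu.yaml.
name: pr-cpu
on: pull_request
jobs:
  pytest-cpu:
    runs-on: ubuntu-latest
    strategy:
      fail-fast: true  # default; one failing entry cancels the others
      matrix:
        pytorch-version: ["2.0.1", "2.1.0", "latest"]
    steps:
      - uses: actions/checkout@v4
      - run: pytest tests/  # a nonzero exit here fails this entry and cancels siblings
```

Setting `fail-fast: false` would instead let every matrix entry run to completion, which makes it easier to see whether a failure is specific to one PyTorch version or common to all of them.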