
Allow flash attention 2 and upgrade to transformers 4.34.1 #2442

Triggered via pull request October 14, 2023 21:14
Status: Cancelled
Total duration: 45s
Artifacts
This run and associated checks have been archived and are scheduled for deletion.

pr-cpu.yaml

on: pull_request
Matrix: pytest-cpu
Coverage Results / coverage

Annotations

6 errors
cpu-latest / pytest-cpu: Canceling since a higher priority waiting request for 'PR CPU tests-672' exists
cpu-latest / pytest-cpu: The operation was canceled.
cpu-2.1.0 / pytest-cpu: Canceling since a higher priority waiting request for 'PR CPU tests-672' exists
cpu-2.1.0 / pytest-cpu: The operation was canceled.
cpu-2.0.1 / pytest-cpu: Canceling since a higher priority waiting request for 'PR CPU tests-672' exists
cpu-2.0.1 / pytest-cpu: The operation was canceled.
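The cancellation messages above are what GitHub Actions emits when a workflow's `concurrency` setting cancels an in-progress run because a newer run for the same group (here `PR CPU tests-672`) has been queued, typically after a new push to the PR. A minimal sketch of how such a block might look in a workflow like pr-cpu.yaml is below; the group expression, matrix entries, and steps are illustrative assumptions, not the repository's actual configuration:

```yaml
name: PR CPU tests

on: pull_request

# Group runs per workflow and ref; a newly queued run for the same
# group cancels the in-progress one, producing messages like
# "Canceling since a higher priority waiting request ... exists".
concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

jobs:
  pytest-cpu:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # Illustrative names matching the job labels in the annotations.
        name: [cpu-latest, cpu-2.1.0, cpu-2.0.1]
    steps:
      - uses: actions/checkout@v4
      - run: echo "run pytest for ${{ matrix.name }}"
```

With `cancel-in-progress: true`, all three matrix jobs of the superseded run are canceled together, which matches the six paired annotations shown for this run.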