
Allow flash attention 2 and upgrade to transformers 4.34.1 #2576

Triggered via pull request October 14, 2023 21:14
@dakinggg synchronize #672
Status: Cancelled
Total duration: 38s
Artifacts

pr-gpu.yaml

on: pull_request_target
Matrix: pytest-gpu
Waiting for pending jobs

Annotations

1 error
PR GPU tests: Canceling since a higher priority waiting request for 'PR GPU tests-672' exists
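This cancellation is GitHub Actions' standard behavior when a workflow declares a concurrency group with cancel-in-progress enabled and a newer run for the same group is waiting: pushing another commit to PR #672 superseded this run. Below is a minimal sketch, not the repository's actual pr-gpu.yaml, of a configuration that would produce the group name 'PR GPU tests-672'; the runner label, matrix axis, and test command are illustrative assumptions.

name: PR GPU tests

on: pull_request_target

concurrency:
  # One group per PR: the workflow name plus the PR number yields
  # 'PR GPU tests-672' for PR #672. When a newer run enters the same
  # group, the in-flight run is cancelled with "Canceling since a
  # higher priority waiting request ... exists".
  group: ${{ github.workflow }}-${{ github.event.pull_request.number }}
  cancel-in-progress: true

jobs:
  pytest-gpu:
    runs-on: ubuntu-latest  # assumption; GPU tests would target a GPU runner
    strategy:
      matrix:
        # hypothetical matrix axis for illustration
        pytorch-version: ['1.13.1', '2.0.1']
    steps:
      - uses: actions/checkout@v3
      - run: pytest tests/  # placeholder test command

Keying the group to the PR number means only runs for the same PR preempt each other; runs for different PRs queue and execute independently.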