Allow flash attention 2 and upgrade to transformers 4.34.1 #2711

Triggered via pull request October 22, 2023 18:58
@dakinggg
synchronize #672
Status: Cancelled
Total duration: 1m 22s

pr-gpu.yaml

on: pull_request_target
Matrix: pytest-gpu
Waiting for pending jobs

Annotations

1 error
PR GPU tests: Canceling since a higher priority waiting request for 'PR GPU tests-672' exists
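This is GitHub Actions' standard concurrency-cancellation message: a newer push to the same pull request queued a fresh run of 'PR GPU tests', and the workflow's concurrency group cancelled this one. Below is a minimal sketch of how pr-gpu.yaml could be configured to produce this behavior; only the workflow name, trigger, and job id are taken from this page, while the concurrency block, runner, matrix axis, and test command are assumptions, not the repository's actual file:

name: PR GPU tests
on: pull_request_target

# Assumed concurrency block: key the group on workflow name plus PR number,
# so the group "PR GPU tests-672" in the annotation corresponds to PR #672.
# With cancel-in-progress, a newer run for the same group cancels the older one.
concurrency:
  group: ${{ github.workflow }}-${{ github.event.pull_request.number }}
  cancel-in-progress: true

jobs:
  pytest-gpu:
    runs-on: ubuntu-latest        # placeholder; the real job targets GPU runners
    strategy:
      matrix:
        python-version: ["3.10"]  # hypothetical axis; the actual matrix is not shown here
    steps:
      - uses: actions/checkout@v4
      - run: pytest tests          # placeholder test command

Keying the group on the PR number scopes cancellation to a single pull request, so newer commits supersede older runs without affecting runs for other PRs.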