Allow flash attention 2 and upgrade to transformers 4.34.1 #2707

Triggered via pull request October 22, 2023 17:52
@dakinggg
synchronize #672
Status: Success
Total duration: 21m 46s
pr-gpu.yaml

on: pull_request_target
Matrix: pytest-gpu
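
For context, pr-gpu.yaml is a GitHub Actions workflow triggered on pull_request_target that fans out a pytest-gpu job matrix. Below is a minimal sketch of what such a workflow could look like; the matrix axes, runner label, and pytest invocation are illustrative assumptions, not the repository's actual configuration.

```yaml
# Hypothetical sketch of a pr-gpu.yaml workflow; the real file may differ.
name: pr-gpu

on:
  pull_request_target:

jobs:
  pytest-gpu:
    # Assumption: GPU test jobs typically run on a self-hosted or cloud GPU runner.
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # Hypothetical matrix entries; the actual axes are not shown on the run page.
        python-version: ["3.10"]
    steps:
      - uses: actions/checkout@v3
        with:
          # pull_request_target checks out the base branch by default,
          # so the PR head commit must be checked out explicitly.
          ref: ${{ github.event.pull_request.head.sha }}
      - name: Run GPU-marked tests
        run: pytest -m gpu
```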