Allow flash attention 2 and upgrade to transformers 4.34.1 #2559
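
For context on the change under test: in transformers 4.34.x, FlashAttention-2 is opted into at model load time via the `use_flash_attention_2` flag (later releases replace it with `attn_implementation="flash_attention_2"`). A minimal sketch, assuming the `flash-attn` package is installed, a supported GPU, and an FA2-capable architecture; the model name is illustrative only:

```python
# Minimal sketch: loading a model with FlashAttention-2 under transformers 4.34.x.
# Assumes flash-attn is installed and the architecture supports FA2.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-hf"  # illustrative; any FA2-supported model

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # FA2 requires fp16 or bf16 weights
    use_flash_attention_2=True,  # transformers 4.34 API for enabling FA2
)
```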

Triggered via pull request (synchronize of #672) by @dakinggg on October 13, 2023 03:52
Status: Success
Total duration: 10m 19s

pr-gpu.yaml

on: pull_request_target
Matrix: pytest-gpu