Allow flash attention 2 and upgrade to transformers 4.34.1 #2438

Triggered via: pull request, October 14, 2023 04:13
Status: Success
Total duration: 9m 33s
Artifacts
code-quality.yaml

on: pull_request
Matrix: code-quality