
Allow flash attention 2 and upgrade to transformers 4.34.1 #1630
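For context, the PR's headline change enables Flash Attention 2 through the API that transformers 4.34 introduced. A minimal sketch of what opting in looks like on the caller side, assuming the flash-attn package is installed and a supported CUDA GPU is available; the model name is purely illustrative, not taken from the PR:

```python
# Minimal sketch: opting into Flash Attention 2 with transformers 4.34.x.
# Assumes flash-attn is installed and a supported CUDA GPU; the model
# name is a hypothetical placeholder.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "meta-llama/Llama-2-7b-hf"  # illustrative example model

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,      # FA2 requires fp16 or bf16 weights
    use_flash_attention_2=True,     # flag introduced in transformers 4.34
)
```

Note that this flag was later superseded in newer transformers releases, but it is the mechanism available at the 4.34.1 version this PR pins.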

Triggered via pull request: October 13, 2023 03:52
Status: Success
Total duration: 3m 59s
This run and associated checks have been archived and are scheduled for deletion.

codeql-analysis.yml

on: pull_request
Matrix: Analyze

Annotations

1 warning
Analyze (python)
The following action uses node12, which is deprecated and will be forced to run on node16: actions/checkout@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
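The warning points at the checkout step in codeql-analysis.yml. A minimal sketch of the kind of version bump that resolves it, assuming a standard CodeQL workflow layout; the step names and runner are illustrative, while the trigger and matrix job name match this run:

```yaml
# Hypothetical excerpt of codeql-analysis.yml: bumping the deprecated
# actions/checkout@v2 (node12) to a current release clears the warning.
name: "CodeQL"

on: pull_request

jobs:
  analyze:
    name: Analyze
    runs-on: ubuntu-latest
    strategy:
      matrix:
        language: [python]
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4   # was actions/checkout@v2
      - name: Initialize CodeQL
        uses: github/codeql-action/init@v2
        with:
          languages: ${{ matrix.language }}
      - name: Perform CodeQL Analysis
        uses: github/codeql-action/analyze@v2
```

Since this is only a deprecation warning, the run still succeeds as-is; the bump just keeps the workflow off the forced-migration path described in the linked changelog.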