
Allow flash attention 2 and upgrade to transformers 4.34.1 #2440

Triggered via pull request on October 14, 2023 at 20:04
Status: Success
Total duration: 15m 5s
Artifacts: 3
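For context on what this PR enables, below is a minimal sketch of loading a model with flash attention 2 under transformers 4.34.x. The model name and dtype are illustrative assumptions, not taken from the PR or this workflow run.

```python
# Minimal sketch: loading a causal LM with flash attention 2 enabled,
# assuming transformers>=4.34.1 and the flash-attn package are installed
# on a CUDA-capable machine. Model name and dtype are illustrative only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "meta-llama/Llama-2-7b-hf"  # hypothetical example model

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,   # flash attention 2 requires fp16/bf16
    use_flash_attention_2=True,   # flag introduced in transformers 4.34
)
```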

pr-cpu.yaml

on: pull_request
Matrix: pytest-cpu
Coverage Results / coverage: 14s

Artifacts

Produced during runtime
Name                                                          Size    Status
coverage-dcad725593ef7be0c27fc0e5c35c22f416037514-cpu-2.0.1   236 KB  Expired
coverage-dcad725593ef7be0c27fc0e5c35c22f416037514-cpu-2.1.0   236 KB  Expired
coverage-dcad725593ef7be0c27fc0e5c35c22f416037514-cpu-latest  236 KB  Expired