Allow flash attention 2 and upgrade to transformers 4.34.1 #2472

Triggered via: pull request, October 17, 2023 04:39
Status: Success
Total duration: 16m 21s
Artifacts: 3

pr-cpu.yaml

on: pull_request
Matrix: pytest-cpu
Coverage Results / coverage (12s)
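
The run metadata above comes from the pr-cpu.yaml workflow. As a rough, hypothetical sketch of what that file might contain, inferred only from the details on this page (the pull_request trigger, the pytest-cpu matrix, the coverage follow-up job, and the artifact naming), something like the following could produce a run of this shape; the actual workflow in the repository may differ, and all step names, action versions, and commands here are assumptions:

    # Hypothetical sketch of pr-cpu.yaml, reconstructed from this run page.
    name: PR CPU tests
    on: pull_request
    jobs:
      pytest-cpu:
        runs-on: ubuntu-latest
        strategy:
          matrix:
            # Assumed from the artifact names below: one run per PyTorch version
            pytorch_version: ["2.0.1", "2.1.0", "latest"]
        steps:
          - uses: actions/checkout@v3
          - name: Run CPU pytest suite   # assumed test invocation
            run: pytest --cov
          - name: Upload coverage artifact
            uses: actions/upload-artifact@v3
            with:
              # Matches the observed pattern coverage-<sha>-cpu-<version>
              name: coverage-${{ github.sha }}-cpu-${{ matrix.pytorch_version }}
              path: .coverage
      coverage:
        # The "Coverage Results / coverage" job shown above, which would
        # merge the per-matrix coverage files into one report
        needs: pytest-cpu
        runs-on: ubuntu-latest
        steps:
          - uses: actions/download-artifact@v3   # fetch all coverage artifacts
          - name: Combine and report coverage
            run: |
              pip install coverage
              coverage combine coverage-*/.coverage
              coverage report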

Artifacts

Produced during runtime
Name                                                           Size     Status
coverage-d24e0b0c0a1a73f550cfc8986b0fdb5bbda201c4-cpu-2.0.1    236 KB   Expired
coverage-d24e0b0c0a1a73f550cfc8986b0fdb5bbda201c4-cpu-2.1.0    236 KB   Expired
coverage-d24e0b0c0a1a73f550cfc8986b0fdb5bbda201c4-cpu-latest   236 KB   Expired