
Fixing the gen_attention_mask_in_length function to handle the case when sequence id is -1 due to attention masking #4202

Triggered via pull request February 5, 2024 17:56
Status: Success
Total duration: 17m 39s
Artifacts: 1
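
For context, this run belongs to a PR whose title describes making gen_attention_mask_in_length tolerate sequence ids of -1, i.e. positions already excluded by the attention mask. The sketch below only illustrates that idea, assuming a (batch, seq_len) sequence_id tensor where packed sequences are numbered from 0 and masked tokens carry -1; the function name comes from the PR title, but the tensor names and masking strategy are assumptions, not the repository's actual implementation.

```python
# Hypothetical sketch only: handle sequence_id == -1 (tokens removed by the
# attention mask) when counting tokens per packed sequence.
import torch
import torch.nn.functional as F

def gen_attention_mask_in_length_sketch(sequence_id: torch.Tensor) -> torch.Tensor:
    """Count tokens per packed sequence, ignoring positions marked -1."""
    masked = sequence_id == -1            # positions excluded by the attention mask
    safe_ids = sequence_id.clamp(min=0)   # one_hot rejects negative ids, so clamp first
    num_seqs = int(safe_ids.max().item()) + 1
    one_hot = F.one_hot(safe_ids, num_classes=num_seqs)
    one_hot = one_hot.masked_fill(masked.unsqueeze(-1), 0)  # drop masked tokens' contribution
    return one_hot.sum(dim=1)             # (batch, num_seqs) per-sequence lengths
```

For example, sequence_id = torch.tensor([[0, 0, 1, -1]]) yields [[2, 1]]: the masked final position is neither counted nor allowed to crash the one-hot encoding.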

pr-cpu.yaml

on: pull_request
Matrix: pytest-cpu
Coverage Results / coverage (8s)

Annotations

2 warnings
cpu-2.1.0 / pytest-cpu
Node.js 16 actions are deprecated. Please update the following actions to use Node.js 20: actions/checkout@v3, actions/upload-artifact@v3. For more information see: https://github.blog/changelog/2023-09-22-github-actions-transitioning-from-node-16-to-node-20/.
Coverage Results / coverage
Node.js 16 actions are deprecated. Please update the following actions to use Node.js 20: actions/checkout@v3, actions/download-artifact@v3. For more information see: https://github.blog/changelog/2023-09-22-github-actions-transitioning-from-node-16-to-node-20/.

Artifacts

Produced during runtime
Name: coverage-6b60b600aa17b2c6e156ed138954876b08bfbea6-cpu-2.1.0 (Expired)
Size: 308 KB