
flash_attention: support also cross attention. #10462

Re-run triggered: December 2, 2024 09:07
Status: Success
Total duration: 1h 55m 31s
Artifacts: 2

build_and_test.yml (on: pull_request)

Jobs:
- get-torch-commit: 2s
- Build PyTorch/XLA / build: 1h 9m
- Matrix: CPU tests / test

Artifacts (produced during runtime)

Name              Size
cpp-test-bin      661 MB
torch-xla-wheels  222 MB
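For context on the feature this PR adds: cross attention is scaled dot-product attention where the queries come from one sequence while the keys and values come from a different one, so the two sequence lengths can differ. The following is a minimal NumPy sketch of that computation for illustration only; it is not PyTorch/XLA's flash attention kernel, and all shapes and names are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(q, k, v):
    """Plain scaled dot-product attention where q comes from one
    sequence and k/v come from another (possibly different length)."""
    d = q.shape[-1]
    # (B, H, Lq, D) @ (B, H, D, Lkv) -> (B, H, Lq, Lkv)
    scores = q @ k.transpose(0, 1, 3, 2) / np.sqrt(d)
    # Attention weights over the k/v sequence, applied to v:
    # (B, H, Lq, Lkv) @ (B, H, Lkv, D) -> (B, H, Lq, D)
    return softmax(scores) @ v

# Hypothetical example: queries attend over a sequence of a different length.
rng = np.random.default_rng(0)
q = rng.normal(size=(2, 4, 5, 16))   # (batch, heads, Lq=5, head_dim)
k = rng.normal(size=(2, 4, 9, 16))   # (batch, heads, Lkv=9, head_dim)
v = rng.normal(size=(2, 4, 9, 16))
out = cross_attention(q, k, v)
print(out.shape)  # (2, 4, 5, 16)
```

The output keeps the query sequence length (5) while attending over all 9 key/value positions, which is the shape contract a cross-attention-capable kernel must satisfy.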