Update non_xla attention to properly support paged_attention dynamo code path #8072

Triggered via pull request on May 2, 2024, 23:29
Status: Success
Total duration: 1h 16m 31s
Artifacts: 3

build_and_test.yml

on: pull_request
Build PyTorch/XLA / build (26m 41s)
Build XLA CUDA plugin / build (4m 44s)
Build upstream Docker image / build
Build docs / build-docs (1m 34s)
TPU tests / tpu-test
Matrix: CPU tests / test
Matrix: GPU tests / test

Artifacts

Produced during runtime
Name              Size     Status
cpp-test-bin      644 MB   Expired
cuda-plugin       95.3 MB  Expired
torch-xla-wheels  206 MB   Expired