
[common/PyTorch] Add FusedAttention support for SWA (left, right) #6487

Triggered via pull request January 6, 2025 12:50
Status: Failure
Total duration: 16m 43s

build.yml

on: pull_request

Annotations

4 errors and 4 warnings
Errors:
PaddlePaddle: Process completed with exit code 1.
JAX: Process completed with exit code 1.
Core: Process completed with exit code 1.
PyTorch: Process completed with exit code 1.
Warnings:
PaddlePaddle: ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
JAX: ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
Core: ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
PyTorch: ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
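
For context on the feature named in the PR title, here is a minimal, illustrative sketch (not the PR's implementation, and not the FusedAttention backend itself) of what a sliding window attention (SWA) mask with a (left, right) window means: query position i may attend to key positions j with i - left <= j <= i + right, following the common convention where -1 on either side means that side is unbounded. The helper name `swa_mask` is hypothetical.

```python
# Illustrative sketch only: builds a boolean SWA mask for a (left, right) window.
# This is NOT the PR's fused-attention code; it just shows the masking semantics.
import torch

def swa_mask(seq_len: int, window: tuple) -> torch.Tensor:
    """Return a (seq_len, seq_len) bool mask where True marks disallowed pairs.

    `window` is (left, right); -1 means unbounded on that side, mirroring the
    usual (left, right) convention for sliding window attention.
    """
    left, right = window
    i = torch.arange(seq_len).unsqueeze(1)  # query positions, shape (S, 1)
    j = torch.arange(seq_len).unsqueeze(0)  # key positions, shape (1, S)
    too_far_left = (i - j) > left if left >= 0 else torch.zeros(seq_len, seq_len, dtype=torch.bool)
    too_far_right = (j - i) > right if right >= 0 else torch.zeros(seq_len, seq_len, dtype=torch.bool)
    return too_far_left | too_far_right

# Example: window (2, 0) is a causal window of width 3 (self plus the two previous tokens).
print(swa_mask(5, (2, 0)).int())
```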