Actions: NVIDIA/TransformerEngine

TE-CI Trigger

2,933 workflow runs
[JAX] Consolidate the distributed fused attention test code
TE-CI Trigger #4438: Issue comment #1405 (comment) created by mgoldfarb-nvidia
January 12, 2025 22:56 3m 41s
Not compile in wsl2 pytorch wheels
TE-CI Trigger #4437: Issue comment #1404 (comment) created by johnnynunez
January 12, 2025 18:59 5s
Doesn't work on wsl2
TE-CI Trigger #4436: Issue comment #683 (comment) created by johnnynunez
January 12, 2025 13:14 4s
[PyTorch] Avoid parameters function in op backward pass
TE-CI Trigger #4435: Issue comment #1403 (comment) created by timmoon10
January 11, 2025 03:16 3m 40s
Installation stuck at 97%
TE-CI Trigger #4434: Issue comment #1399 (comment) created by timmoon10
January 11, 2025 01:19 5s
TransformerEngine build fail with Conda
TE-CI Trigger #4433: Issue comment #954 (comment) created by MaureenZOU
January 10, 2025 19:52 5s
support new flash_attn_interface
TE-CI Trigger #4432: Issue comment #1392 (comment) created by rgtjf
January 10, 2025 08:11 5s
why close ag overlap when is_grad_enabled is False
TE-CI Trigger #4431: Issue comment #1398 (comment) created by clivinn-shla81092
January 10, 2025 02:49 4s
[PyTorch] Fix AttentionParams comparison logic
TE-CI Trigger #4430: Issue comment #1397 (comment) created by cyanguwa
January 9, 2025 11:43 3m 33s
support new flash_attn_interface
TE-CI Trigger #4429: Issue comment #1392 (comment) created by cyanguwa
January 9, 2025 06:33 5s
January 9, 2025 03:26 4m 21s
[JAX] Test_multiprocessing_encoder with process spawn in bash
TE-CI Trigger #4427: Issue comment #1394 (comment) created by phu0ngng
January 8, 2025 16:13 3m 33s
overlapping issue about backward of LayerNormLinear
TE-CI Trigger #4426: Issue comment #1353 (comment) created by cos120
January 8, 2025 01:33 5s
[JAX] Correct fused attention output after each step of ring attention
TE-CI Trigger #4425: Issue comment #1393 (comment) created by mgoldfarb-nvidia
January 7, 2025 22:50 3m 33s
overlapping issue about backward of LayerNormLinear
TE-CI Trigger #4424: Issue comment #1353 (comment) created by timmoon10
January 7, 2025 22:24 5s
clean CP implementation for flash attention and cuDNN 9.6
TE-CI Trigger #4423: Issue comment #1387 (comment) created by xrennvidia
January 7, 2025 22:18 3m 38s
Better cuBLAS handle management
TE-CI Trigger #4422: Issue comment #1389 (comment) created by ptrendx
January 7, 2025 22:15 3m 40s
[JAX] Correct fused attention output after each step of ring attention
TE-CI Trigger #4421: Issue comment #1393 (comment) created by mgoldfarb-nvidia
January 7, 2025 22:06 3m 44s
[JAX] Correct fused attention output after each step of ring attention
TE-CI Trigger #4420: Issue comment #1393 (comment) created by mgoldfarb-nvidia
January 7, 2025 22:03 3m 36s
bug fix for using return_layernorm_output=True
TE-CI Trigger #4419: Issue comment #1382 (comment) created by timmoon10
January 7, 2025 19:00 3m 32s
Better cuBLAS handle management
TE-CI Trigger #4418: Issue comment #1389 (comment) created by ptrendx
January 7, 2025 00:09 4m 1s
Better cuBLAS handle management
TE-CI Trigger #4417: Issue comment #1389 (comment) created by ptrendx
January 6, 2025 22:52 3m 29s
Better cuBLAS handle management
TE-CI Trigger #4416: Issue comment #1389 (comment) created by ptrendx
January 6, 2025 17:36 5s
Add paged attention support
TE-CI Trigger #4415: Issue comment #1355 (comment) created by cyanguwa
January 6, 2025 12:27 3m 36s
clean CP implementation for flash attention and cuDNN 9.6
TE-CI Trigger #4414: Issue comment #1387 (comment) created by cyanguwa
January 6, 2025 12:03 3m 38s