Add softcapping support to flash attention #3376

Triggered via pull request on August 22, 2024 at 01:43
Status: Success
Total duration: 1m 56s

book.yml
on: pull_request
Test candle-book: 1m 47s
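For context on the PR title: "softcapping" in attention usually refers to tanh-based capping of the attention logits before the softmax, as popularized by models such as Gemma 2. Below is a minimal standalone Rust sketch of that idea under that assumption; the function name and shape are hypothetical and do not reflect candle's actual flash-attention kernel interface.

```rust
// Hypothetical sketch of attention-logit softcapping (not candle's API).
// Each logit is divided by the cap, passed through tanh, then rescaled,
// which bounds every value to the open interval (-cap, cap) before softmax.
fn softcap(scores: &mut [f32], cap: f32) {
    for s in scores.iter_mut() {
        *s = cap * (*s / cap).tanh();
    }
}

fn main() {
    let mut scores = vec![-120.0_f32, -3.0, 0.5, 4.0, 250.0];
    softcap(&mut scores, 50.0);
    println!("{:?}", scores); // extreme logits are squashed toward ±50
}
```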