[PyTorch] Fix FlashAttention 2 head_dim > 192 on sm103 and other architectures #2836

Open

pedramr wants to merge 1 commit into NVIDIA:main from pedramr:fix/sm103-flash-attn-allowlist

Commits

Commits on Apr 4, 2026