[PyTorch] Fix FlashAttention 2 head_dim > 192 on sm103 and other architectures#2836
Open
pedramr wants to merge 1 commit into NVIDIA:main from