Request for Python 3.12 wheel: flash-attn for CUDA 13 + PyTorch 2.9
#17
by razvanab
Can you please compile flash-attn for Python 3.12 with CUDA 13 and PyTorch 2.9?
Thank you.
https://huggingface.co/niknah/flash-attention-windows-wheel
There is also a SageAttention wheel there.
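For reference, a minimal sketch (assuming PyTorch is already installed; flash-attn may or may not be present) that prints the interpreter, PyTorch, and CUDA versions, so any prebuilt wheel can be matched against the local environment:

```python
# Minimal environment check: print the versions a flash-attn wheel must match.
# Assumes PyTorch is installed; flash_attn is optional and may be absent.
import sys

import torch

print("Python:", sys.version.split()[0])           # e.g. 3.12.x
print("PyTorch:", torch.__version__)                # e.g. 2.9.0
print("CUDA (torch build):", torch.version.cuda)    # e.g. 13.0
print("CUDA available:", torch.cuda.is_available())

try:
    import flash_attn
    print("flash-attn:", flash_attn.__version__)
except ImportError:
    print("flash-attn: not installed")
```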