flash-attn
Author: Tri Dao
Summary: Flash Attention: Fast and Memory-Efficient Exact Attention
Latest version: 2.8.3
Required dependencies: einops, torch
Downloads last day: 34,822
Downloads last week: 185,757
Downloads last month: 894,878
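
For context, a minimal usage sketch of the package's flash_attn_func interface (a hedged illustration, not official documentation; it assumes a CUDA GPU, half-precision inputs, and the (batch, seqlen, nheads, headdim) layout):

import torch
from flash_attn import flash_attn_func

# Illustrative shapes; inputs must be fp16 or bf16 tensors on a CUDA device.
batch, seqlen, nheads, headdim = 2, 1024, 8, 64
q = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

# Exact attention computed without materializing the full seqlen x seqlen
# attention matrix; causal=True applies a causal mask.
out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # (batch, seqlen, nheads, headdim)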