flash-attn
Author: Tri Dao
Summary: Flash Attention: Fast and Memory-Efficient Exact Attention
Latest version: 2.8.3
Required dependencies: einops, torch
Downloads last day: 16,271
Downloads last week: 177,024
Downloads last month: 826,633