flash-attn
Author: Tri Dao
Summary: Flash Attention: Fast and Memory-Efficient Exact Attention
Latest version: 2.8.3
Required dependencies: einops, torch
Downloads (last day): 34,423
Downloads (last week): 162,349
Downloads (last month): 678,530