flash-attn
Author: Tri Dao
Summary: Flash Attention: Fast and Memory-Efficient Exact Attention
Latest version: 2.8.3
Required dependencies: einops, torch
Downloads last day: 35,876
Downloads last week: 270,698
Downloads last month: 1,016,815