flash-attn-npu
Author: Minghua Shen
Summary: High-performance FlashAttention implementation for Ascend NPU
Latest version: 0.1.1
Required dependencies: einops | torch | torch_npu
Downloads: 2 (last day), 15 (last week), 159 (last month)