PyPI page
Home page
Author:
None
License:
MIT License
Copyright (c) 2025 AILabs
Summary:
LLM Inference for Large-Context Offline Workloads
Latest version:
1.0.3
Required dependencies:
accelerate | flash-attn | flash-linear-attention | numpy | torch | transformers
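A quick way to sanity-check an environment against the dependency list above is to try importing each package. The sketch below is illustrative only and not part of the package; the import names used for flash-attn ("flash_attn") and flash-linear-attention ("fla") are assumptions, since those distributions publish modules under names that differ from their PyPI names.

    # Minimal sketch: verify that the required dependencies listed above
    # are importable in the current environment.
    import importlib

    REQUIRED = {
        "accelerate": "accelerate",
        "flash-attn": "flash_attn",            # assumption: module name differs from PyPI name
        "flash-linear-attention": "fla",       # assumption: module name differs from PyPI name
        "numpy": "numpy",
        "torch": "torch",
        "transformers": "transformers",
    }

    for dist, module in REQUIRED.items():
        try:
            importlib.import_module(module)
            print(f"{dist}: OK")
        except ImportError as err:
            print(f"{dist}: missing ({err})")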
Downloads last day:
4
Downloads last week:
78
Downloads last month:
251