Author: vLLM Team
License: Apache 2.0
Summary: A high-throughput and memory-efficient inference and serving engine for LLMs
Latest version: 0.4.2.post2
Required dependencies: cmake | fastapi | filelock | lm-format-enforcer | ninja | npu-vllm | numpy | openai | outlines | prometheus-fastapi-instrumentator | prometheus_client | psutil | py-cpuinfo | pydantic | pynvml | ray | requests | sentencepiece | tiktoken | tokenizers | transformers | typing_extensions | uvicorn
Optional dependencies: tensorizer
Downloads last day: 1
Downloads last week: 119
Downloads last month: 148
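
The fields above (author, license, summary, version, dependencies) correspond to Python's standard core package metadata, which can be read locally for any installed distribution via the standard library's `importlib.metadata` (Python 3.8+). A minimal sketch; `summarize` is a hypothetical helper name, not part of any library:

```python
from importlib.metadata import PackageNotFoundError, metadata, requires, version


def summarize(dist_name: str) -> dict:
    """Collect PyPI-page-style fields for an installed distribution.

    Raises PackageNotFoundError if the distribution is not installed.
    """
    md = metadata(dist_name)  # core metadata exposed as an email-header-style mapping
    return {
        "summary": md["Summary"],
        "license": md["License"],
        "author": md["Author"],
        "version": version(dist_name),
        "requires": requires(dist_name) or [],  # requires() returns None if no deps declared
    }
```

Once the package itself is installed, `summarize("vllm")` would report fields like those listed above; note that `requires()` reflects the metadata shipped in the wheel, which may include environment markers.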