PyPI Stats

vllm-consul

Author: vLLM Team
License: Apache 2.0
Summary: A high-throughput and memory-efficient inference and serving engine for LLMs
Latest version: 0.2.1
Required dependencies: fastapi | fschat | ninja | numpy | pandas | psutil | pyarrow | pydantic | python-consul | python-json-logger | ray | sentencepiece | torch | transformers | uvicorn | xformers

Downloads last day: 0
Downloads last week: 134
Downloads last month: 145