PyPI Stats
vllm-rocm

Author: vLLM Team
License: Apache 2.0
Summary: A high-throughput and memory-efficient inference and serving engine for LLMs with AMD GPU support
Latest version: 0.6.3
Required dependencies: aiohttp | awscli | boto3 | botocore | einops | fastapi | filelock | gguf | importlib-metadata | lm-format-enforcer | mistral-common | msgspec | numpy | openai | outlines | partial-json-parser | peft | pillow | prometheus-client | prometheus-fastapi-instrumentator | protobuf | psutil | py-cpuinfo | pydantic | pytest-asyncio | pyyaml | pyzmq | ray | requests | sentencepiece | setuptools | six | tensorizer | tiktoken | tokenizers | tqdm | transformers | typing-extensions | uvicorn
Optional dependencies: librosa | soundfile | tensorizer

Downloads last day: 8
Downloads last week: 39
Downloads last month: 108