PyPI Stats

llm-proxy-server


PyPI page
Home page
Author: Vitalii Stepanenko
License: MIT (Copyright (c) 2025–2026 Vitalii Stepanenko)
Summary: LLM Proxy Server is an OpenAI-compatible HTTP proxy server for running inference against various LLM backends, including the Google, Anthropic, and OpenAI APIs as well as local PyTorch models.
Latest version: 3.2.2
Required dependencies: ai-microcore | fastapi | pydantic | requests | typer | uvicorn | websockets
Optional dependencies: anthropic | google-genai | pytest | pytest-asyncio | pytest-cov

Downloads last day: 80
Downloads last week: 227
Downloads last month: 358
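Since the summary above describes an OpenAI-compatible HTTP interface, a client request can be sketched using the standard chat-completions payload shape. Note this is a minimal sketch: the base URL, port, model name, and auth header are assumptions for illustration, not documented behavior of llm-proxy-server.

```python
import json
from urllib import request

# Hypothetical base URL; the actual host/port depend on how the proxy is launched.
BASE_URL = "http://localhost:8000/v1"

# Standard OpenAI-style chat-completions payload; an OpenAI-compatible proxy
# typically routes the request to a backend based on the model name.
payload = {
    "model": "gpt-4o-mini",  # placeholder model name
    "messages": [{"role": "user", "content": "Hello"}],
}

req = request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer sk-local",  # key handling is proxy-specific
    },
    method="POST",
)

# request.urlopen(req) would send it; omitted here since no server is running.
print(req.full_url)
```

Because the interface mirrors the OpenAI API, any existing OpenAI client library should also work by pointing its base URL at the proxy instead of api.openai.com.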