llm-proxy-server
"LLM Proxy Server" is OpenAI-compatible http proxy server for inferencing various LLMs capable of working with Google, Anthropic, OpenAI APIs, local PyTorch inference, etc.
Installation
In a virtualenv (see the sketch below the install command if you need to create one):
pip3 install llm-proxy-server
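A standard virtualenv workflow looks like the following sketch; the environment path is illustrative.

```sh
python3 -m venv ~/.venvs/llm-proxy       # create the environment (path is illustrative)
source ~/.venvs/llm-proxy/bin/activate   # activate it
pip3 install llm-proxy-server            # install the package into the venv
```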
Releases
Version | Released | Bullseye Python 3.9 | Bookworm Python 3.11 | Files
---|---|---|---|---
0.0.3 | 2025-05-24 | | |
0.0.2 | 2025-05-24 | | |
0.0.1 | 2025-05-24 | | |
Issues with this package?
- Search issues for this package
- Package or version missing? Open a new issue
- Something else? Open a new issue