llm-proxy-server

"LLM Proxy Server" is OpenAI-compatible http proxy server for inferencing various LLMs capable of working with Google, Anthropic, OpenAI APIs, local PyTorch inference, etc.

Installation

In a virtualenv (create one first if needed; see the commands below):

pip3 install llm-proxy-server
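
Creating and activating a virtualenv uses only standard Python tooling; the .venv directory name here is an arbitrary choice:

python3 -m venv .venv
source .venv/bin/activate
pip3 install llm-proxy-server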

Releases

Version  Released    Bullseye (Python 3.9)  Bookworm (Python 3.11)  Files
0.0.3    2025-05-24
0.0.2    2025-05-24
0.0.1    2025-05-24
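
To install a particular release from the table above, pin its version with pip:

pip3 install llm-proxy-server==0.0.3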
