lm-proxy
LM-Proxy is an OpenAI-compatible HTTP proxy server for running inference against various LLM backends, including the Google, Anthropic, and OpenAI APIs as well as local PyTorch inference.
Installation
In a virtualenv (see these instructions if you need to create one):
pip3 install lm-proxy
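Because the proxy exposes an OpenAI-compatible API, any OpenAI-style client can target it by pointing the base URL at the proxy instead of api.openai.com. A minimal stdlib-only sketch follows; the endpoint URL, port, model name, and API key are placeholder assumptions for illustration, not documented defaults of lm-proxy:

```python
import json
import urllib.request

# Assumed local deployment address; adjust to wherever your proxy listens.
PROXY_URL = "http://localhost:8000/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completion POST request aimed at the proxy."""
    payload = {
        "model": model,  # the proxy routes this to the configured backend
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        PROXY_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Placeholder key; whether keys are enforced depends on proxy config.
            "Authorization": "Bearer sk-placeholder",
        },
        method="POST",
    )

req = build_chat_request("gpt-4o-mini", "Hello!")
# Actually sending the request requires a running proxy:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

The same shape works with the official `openai` Python client by passing `base_url` when constructing the client.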
Dependencies
Releases
| Version | Released | Bullseye Python 3.9 | Bookworm Python 3.11 | Trixie Python 3.13 | Files |
|---|---|---|---|---|---|
| 3.0.2 | 2026-02-19 | | | | |
| 3.0.1 | 2026-02-10 | | | | |
| 3.0.0 | 2026-02-05 | | | | |
| 3.0.0.dev1 (pre-release) | 2026-01-20 | | | | |
| 2.1.1 | 2025-11-20 | | | | |
| 2.1.0 | 2025-11-02 | | | | |
| 2.0.0 | 2025-10-26 | | | | |
| 1.1.0 | 2025-10-15 | | | | |
| 1.0.0 | 2025-10-15 | | | | |
| 0.4.0 | 2025-10-14 | | | | |
| 0.3.0 | 2025-10-09 | | | | |
| 0.2.2 | 2025-10-08 | | | | |
| 0.2.1 | 2025-08-28 | | | | |
| 0.2.0 | 2025-08-27 | | | | |
| 0.0.3 | 2025-05-24 | | | | |
| 0.0.2 | 2025-05-24 | | | | |
| 0.0.1 | 2025-05-24 | | | | |
Issues with this package?
- Search issues for this package
- Package or version missing? Open a new issue
- Something else? Open a new issue