maxllm

A unified OpenAI API client with rate limiting, caching, and batch processing

Installation

Inside a virtualenv (create one first if you need to):

pip3 install maxllm
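The package description mentions rate limiting and caching around API calls. As a rough illustration of those two ideas (this is a generic sketch using only the Python standard library, not maxllm's actual API; `cached_completion` and `RateLimiter` are hypothetical names), a client might combine a sliding-window rate limiter with an LRU cache:

```python
import time
from functools import lru_cache


class RateLimiter:
    """Sliding-window rate limiter: allow at most max_calls per period seconds.
    Illustrative only; not maxllm's actual implementation."""

    def __init__(self, max_calls: int, period: float):
        self.max_calls = max_calls
        self.period = period
        self.calls: list[float] = []

    def acquire(self) -> None:
        now = time.monotonic()
        # Drop timestamps older than the window.
        self.calls = [t for t in self.calls if now - t < self.period]
        if len(self.calls) >= self.max_calls:
            # Sleep until the oldest call in the window expires.
            time.sleep(self.period - (now - self.calls[0]))
        self.calls.append(time.monotonic())


limiter = RateLimiter(max_calls=5, period=1.0)


@lru_cache(maxsize=256)
def cached_completion(prompt: str) -> str:
    """Rate-limited call; repeated prompts are served from the cache
    without consuming rate-limit budget. The body is a stub standing in
    for a real API request."""
    limiter.acquire()
    return f"response to: {prompt}"  # stub result


print(cached_completion("hello"))
print(cached_completion("hello"))  # second call hits the cache
```

Because `lru_cache` wraps the rate-limited function, only cache misses pass through the limiter, which is the usual ordering for this pattern.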

Releases

Version  Released    Bullseye (Python 3.9)  Bookworm (Python 3.11)  Trixie (Python 3.13)
0.1.0    2025-11-15

Page last updated 2025-11-15 05:24:57 UTC