vllm
A high-throughput and memory-efficient inference and serving engine for LLMs
Installation
In a virtualenv (create one first if you don't already have one):
pip3 install vllm
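Once installed, vllm can run offline batch inference through its LLM and SamplingParams classes. A minimal sketch, assuming a GPU-capable environment with vllm installed; the model name "facebook/opt-125m" is just an example of a small Hugging Face model:

```python
from vllm import LLM, SamplingParams

# Prompts to complete in a single batch.
prompts = ["Hello, my name is"]

# Sampling settings for generation (values here are illustrative).
sampling_params = SamplingParams(temperature=0.8, top_p=0.95)

# Load the model; "facebook/opt-125m" is an example small model.
llm = LLM(model="facebook/opt-125m")

# Generate completions for all prompts in one call.
outputs = llm.generate(prompts, sampling_params)
for output in outputs:
    print(output.prompt, output.outputs[0].text)
```

Note that running this requires a supported accelerator and will download the model weights on first use.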