mcp-llm-inferencer

Uses the Claude or OpenAI API to convert prompt-mapped input into concrete MCP server components such as tools, resource templates, and prompt handlers.

Installation

Inside a virtualenv (create and activate one first if you don't already have one):

pip3 install mcp-llm-inferencer
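The full install sequence can be sketched as below, using the standard `venv` module; the environment name `mcp-env` is an arbitrary choice, not something the package prescribes:

```shell
# Create an isolated virtual environment (assumes python3 is on PATH)
python3 -m venv mcp-env

# Activate it (POSIX shells; on Windows use mcp-env\Scripts\activate)
. mcp-env/bin/activate

# Install the package from the index into the virtualenv
pip3 install mcp-llm-inferencer
```

Installing inside a virtualenv keeps the package and its dependencies isolated from the system Python.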

Releases

Version  Released    Bullseye (Python 3.9)  Bookworm (Python 3.11)
0.1.0    2025-04-30


Page last updated 2025-05-23 16:35:21 UTC