RUN IN TERMINAL
claude mcp add erri-litellm-mcp -- uvx litellm-mcp
ADD TO claude_desktop_config.json
{
  "mcpServers": {
    "erri-litellm-mcp": {
      "command": "uvx",
      "args": ["litellm-mcp"]
    }
  }
}
ADD TO .vscode/mcp.json
{
  "servers": {
    "erri-litellm-mcp": {
      "type": "stdio",
      "command": "uvx",
      "args": ["litellm-mcp"]
    }
  }
}
ADD TO .cursor/mcp.json
{
  "mcpServers": {
    "erri-litellm-mcp": {
      "command": "uvx",
      "args": ["litellm-mcp"]
    }
  }
}
About This MCP Server
Access 100+ LLMs through one API: GPT-4, Claude, Gemini, Mistral, and more. This is a Model Context Protocol (MCP) server that extends AI assistants such as Claude with developer-tools capabilities over the stdio transport.
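Over the stdio transport, a client and the server exchange newline-delimited JSON-RPC messages on the server process's stdin/stdout. A minimal sketch of the `initialize` request a client writes first, assuming a representative protocol version and a hypothetical client name:

```python
import json

# Sketch: the first message an MCP client sends over the server's stdin.
# Each JSON-RPC message is serialized as a single line of JSON followed by
# a newline. The protocol version and clientInfo values here are
# illustrative assumptions, not specific to litellm-mcp.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1"},
    },
}

# One message per line on the wire.
wire_line = json.dumps(initialize_request) + "\n"
print(wire_line, end="")
```

The AI clients listed above handle this handshake for you; the sketch only shows what travels over stdio once the server is configured.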
Package
litellm-mcp on PyPI
HOW TO USE
Select your AI client above to get the install command. This MCP server communicates over the stdio transport and is published on PyPI.
What tools does it provide?
The LiteLLM Multi-Model API server extends your AI assistant with developer-tools capabilities. Once installed, your AI client discovers the server's tools through the MCP protocol and can call them automatically.
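Tool discovery happens after the handshake via a `tools/list` request (a standard MCP method; the specific tools litellm-mcp returns are not shown here). A minimal sketch of that request:

```python
import json

# Sketch: after initialization, a client asks the server what tools it
# offers with a `tools/list` JSON-RPC request. The response's `result.tools`
# array carries the tool names and descriptions the assistant can invoke.
tools_list_request = {"jsonrpc": "2.0", "id": 2, "method": "tools/list"}
request_line = json.dumps(tools_list_request) + "\n"
print(request_line, end="")
```

Your AI client issues this request itself; you never need to send it by hand.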