RUN IN TERMINAL
claude mcp add commcparmory-runpod -- uvx mcparmory-runpod
ADD TO claude_desktop_config.json
{
  "mcpServers": {
    "commcparmory-runpod": {
      "command": "uvx",
      "args": ["mcparmory-runpod"]
    }
  }
}
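If claude_desktop_config.json already lists other servers under "mcpServers", the new entry should be merged in rather than pasted over the whole file. A minimal sketch of that merge (the add_mcp_server helper is hypothetical, not part of any SDK):

```python
import json

def add_mcp_server(config_text, name, command, args):
    """Merge one MCP server entry into an existing config, keeping others."""
    config = json.loads(config_text) if config_text.strip() else {}
    config.setdefault("mcpServers", {})[name] = {
        "command": command,
        "args": args,
    }
    return json.dumps(config, indent=2)

# Example: a config that already has one server registered.
existing = '{"mcpServers": {"other": {"command": "npx", "args": ["x"]}}}'
merged = add_mcp_server(
    existing, "commcparmory-runpod", "uvx", ["mcparmory-runpod"]
)
```

Writing `merged` back to claude_desktop_config.json preserves the existing "other" server alongside the new entry.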
ADD TO .vscode/mcp.json
{
  "servers": {
    "commcparmory-runpod": {
      "command": "uvx",
      "args": ["mcparmory-runpod"]
    }
  }
}
ADD TO .cursor/mcp.json
{
  "mcpServers": {
    "commcparmory-runpod": {
      "command": "uvx",
      "args": ["mcparmory-runpod"]
    }
  }
}
About This MCP Server
Launch, scale, and manage GPU pods and serverless endpoints across regions. This is a Model Context Protocol (MCP) server that extends AI assistants like Claude with RunPod GPU pod and serverless endpoint management via the stdio transport.
Package
mcparmory-runpod on PyPI
HOW TO USE
Select your AI client above to get the install command. This MCP server communicates over the stdio transport and is distributed on PyPI.
What tools does it provide?
The runpod server extends your AI assistant with tools for launching, scaling, and managing RunPod GPU pods and serverless endpoints. Once installed, your AI assistant can invoke these tools automatically.
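Under the hood, the stdio transport exchanges JSON-RPC 2.0 messages over the server process's stdin and stdout. A simplified sketch of the request a client sends to discover a server's tools (the framing here is illustrative; real clients use an MCP SDK, and the response body shown is a placeholder, not this server's actual tool list):

```python
import json

# MCP clients and servers speak JSON-RPC 2.0. To discover tools, the
# client writes a "tools/list" request to the server's stdin:
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
wire = json.dumps(request)  # one JSON object per message

# The server replies on stdout with a result carrying its tool list
# (names, descriptions, input schemas). Placeholder response shape:
response_wire = '{"jsonrpc": "2.0", "id": 1, "result": {"tools": []}}'
response = json.loads(response_wire)
tool_names = [t["name"] for t in response["result"]["tools"]]
```

The AI client runs this discovery step once at startup, which is how the assistant learns what the server can do without any manual configuration beyond the install command.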