RUN IN TERMINAL
claude mcp add etri-metrillm -- npx -y metrillm-mcp
ADD TO claude_desktop_config.json
{
  "mcpServers": {
    "etri-metrillm": {
      "command": "npx",
      "args": ["-y", "metrillm-mcp"]
    }
  }
}
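If you prefer to script the Claude Desktop step, the entry above can be merged into an existing config without clobbering other servers. A minimal Python sketch; the pre-existing "some-other-server" entry is a hypothetical placeholder, and the config file path (on macOS typically ~/Library/Application Support/Claude/claude_desktop_config.json) varies by OS:

```python
import json

# Existing config as it might be read from claude_desktop_config.json
# ("some-other-server" is a hypothetical placeholder entry).
config = {
    "mcpServers": {
        "some-other-server": {"command": "uvx", "args": ["example"]}
    }
}

# Merge in the etri-metrillm entry, preserving any other servers.
config.setdefault("mcpServers", {})["etri-metrillm"] = {
    "command": "npx",
    "args": ["-y", "metrillm-mcp"],
}

print(json.dumps(config, indent=2))
```

Writing the merged dict back to the config file and restarting Claude Desktop completes the install.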
ADD TO .vscode/mcp.json
{
  "servers": {
    "etri-metrillm": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "metrillm-mcp"]
    }
  }
}
ADD TO .cursor/mcp.json
{
  "mcpServers": {
    "etri-metrillm": {
      "command": "npx",
      "args": ["-y", "metrillm-mcp"]
    }
  }
}
About This MCP Server
Benchmark local LLM models for speed and quality, and get a hardware fitness verdict, from any MCP client. This is a Model Context Protocol (MCP) server that extends AI assistants such as Claude with developer tooling over the stdio transport.
Package
metrillm-mcp on npm
HOW TO USE
Use the install command or configuration snippet for your AI client above. This MCP server communicates over the stdio transport and is published on npm.
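Under the stdio transport, the client launches the server process (here `npx -y metrillm-mcp`) and exchanges newline-delimited JSON-RPC 2.0 messages over its stdin/stdout. A sketch of the `initialize` request a client writes first; the client name is a hypothetical placeholder, and the protocol version shown is one MCP spec revision:

```python
import json

# First message an MCP client writes to the server's stdin:
# a JSON-RPC 2.0 "initialize" request, one JSON object per line.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # an MCP spec revision date
        "capabilities": {},
        # clientInfo values are hypothetical placeholders
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

line = json.dumps(initialize) + "\n"
print(line, end="")
```

The server replies on stdout with its own capabilities, after which the client can issue requests such as `tools/list` to discover the benchmarking tools. Your MCP client handles this handshake for you; the sketch only shows what travels over the pipe.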
What tools does it provide?
The metrillm server adds developer tooling to your AI assistant. Once installed, your assistant can invoke its tools automatically.