
omni-nli

An MCP server for natural language inference

Developer Tools · streamable-http · PyPI · CogitatorTech
RUN IN TERMINAL
claude mcp add ogitatorech-omni-nli -- uvx omni-nli
ADD TO claude_desktop_config.json
{ "mcpServers": { "ogitatorech-omni-nli": { "command": "uvx", "args": ["omni-nli"] } } }
ADD TO .vscode/mcp.json
{ "mcpServers": { "ogitatorech-omni-nli": { "command": "uvx", "args": ["omni-nli"] } } }
ADD TO .cursor/mcp.json
{ "mcpServers": { "ogitatorech-omni-nli": { "command": "uvx", "args": ["omni-nli"] } } }

About This MCP Server

omni-nli is a Model Context Protocol (MCP) server for natural language inference. It extends AI assistants such as Claude with natural language inference capabilities over the streamable-http transport.
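Natural language inference (NLI) takes a premise and a hypothesis and decides whether the premise entails, contradicts, or is neutral toward the hypothesis. This page does not document the server's exact tool interface, so the sketch below only illustrates the task itself with a toy keyword heuristic; the function name and labels are illustrative, not omni-nli's API (real NLI uses a trained model or an LLM).

```python
# Toy illustration of the NLI task: classify a (premise, hypothesis) pair
# as "entailment", "contradiction", or "neutral". This is NOT omni-nli's
# implementation, just a sketch of the problem the server addresses.

def toy_nli(premise: str, hypothesis: str) -> str:
    """Naive lexical heuristic: hypothesis words contained in the premise
    suggest entailment; an added negation word suggests contradiction."""
    p, h = set(premise.lower().split()), set(hypothesis.lower().split())
    negations = {"not", "no", "never"}
    if (h - negations) <= p:
        # Hypothesis is covered by the premise; a negation flips the label.
        return "contradiction" if negations & (h ^ p) else "entailment"
    return "neutral"

print(toy_nli("A man is playing a guitar", "A man is playing"))      # entailment
print(toy_nli("A man is playing a guitar", "A man is not playing"))  # contradiction
print(toy_nli("A man is playing a guitar", "A woman is sleeping"))   # neutral
```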

Package

omni-nli on PyPI

HOW TO USE

Select your AI client above to get the matching install command. The server communicates over the streamable-http transport and is distributed on PyPI.

What tools does it provide?

The omni-nli server extends your AI assistant with natural language inference tools. Once installed, the assistant can invoke these tools automatically when a task calls for them.
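The specific tool names are not listed on this page, but any MCP client can discover them with the protocol's standard `tools/list` method. A minimal JSON-RPC 2.0 request looks like this (the server replies with its tool names, descriptions, and input schemas):

```json
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }
```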

DETAILS
README BADGE
Skills Playground MCP badge
[![Skills Playground](https://skillsplayground.com/badges/mcp/ogitatorech-omni-nli.svg)](https://skillsplayground.com/mcps/ogitatorech-omni-nli/)
