RUN IN TERMINAL
claude mcp add michael-denyer-hot-memory-mcp -- uvx hot-memory-mcp
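The `uvx` launcher ships with the `uv` installer; if it is missing the command above will fail. A minimal sketch to sanity-check your PATH before registering the server (the `have_cmd` helper is illustrative, not part of the package):

```shell
# have_cmd NAME -> exit 0 if NAME is on PATH
have_cmd() { command -v "$1" >/dev/null 2>&1; }

if have_cmd uvx; then
  echo "uvx found; 'uvx hot-memory-mcp' will fetch and run the server on demand"
else
  echo "uvx missing - install uv (https://docs.astral.sh/uv/) first"
fi
```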
ADD TO claude_desktop_config.json
{
  "mcpServers": {
    "michael-denyer-hot-memory-mcp": {
      "command": "uvx",
      "args": ["hot-memory-mcp"]
    }
  }
}
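If `claude_desktop_config.json` already exists, paste the `"mcpServers"` entry into the existing object rather than replacing the file. A hedged Python sketch of that merge (the `add_server` helper and the config path are illustrative; the file's actual location varies by OS):

```python
import json
from pathlib import Path

def add_server(config_path: Path) -> dict:
    """Merge the hot-memory-mcp entry into a Claude Desktop config,
    preserving any servers that are already registered."""
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    servers = config.setdefault("mcpServers", {})
    servers["michael-denyer-hot-memory-mcp"] = {
        "command": "uvx",
        "args": ["hot-memory-mcp"],
    }
    config_path.write_text(json.dumps(config, indent=2))
    return config
```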
ADD TO .vscode/mcp.json
{
  "servers": {
    "michael-denyer-hot-memory-mcp": {
      "type": "stdio",
      "command": "uvx",
      "args": ["hot-memory-mcp"]
    }
  }
}
ADD TO .cursor/mcp.json
{
  "mcpServers": {
    "michael-denyer-hot-memory-mcp": {
      "command": "uvx",
      "args": ["hot-memory-mcp"]
    }
  }
}
About This MCP Server
Two-tier memory: hot cache (0ms) + semantic search. Self-organizing. This is a Model Context Protocol (MCP) server that extends AI assistants like Claude with persistent memory capabilities via the stdio transport.
Package
hot-memory-mcp on PyPI
HOW TO USE
Select your AI client above to get the install command. This MCP server uses the stdio transport and is available on PyPI.
What tools does it provide?
The hot-memory-mcp server extends your AI assistant with persistent memory capabilities: a fast hot cache plus semantic search over stored memories. Once installed, your AI can use its tools automatically.
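Under the stdio transport, your client launches the server as a subprocess and exchanges newline-delimited JSON-RPC messages with it, starting with an `initialize` request. Clients like Claude handle this handshake for you; the sketch below only illustrates the shape of that first message (the `initialize_request` helper, the client name, and the protocol-version string are illustrative assumptions):

```python
import json

def initialize_request(request_id: int = 1) -> str:
    """Build the JSON-RPC 'initialize' message an MCP client sends first
    over the stdio transport (one JSON object per line)."""
    msg = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            # Protocol revision and client identity are negotiated here;
            # the exact values below are placeholders for illustration.
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1"},
        },
    }
    return json.dumps(msg) + "\n"
```

After a successful handshake, the client lists the server's tools and can call them on the assistant's behalf.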