MCP Server

For Claude Desktop, Cursor, Windsurf, and any MCP-compatible application. The LLM sees memory tools and can choose to use them.

This is the only integration that is NOT fully automatic -- the LLM must decide to call memory_recall or memory_store. In practice, however, well-prompted LLMs will use these tools naturally.
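For example, a short piece of system-prompt guidance is often enough to make the model use the tools consistently. The wording below is illustrative, not taken from the repo:

```
You have persistent memory tools. Before answering, call memory_recall
with a short query about the topic. After learning something new about
the user or the task, call memory_store with a one-sentence summary.
```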

Tools Exposed

| Tool | Description |
| --- | --- |
| memory_recall(query) | Search memory for relevant information |
| memory_store(content) | Store new information in memory |
| memory_recall_with_confidence(query) | Search with metamemory confidence levels |
| memory_status() | Check memory system health |
| memory_consolidate() | Trigger episodic-to-semantic consolidation |
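Conceptually, the tool surface above is a dispatch table mapping tool names to handlers. The sketch below is a minimal stand-in, not the repository's actual implementation: the in-memory list substitutes for the real cognitive memory backend, and the handler bodies are hypothetical.

```python
# Hypothetical sketch of the memory tool surface.
# MEMORY stands in for the real persistent memory store.
from typing import Callable

MEMORY: list[str] = []

def memory_store(content: str) -> str:
    """Store new information in memory."""
    MEMORY.append(content)
    return f"stored: {content}"

def memory_recall(query: str) -> list[str]:
    """Search memory for entries mentioning the query."""
    return [m for m in MEMORY if query.lower() in m.lower()]

def memory_status() -> dict:
    """Report basic memory system health."""
    return {"entries": len(MEMORY), "healthy": True}

# Dispatch table the MCP server would expose as tools.
TOOLS: dict[str, Callable] = {
    "memory_store": memory_store,
    "memory_recall": memory_recall,
    "memory_status": memory_status,
}

def call_tool(name: str, **kwargs):
    """Route a tool call by name, as an MCP client would."""
    return TOOLS[name](**kwargs)
```

An MCP server wraps exactly this kind of mapping in the protocol's tool-listing and tool-call messages, so the LLM can discover the names and descriptions and choose when to invoke them.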

Configuration

Configure in Claude Desktop (claude_desktop_config.json):

{
  "mcpServers": {
    "cognitive-memory": {
      "command": "python",
      "args": ["-m", "integrations.mcp.server"],
      "cwd": "/path/to/cognitive-memory-model",
      "env": {}
    }
  }
}

Replace /path/to/cognitive-memory-model with the actual path to the repo.
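Cursor can use an equivalent mcpServers block. As an assumption (check Cursor's current MCP documentation), it reads project-level config from .cursor/mcp.json, with the same shape as above:

```
{
  "mcpServers": {
    "cognitive-memory": {
      "command": "python",
      "args": ["-m", "integrations.mcp.server"],
      "cwd": "/path/to/cognitive-memory-model"
    }
  }
}
```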

When to Use MCP

Use MCP when your environment does not support hooks or API wrapping -- for example, Claude Desktop without hooks, or Cursor. The tradeoff is that the LLM must actively choose to remember and retrieve, rather than having it happen automatically.