Basic Memory
Basic Memory lets you build persistent knowledge through natural conversations with Large Language Models (LLMs) like Claude, while keeping everything in simple Markdown files on your computer. It uses the Model Context Protocol (MCP) to enable any compatible LLM to read and write to your local knowledge base.
Updated 8/8/2025
Key Features
What makes this MCP server special
With Basic Memory:
AI assistants can load context from local files in a new conversation
Notes are saved locally as Markdown files in real time
No project knowledge or special prompting is required

Where other approaches fall short:
Chat histories capture conversations but aren't structured knowledge
RAG systems can query documents but don't let LLMs write back
Vector databases require complex setups and often live in the cloud
Knowledge graphs typically need specialized tools to maintain

How Basic Memory is different:
Local-first: All knowledge stays in files you control
Bi-directional: Both you and the LLM read and write to the same files
Structured yet simple: Uses familiar Markdown with semantic patterns (see the sketch after this list)
Traversable knowledge graph: LLMs can follow links between topics
Standard formats: Works with existing editors like Obsidian
Lightweight infrastructure: Just local files indexed in a local SQLite database
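To make "Markdown with semantic patterns" concrete, here is a rough sketch of the kind of note Basic Memory reads and writes. Treat the details as illustrative assumptions: the frontmatter fields, the [category] observation label, the relation name, and the [[wiki-link]] target are made up for this example, and the project's documentation defines the exact conventions.

---
title: Coffee Brewing
type: note
tags: [coffee]
---

## Observations
- [technique] Pour-over gives a cleaner cup than French press

## Relations
- relates_to [[Coffee Bean Origins]]

A note like this stays ordinary Markdown, so it remains editable in tools such as Obsidian, while the structured observations and wiki-style links are what the local SQLite index turns into a traversable knowledge graph.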
Requirements
uvx (included with the uv Python package manager)
Server Information
Status: Active
Version: 1.0.0
License: GNU Affero General Public License v3.0
Author: https://github.com/basicmachines-co
Installation Guide
Get started with this MCP server in minutes
1. Install the Package
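The configuration below launches the server with uvx, which downloads and runs the basic-memory package on demand, so a separate install step is optional. What you do need is uvx itself, which ships with the uv Python package manager (for example via pip install uv, or the installer described at https://docs.astral.sh/uv/). If you prefer a persistent install of the server, a command along the lines of uv tool install basic-memory should work, though treat that as an assumption and check the project README for the recommended method.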
2. Configuration
{
  "mcp": {
    "servers": {
      "basic-memory": {
        "command": "uvx",
        "args": [
          "basic-memory",
          "mcp"
        ]
      }
    }
  }
}
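The snippet above nests the server under an mcp / servers key, which matches how some clients store MCP settings. Other clients, such as Claude Desktop, expect a top-level mcpServers object in their configuration file instead; if yours does, the equivalent entry would look roughly like this (same command, different wrapper):

{
  "mcpServers": {
    "basic-memory": {
      "command": "uvx",
      "args": [
        "basic-memory",
        "mcp"
      ]
    }
  }
}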
3. Usage Example
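Once the server is configured, usage happens in plain conversation. An illustrative exchange (not taken from the project documentation): in one chat you might say, "Take a note on the coffee brewing setup we just discussed," and Basic Memory writes a Markdown note into your local knowledge base; in a later, fresh conversation you can ask, "What do my notes say about coffee brewing?" and the assistant loads that context back from the same files.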