# Integrations
Connect Lore to your AI coding tools via the Model Context Protocol (MCP).
## What is MCP?
The Model Context Protocol (MCP) is a standard for giving AI assistants access to external tools and data sources. Lore ships with a built-in MCP server that exposes your project's knowledge base to AI editors.
When connected, the AI can automatically query your project's decisions, invariants, and gotchas before writing or modifying code.
## Claude Code

Add Lore to your Claude Code MCP configuration — either the project-scoped `.mcp.json` at your repository root or your user-level `~/.claude.json`:
```json
{
  "mcpServers": {
    "lore": {
      "command": "lore",
      "args": ["serve"]
    }
  }
}
```
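If your Claude Code release includes the `claude mcp` subcommand, you can also register the server from the terminal instead of editing the config file by hand (a sketch, assuming a recent Claude Code CLI):

```shell
# Register the Lore MCP server; everything after -- is the launch command
claude mcp add lore -- lore serve

# Confirm Claude Code picked it up
claude mcp list
```
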
Once configured, Claude Code will have access to the following Lore tools:
### Available MCP Tools
| Tool | Description |
|---|---|
| `lore_overview` | Returns a full summary of the project's knowledge base: entry counts, types, and key entries |
| `lore_why` | Given a file path, returns all Lore entries linked to that file. Claude calls this automatically when you ask it to work on a file. |
| `lore_search` | Semantic search across all entries using a natural-language query |
| `lore_log` | Allows Claude to log new Lore entries directly from the conversation |
| `lore_drafts` | Review and manage auto-captured draft entries |
| `lore_stale` | Check for stale entries that may need updating |
When you ask Claude to modify a file, it calls `lore_why` to check for invariants and decisions linked to that file. This prevents it from suggesting refactors that break established constraints.
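On the wire, these tools are invoked through standard MCP `tools/call` JSON-RPC requests. A minimal sketch of such a request — the `path` argument name and the file path shown are hypothetical; check the schema the server advertises via `tools/list`:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "lore_why",
    "arguments": { "path": "src/auth/session.ts" }
  }
}
```
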
### 🛡️ Autonomous Zero-Prompt Enforcement

To make Claude respect your codebase rules without being asked each time, drop a `CLAUDE.md` (or `.cursorrules` / `.clinerules`) file in the root of your project with this instruction:
> "Before you modify, create, or explain any file in this codebase, you must ALWAYS execute the `lore_why` tool on that file path first to check if there are any architectural invariants, gotchas, or decisions you need to respect."
Once added, the AI will autonomously query Lore before it writes a single line of code, sharply reducing the chance that it violates your team's architectural constraints.
## Cursor
Cursor supports MCP servers through its settings. Add the same config:
- Open Cursor Settings → MCP
- Add a new MCP server with:

```json
{
  "command": "lore",
  "args": ["serve"]
}
```
Cursor will now have access to the same 6 MCP tools listed above.
## VS Code
VS Code supports MCP through extensions like Claude Code for VS Code. Once you have the extension installed, configure Lore as an MCP server in your workspace settings.
## Manual / Terminal Usage
If your AI tool doesn't support MCP, you can still use Lore via the CLI:
```shell
# Generate a context prompt and pipe to clipboard
$ lore prompt "auth migration" | pbcopy

# Export all entries to a CLAUDE.md file
$ lore export
```
The `lore export` command generates a `CLAUDE.md` file at your project root containing all entries in a format optimized for AI consumption. This file can be included in any AI tool's context window.
## The MCP Server
The MCP server communicates over stdio (standard input/output). You can test it directly:
```shell
# Start the MCP server
$ lore serve

# Quiet mode (suppresses startup messages)
$ lore serve --quiet
```
The server uses the `@modelcontextprotocol/sdk` npm package and implements the standard MCP tool protocol.
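Because the transport is plain stdio, you can smoke-test the server with a hand-written JSON-RPC handshake. A minimal sketch, assuming the server follows the standard MCP initialize sequence (the `protocolVersion` value may need to match your SDK release):

```shell
# Send an initialize request, the initialized notification, and a tools/list
# request over stdin, then inspect the JSON responses the server prints to stdout.
{
  printf '%s\n' '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}'
  printf '%s\n' '{"jsonrpc":"2.0","method":"notifications/initialized"}'
  printf '%s\n' '{"jsonrpc":"2.0","id":2,"method":"tools/list","params":{}}'
} | lore serve --quiet
```

The `tools/list` response should enumerate the six Lore tools described above.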