Project Memory for the AI Era

Your codebase has a story.
Lore remembers it.

Lore is an open-source, CLI-first Project Memory system designed specifically for engineers and AI Coding Agents (like Claude, Copilot, and Cursor).

It acts as the ultimate "glue" between human context and AI output by capturing architectural decisions, constraints, and known traps, and structurally injecting them straight into your AI's context window.

```bash
$ lore ui              # Local browser dashboard
$ lore prompt "auth"   # Context Compactor
```

The Onboarding Roadmap
(Getting Started)

Getting a project up and running with Lore takes less than 60 seconds.

Step 1: Install the Engine

Lore is published on the global NPM registry. Install it globally so you can use it in any repository:

```bash
$ npm install -g lore-memory
```

Step 2: Initialize your Project

Navigate to the root of your existing codebase and run lore init.

This instantly creates a hidden .lore/ database in your repository (which you will commit to Git so your entire team shares the brain), and it installs an optional "Architect-in-the-Loop" post-commit hook.

```bash
$ cd your-project
$ lore init
```

Step 3: Setup Autonomous AI Defense

To make your AI respect your project's rules completely unprompted, drop a file named CLAUDE.md (or .cursorrules) into the root of your repository with this exact rule:

"Before you modify, create, or explain any file in this codebase, you must ALWAYS execute the lore_why tool on that file path first to check if there are any architectural invariants, gotchas, or decisions you need to respect."

Finally, connect the MCP server to your AI agent's config to enable the tools natively.

~/.claude/settings.json

```json
{
  "mcpServers": {
    "lore": { "command": "lore", "args": ["serve"] }
  }
}
```

🧠 Using the Features
(Day-to-Day Workflow)

Here is exactly how you use Lore as your daily driver:

1. The Interactive Menu
$ lore

If you ever forget a command, simply type lore into your terminal and hit Enter. You will be greeted with a stunning interactive menu that guides you through every feature: no manual reading required!

2. Logging Knowledge
$ lore log

When you make a technical decision or discover a nasty bug (a "Gotcha"), log it instantly with lore log.

The CLI will beautifully prompt you to categorize the entry into one of four types:

🔴 Invariants

Rules that must never be broken (e.g. "Never console.log auth tokens").

βš–οΈ Decisions

Technical choices (e.g. "We chose Next.js over Vite").

⚠️ Gotchas

Traps that trigger bugs (e.g. "Stripe webhooks can fire twice").

🪦 Graveyard

Abandoned code (e.g. "We removed Redis in v2").

3. Passive Mining
$ lore watch --daemon

Software engineers hate writing wikis. With Lore, you don't have to.

Run lore watch --daemon in the background. As you code in your IDE, if you type // WARNING: or // HACK:, the Lore daemon intercepts your OS file save, reads your comment, and instantly drafts a beautiful Markdown rule in the background. Review them at the end of the sprint with lore drafts.

Lore Pending Drafts UI
Lore Knowledge Base UI
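Conceptually, the passive miner is just a filter over comments you already write. Here is a minimal, illustrative Python sketch of that idea; this is not Lore's actual implementation, and the marker set and draft fields are assumptions:

```python
import re

# Comment markers the watcher treats as draft-worthy (assumed set).
MARKERS = re.compile(r"//\s*(WARNING|HACK):\s*(.+)")

def mine_drafts(source: str, path: str) -> list[dict]:
    """Scan one file's text for WARNING:/HACK: comments and draft entries."""
    drafts = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        match = MARKERS.search(line)
        if match:
            kind, note = match.groups()
            drafts.append({
                "marker": kind,
                "note": note.strip(),
                "file": path,
                "line": lineno,
            })
    return drafts

code = """
function charge(user) {
  // WARNING: Stripe webhooks can fire twice
  return stripe.charge(user);
}
"""
print(mine_drafts(code, "src/billing.js"))  # one draft: the Stripe gotcha on line 3
```

The real daemon reacts to file-save events rather than rescanning text, but the core extraction step is this simple, which is why it can run deterministically with no LLM involved.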

4. The Local UI Dashboard
$ lore ui

Want to visually explore your project's brain?

Run lore ui to spin up a blazingly fast local web server on port 3333. You can explore the health of your project (the "Lore Score"), search your database, and view all your team's architectural decisions linked directly to the exact files they govern.

5. Managing Outdated Context
$ lore stale

Traditional wikis suffer from "Context Rot": they go out of date and people stop trusting them.

Because you link Lore rules directly to source code files (src/auth.js), if a developer refactors that file 6 months later, Lore compares the OS timestamps and automatically flags the rule as [Stale]. Run lore stale to instantly catch outdated documentation before it confuses your AI!
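The staleness check needs nothing more than two timestamps. A hedged sketch, assuming each rule records its linked file's modification time when it is logged (illustrative Python, not Lore's actual code):

```python
import os
import tempfile

def is_stale(rule_recorded_mtime: float, linked_file: str) -> bool:
    """A rule goes [Stale] once its linked file is modified after the rule was logged."""
    return os.path.getmtime(linked_file) > rule_recorded_mtime

# Demo with a throwaway file standing in for src/auth.js.
with tempfile.NamedTemporaryFile("w", suffix=".js", delete=False) as f:
    f.write("export function auth() {}")
    path = f.name

recorded = os.path.getmtime(path)   # timestamp stored alongside the rule
before = is_stale(recorded, path)   # False: file untouched since logging

# Simulate a refactor later on by bumping the file's mtime forward an hour.
os.utime(path, (recorded + 3600, recorded + 3600))
after = is_stale(recorded, path)    # True: this rule would be flagged
os.remove(path)

print(before, after)  # False True
```

Because the comparison is pure filesystem metadata, a staleness sweep over an entire repository is effectively instant.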

Lore Health Score UI

6. The Context Compactor
$ lore prompt

Don't use Claude Code? If you prefer ChatGPT in the browser, don't dump your entire repository into the chat (it wastes context and money).

Run lore prompt "Refactoring Auth" | pbcopy

Lore will use completely offline Semantic Vector Search (via Ollama) and mathematically compute the PageRank "Blast Radius" of your authentication code. It will instantly format a perfectly concise set of strict rules and drop them right onto your clipboard, ready to paste into ChatGPT!

```bash
$ lore prompt "Refactoring Auth" | pbcopy
→ Found 3 relevant entries via Blast Radius ranking
→ 2 invariants, 1 gotcha included
```

The system prompt is written to stdout, so you can pipe it to your clipboard or a file.
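Under the hood, "semantic search" reduces to comparing embedding vectors. Here is a toy sketch of just the ranking step in pure Python; in reality the embeddings come from an Ollama model (and the PageRank weighting is layered on top), whereas the vectors and entries below are entirely made up for illustration:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity: 1.0 means same direction, 0.0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional "embeddings" for three logged entries (made-up numbers).
entries = {
    "Invariant: never log auth tokens":       [0.9, 0.1, 0.0],
    "Gotcha: Stripe webhooks can fire twice": [0.1, 0.9, 0.2],
    "Decision: we chose Next.js over Vite":   [0.0, 0.2, 0.9],
}
query = [0.8, 0.2, 0.1]  # pretend embedding of "Refactoring Auth"

ranked = sorted(entries, key=lambda e: cosine(query, entries[e]), reverse=True)
print(ranked[0])  # the auth invariant ranks first
```

Only the top-ranked entries make it into the generated prompt, which is what keeps the output small enough to paste into a chat window without burning your context budget.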

  • 100% Local & Offline: No cloud APIs, no telemetry. Your code stays on your machine.
  • Zero Config Required: Run lore init and you're done. Zero YAML, zero env vars.
  • <5 min Setup Time: Install, init, and start capturing context in under 5 minutes.
  • 4 Knowledge Types: Decisions, Invariants, Gotchas, and Graveyard entries.

Without Lore vs With Lore

See the difference context makes in your AI coding sessions.

Without Lore
  • ✗ Every AI session starts from zero context
  • ✗ Claude suggests refactors that break invariants
  • ✗ Team re-debates the same decisions every quarter
  • ✗ New devs repeat mistakes from 6 months ago
  • ✗ Wiki dies after week one: nobody updates it
  • ✗ Context window wasted on irrelevant boilerplate
With Lore
  • ✓ AI reads your project's full decision history
  • ✓ Invariants are injected, so refactors respect constraints
  • ✓ Decisions are recorded once with full rationale
  • ✓ Gotchas surface before they bite again
  • ✓ Auto-capture means docs stay alive by default
  • ✓ Semantic-search context compactor saves tokens

How Lore Compares

Lore isn't a wiki. It's not a note-taking app. It's a structured, git-native memory layer.

| Feature | Lore | Notion | Confluence | README.md |
| --- | --- | --- | --- | --- |
| Git-versioned | ✓ | ✗ | ✗ | ✓ |
| Auto-captures from code | ✓ | ✗ | ✗ | ✗ |
| AI-native (MCP) | ✓ | ✗ | ✗ | ✗ |
| Staleness detection | ✓ | ✗ | ✗ | ✗ |
| Structured types | ✓ | ~ | ~ | ✗ |
| 100% offline | ✓ | ✗ | ✗ | ✓ |
| Free & open source | ✓ | ✗ | ✗ | ✓ |

Works With Your Stack

Lore integrates seamlessly with the tools you already use.

Claude Code
Cursor
VS Code
Terminal / CLI
GitHub
SQLite

🔒 100% Privacy-First

Your codebase's intelligence stays entirely on your machine.

☁️ Zero Cloud APIs

No server connection required. Your codebase never leaves your machine. Local Ollama embeddings execute the Context Compactor entirely offline.

πŸ™ Git Native

Your Lore database (.lore/) is just Markdown and JSON. It lives entirely inside your repository and syncs securely via your existing Git provider alongside your code.

⚡ No LLM Execution Required

For 90% of the daily workflow (logging, passive mining, searching), Lore requires absolutely zero LLM execution. It is fundamentally a deterministic, high-speed CLI tool.

Ready to give your codebase
a memory?

Start capturing your team's institutional knowledge in under 5 minutes. Free, open source, and runs entirely on your machine.