Installation

Install Lore globally via npm, or build from source.

Prerequisites

Optional: Ollama (for semantic search)

For full semantic vector search in the lore prompt and lore search commands, install Ollama and pull the embedding model:

$ ollama pull nomic-embed-text
💡 Note
Without Ollama, Lore falls back to basic keyword matching. All other features work without it.
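The difference between the two modes can be sketched in a few lines of Python. This is illustrative only, not Lore's actual implementation: the keyword fallback is plain token overlap, while semantic search compares embedding vectors by cosine similarity (the vectors below are hand-made toys; real ones would come from nomic-embed-text via Ollama).

```python
import math

def keyword_score(query, doc):
    # Fallback mode: fraction of query tokens that appear in the document.
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / len(q)

def cosine(a, b):
    # Semantic mode: similarity between embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# "car" and "automobile" share no keywords, so the fallback scores 0.0...
print(keyword_score("fast car", "a speedy automobile"))
# ...but their embeddings (hand-made here) can still be nearly identical.
print(cosine([0.9, 0.1, 0.3], [0.8, 0.2, 0.35]))
```

This is why the two commands degrade gracefully: without embeddings, results are limited to literal word matches.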

Install via npm

$ npm install -g lore-memory

This makes the lore command available globally in your terminal.

Install from source

$ git clone https://github.com/Tjindl/Lore
$ cd Lore
$ npm install
$ npm link

The npm link command symlinks the lore binary into a directory on your PATH.

Verify installation

$ lore --version
0.4.0

Initialize a project

$ cd your-project
$ lore init

This creates the .lore/ directory structure:

.lore/
├── index.json        # Entry index and file mappings
├── entries/          # Individual entry JSON files
├── drafts/           # Auto-captured draft entries
├── embeddings.db     # SQLite vector database (if Ollama is running)
└── graph.json        # Module dependency graph
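To see how the pieces relate, here is a small Python sketch that builds and then walks a .lore/-style tree: entries live as individual JSON files under entries/, and index.json points at them. The field names (id, title, tags) are hypothetical, chosen for illustration; inspect your own .lore/ files for the real schema.

```python
import json
import pathlib
import tempfile

# Build a throwaway .lore/-style tree. Field names here are hypothetical,
# NOT Lore's actual schema -- check your own .lore/ files for the real one.
root = pathlib.Path(tempfile.mkdtemp()) / ".lore"
(root / "entries").mkdir(parents=True)

entry = {"id": "0001", "title": "Chose SQLite for embeddings", "tags": ["storage"]}
(root / "entries" / "0001.json").write_text(json.dumps(entry, indent=2))
(root / "index.json").write_text(json.dumps({"entries": ["entries/0001.json"]}))

# Walk the index back to each entry, the way a reader of .lore/ might.
index = json.loads((root / "index.json").read_text())
for rel in index["entries"]:
    print(json.loads((root / rel).read_text())["title"])
```

Because everything is plain JSON on disk, the directory is easy to diff, grep, and commit alongside your code.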

It also installs a Git post-commit hook that prompts you to log a decision when a commit changes a large amount of code (more than 50 lines).
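The hook's trigger condition can be sketched in shell. This is not Lore's actual hook, just a plausible sketch of a ">50 changed lines" check built on git diff --shortstat; the hook that lore init installs may work differently.

```shell
#!/bin/sh
# Sketch of a post-commit size check (illustrative, not Lore's real hook).
THRESHOLD=50

# Sum insertions + deletions out of a `git diff --shortstat` summary line,
# e.g. " 3 files changed, 120 insertions(+), 8 deletions(-)".
count_changed() {
  ins=$(printf '%s' "$1" | grep -Eo '[0-9]+ insertion' | grep -Eo '[0-9]+')
  del=$(printf '%s' "$1" | grep -Eo '[0-9]+ deletion' | grep -Eo '[0-9]+')
  echo $(( ${ins:-0} + ${del:-0} ))
}

stat=$(git diff --shortstat HEAD~1 HEAD 2>/dev/null || true)
changed=$(count_changed "$stat")
if [ "$changed" -gt "$THRESHOLD" ]; then
  echo "Large commit ($changed lines changed): consider logging the decision with lore."
fi
```

A post-commit hook like this lives at .git/hooks/post-commit and runs after every commit; keeping the check cheap (one git diff) means it adds no noticeable latency.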