# chromadb-memory

Long-term memory via ChromaDB with local Ollama embeddings. Auto-recall injects relevant context every turn. No cloud APIs required — fully self-hosted.
Install via the ClawdBot CLI:

```sh
clawdbot install msensintaffar/chromadb-memory
```

Long-term semantic memory backed by ChromaDB and local Ollama embeddings. Zero cloud dependencies.
`chromadb_search` tool: manual semantic search over your ChromaDB collection.
Prerequisites: start a local ChromaDB server and pull the embedding model.

```sh
docker run -d --name chromadb -p 8100:8000 chromadb/chroma:latest
ollama pull nomic-embed-text
```
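Before installing the plugin, it can help to confirm both services answer HTTP requests. A minimal sketch, assuming the stock ChromaDB heartbeat path and the Ollama root endpoint (adjust if you mapped different ports):

```python
import urllib.request
import urllib.error

def reachable(url: str, timeout: float = 2.0) -> bool:
    """Return True if an HTTP GET to `url` gets any response at all."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # the server answered, even if with an error status
    except (urllib.error.URLError, OSError):
        return False  # connection refused, timeout, DNS failure, ...

# Endpoint paths are assumptions based on the default ChromaDB/Ollama APIs.
print("ChromaDB:", reachable("http://localhost:8100/api/v1/heartbeat"))
print("Ollama:  ", reachable("http://localhost:11434/"))
```

If either line prints `False`, fix the service before continuing; the plugin cannot embed or recall without both.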
1. Copy the plugin extension:

```sh
mkdir -p ~/.openclaw/extensions/chromadb-memory
cp {baseDir}/scripts/index.ts ~/.openclaw/extensions/chromadb-memory/
cp {baseDir}/scripts/openclaw.plugin.json ~/.openclaw/extensions/chromadb-memory/
```
2. Add the plugin to your OpenClaw config (`~/.openclaw/openclaw.json`):

```json
{
  "plugins": {
    "entries": {
      "chromadb-memory": {
        "enabled": true,
        "config": {
          "chromaUrl": "http://localhost:8100",
          "collectionName": "longterm_memory",
          "ollamaUrl": "http://localhost:11434",
          "embeddingModel": "nomic-embed-text",
          "autoRecall": true,
          "autoRecallResults": 3,
          "minScore": 0.5
        }
      }
    }
  }
}
```
3. Restart the gateway:

```sh
openclaw gateway restart
```
| Option | Default | Description |
|--------|---------|-------------|
| `chromaUrl` | `http://localhost:8100` | ChromaDB server URL |
| `collectionName` | `longterm_memory` | Collection name (auto-resolves to a UUID, survives reindexing) |
| `collectionId` | — | Collection UUID (optional fallback) |
| `ollamaUrl` | `http://localhost:11434` | Ollama API URL |
| `embeddingModel` | `nomic-embed-text` | Ollama embedding model |
| `autoRecall` | `true` | Auto-inject relevant memories each turn |
| `autoRecallResults` | `3` | Max auto-recall results per turn |
| `minScore` | `0.5` | Minimum similarity score (0–1) |
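The defaults in the table can be thought of as a base layer that user config overlays. A small sketch of that resolution step (the merge helper is illustrative, not the plugin's actual code; the key names come from the config shown above):

```python
# Defaults mirroring the option table above.
DEFAULTS = {
    "chromaUrl": "http://localhost:8100",
    "collectionName": "longterm_memory",
    "ollamaUrl": "http://localhost:11434",
    "embeddingModel": "nomic-embed-text",
    "autoRecall": True,
    "autoRecallResults": 3,
    "minScore": 0.5,
}

def resolve_config(user: dict) -> dict:
    """Overlay user-supplied options onto the defaults."""
    return {**DEFAULTS, **user}

# Only the keys you set are overridden; everything else keeps its default.
cfg = resolve_config({"minScore": 0.7, "autoRecallResults": 5})
```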
Memories scoring at or above `minScore` are injected into the agent's context each turn. Auto-recall adds roughly 275 tokens per turn in the worst case (3 results × ~300 characters, plus wrapper text). Against a 200K+ context window, this is negligible.
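The ~275-token figure can be reproduced with a rough 4-characters-per-token heuristic; both the heuristic and the ~50-token wrapper estimate are assumptions, not measurements:

```python
results = 3          # autoRecallResults default
chars_each = 300     # approximate length of one injected memory
wrapper_tokens = 50  # assumed overhead for the framing text

# Rough heuristic: ~4 characters per token for English prose.
memory_tokens = results * chars_each / 4  # 900 chars -> 225 tokens
total = memory_tokens + wrapper_tokens
print(total)  # 275.0
```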
Tuning:

- Too many irrelevant recalls: raise `minScore` to 0.6 or 0.7.
- Missing relevant memories: lower `minScore` to 0.4 and increase `autoRecallResults` to 5.
- Manual control only: set `autoRecall: false` and use the `chromadb_search` tool.

How it works:

```
User Message → Ollama (embed) → ChromaDB (query) → Context Injection
                                                         ↓
                                                   Agent Response
```
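The query-and-filter step in the pipeline above can be sketched as a small function. This assumes ChromaDB returns cosine distances (smaller is better) and that similarity is computed as `1 - distance`; the plugin's actual scoring may differ:

```python
def select_memories(results, min_score=0.5, max_results=3):
    """Pick which retrieved documents get injected into context.

    `results` is a list of (document, distance) pairs as a ChromaDB
    query might return them (smallest distance = best match).
    """
    # Assumed conversion: cosine distance -> similarity score in [0, 1].
    scored = [(doc, 1.0 - dist) for doc, dist in results]
    # Drop anything below minScore, then keep the top autoRecallResults.
    kept = [(doc, s) for doc, s in scored if s >= min_score]
    kept.sort(key=lambda pair: pair[1], reverse=True)
    return kept[:max_results]

hits = [("deploy notes", 0.2), ("grocery list", 0.8), ("api keys doc", 0.35)]
selected = select_memories(hits)  # keeps the two memories scoring >= 0.5
```

With the defaults, "grocery list" (similarity 0.2) is filtered out, which is exactly the behavior `minScore` tuning controls.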
No OpenAI. No cloud. Your memories stay on your hardware.
Generated Mar 1, 2026
Individuals can use this skill to automatically recall relevant notes, research, or past conversations when working on projects. It helps maintain context across sessions without manual searching, ideal for writers, researchers, or hobbyists organizing personal data locally.
Small businesses can deploy this skill to enhance AI support agents by automatically retrieving past customer interactions and product information from a local ChromaDB. This ensures privacy and reduces response time without relying on cloud services.
Researchers can index papers, notes, and datasets into ChromaDB, enabling the AI to auto-recall relevant sources during discussions or analysis. This supports literature reviews and hypothesis testing with full control over sensitive academic data.
Companies can use this skill to help employees query internal documents, policies, and meeting notes automatically. It improves information retrieval for onboarding, compliance checks, and project management while keeping data on-premises for security.
Writers can store character details, plot points, and world-building elements in ChromaDB, with the AI auto-recalling relevant context to maintain consistency across chapters. This aids in long-form storytelling without external dependencies.
Offer paid consulting, installation, and customization services for businesses wanting to integrate this skill into their local AI setups. Revenue comes from one-time fees or subscription-based support plans for maintenance and tuning.
Package this skill with pre-configured ChromaDB and Ollama setups as a self-hosted memory solution for industries like healthcare or legal that require data privacy. Sell as a licensed product with optional training and updates.
Provide the core skill for free to build a user base, then offer premium features like advanced analytics, multi-collection support, or integration with other tools. Monetize through tiered subscriptions for enterprises.
💬 Integration Tip
Ensure ChromaDB and Ollama are running and reachable at the configured URLs; start with the default settings and adjust `minScore` based on recall quality.