openclaw-memory-qdrant
Local semantic memory with Qdrant and Transformers.js. Store, search, and recall conversation context using vector embeddings (fully local, no API keys).
Install via ClawdBot CLI:
clawdbot install zuiho-kai/openclaw-memory-qdrant
Use when you need your OpenClaw agent to remember and recall information across conversations using semantic search.
⚠️ Privacy Notice: The optional autoCapture feature (disabled by default) can capture PII like emails and phone numbers if you enable allowPIICapture. Only enable if you understand the privacy implications.
Local semantic memory plugin powered by Qdrant vector database and Transformers.js embeddings. Zero configuration, fully local, no API keys required.
clawhub install memory-qdrant
First-time setup: This plugin downloads a 25MB embedding model from Hugging Face on first run and may require build tools for native dependencies (sharp, onnxruntime). See README for detailed installation requirements.
Enable in your OpenClaw config:
{
"plugins": {
"memory-qdrant": {
"enabled": true
}
}
}
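The minimal config above can be expanded with the plugin's options, shown here at their documented defaults. This is a sketch: the option names come from the list below, and the nesting is assumed to mirror the minimal example.

```json
{
  "plugins": {
    "memory-qdrant": {
      "enabled": true,
      "persistToDisk": true,
      "storagePath": "",
      "autoCapture": false,
      "allowPIICapture": false,
      "autoRecall": true,
      "qdrantUrl": ""
    }
  }
}
```

Empty strings for storagePath and qdrantUrl keep the defaults: storage in ~/.openclaw-memory/ and an in-memory Qdrant instance.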
Options:
persistToDisk (default: true) - Save memories to disk in memory mode. Data stored in ~/.openclaw-memory/ survives restarts. Set to false for volatile memory.
storagePath (optional) - Custom storage directory. Leave empty for the default ~/.openclaw-memory/.
autoCapture (default: false) - Auto-record conversations. Privacy protection is enabled by default: text containing PII (emails, phone numbers) is automatically skipped.
allowPIICapture (default: false) - Allow capturing PII when autoCapture is enabled. Only enable if you understand the privacy implications.
autoRecall (default: true) - Auto-inject relevant memories.
qdrantUrl (optional) - External Qdrant server URL. Leave empty for in-memory mode.
Three tools available:
memory_store - Save information
memory_store({
text: "User prefers Opus for complex tasks",
category: "preference"
})
memory_search - Find relevant memories
memory_search({
query: "workflow preferences",
limit: 5
})
memory_forget - Delete memories
memory_forget({ memoryId: "uuid" })
// or
memory_forget({ query: "text to forget" })
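Under the hood, memory_search is a nearest-neighbor lookup over embedding vectors. The toy sketch below illustrates that ranking step with cosine similarity. It is illustrative only: the plugin uses Qdrant and real Transformers.js embeddings, whereas here the vectors are hand-picked 3-dimensional stand-ins.

```javascript
// Toy in-memory semantic search: rank stored texts by cosine similarity.
// Real embeddings come from a Transformers.js model; these tiny vectors
// are fake and exist only to show how results are ordered.
function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

const memories = [
  { text: "User prefers Opus for complex tasks", vec: [0.9, 0.1, 0.0] },
  { text: "User's project uses TypeScript",      vec: [0.1, 0.8, 0.2] },
];

// Score every memory against the query vector, highest similarity first.
function search(queryVec, limit = 5) {
  return memories
    .map(m => ({ text: m.text, score: cosine(queryVec, m.vec) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, limit);
}

const results = search([0.85, 0.15, 0.05], 1);
console.log(results[0].text); // → "User prefers Opus for complex tasks"
```

Qdrant performs the same ranking at scale with indexed approximate nearest-neighbor search rather than a linear scan.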
Generated Mar 1, 2026
An AI support agent uses this skill to remember past customer interactions and preferences, enabling personalized responses without external APIs. It can recall specific issues or preferences from previous conversations, improving resolution times and customer satisfaction in a privacy-conscious manner.
A tutoring AI leverages this skill to store and retrieve student learning progress, mistakes, and topic preferences across sessions. This allows for adaptive lesson planning and targeted feedback, enhancing learning outcomes while keeping data locally stored for privacy.
A healthcare assistant uses this skill to semantically search and recall patient-reported symptoms and history over time, aiding in trend analysis and follow-up care. With PII protection enabled by default, it ensures sensitive health data remains secure and locally managed.
A legal AI employs this skill to store case notes, precedents, and client details, enabling quick semantic recall during research or drafting. The local vector database ensures confidential information stays on-premises, complying with data privacy regulations.
A writing assistant uses this skill to remember plot points, character traits, and user preferences across writing sessions, providing consistent suggestions and context. The in-memory mode allows for volatile brainstorming without persistent storage if desired.
Offer this skill as part of a premium subscription for AI agents, targeting businesses needing local, privacy-focused memory capabilities. Revenue comes from monthly or annual fees, with tiers based on storage limits or advanced features like external Qdrant server integration.
Provide consulting services to integrate this skill into custom AI solutions for industries like healthcare or legal, where data privacy is critical. Revenue is generated through project-based fees for setup, configuration, and ongoing support tailored to client needs.
Monetize by offering hosted Qdrant server instances or premium support for this open-source skill, catering to users who prefer managed services over self-hosting. Revenue streams include hosting fees, priority support packages, and customization services.
💬 Integration Tip
Ensure Node.js and npm are installed and allow disk space for the ~25MB embedding model; test in in-memory mode first to rule out persistence issues before enabling disk storage.
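For that first volatile test, persistence can be switched off with the persistToDisk option (nesting assumed to match the config example above); nothing is written to disk and all memories are cleared on restart:

```json
{
  "plugins": {
    "memory-qdrant": {
      "enabled": true,
      "persistToDisk": false
    }
  }
}
```

Once store and search behave as expected, restore persistToDisk to its default of true.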