nima-core — Noosphere Integrated Memory Architecture
A complete cognitive stack for AI agents: persistent memory, emotional intelligence, dream consolidation, hive mind, and precognitive recall.
Website: https://nima-core.ai · GitHub: https://github.com/lilubot/nima-core

Install via ClawdBot CLI:
clawdbot install dmdorta1111/nima-core
pip install nima-core && nima-core
Your bot now has persistent memory. Zero config needed.
NIMA evolved from a memory plugin into a full cognitive architecture:
| Module | What It Does | Version |
|--------|-------------|---------|
| Memory Capture | 3-layer capture (input/contemplation/output), 4-phase noise filtering | v2.0 |
| Semantic Recall | Vector + text hybrid search, ecology scoring, token-budgeted injection | v2.0 |
| Dynamic Affect | Panksepp 7-affect emotional state (SEEKING, RAGE, FEAR, LUST, CARE, PANIC, PLAY) | v2.1 |
| VADER Analyzer | Contextual sentiment — caps boost, negation, idioms, degree modifiers | v2.2 |
| Memory Pruner | LLM distillation of old conversations → semantic gists, 30-day suppression limbo | v2.3 |
| Dream Consolidation | Nightly synthesis — extracts insights and patterns from episodic memory | v2.4 |
| Hive Mind | Multi-agent memory sharing via shared DB + optional Redis pub/sub | v2.5 |
| Precognition | Temporal pattern mining → predictive memory pre-loading | v2.5 |
| Lucid Moments | Spontaneous surfacing of emotionally-resonant memories | v2.5 |
| Darwinian Memory | Clusters similar memories, ghosts duplicates via cosine + LLM verification | v3.0 |
| Installer | One-command setup — LadybugDB, hooks, directories, embedder config | v3.0 |
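The Darwinian Memory dedup step in the table can be sketched in a few lines. This is an illustrative pre-filter only: the threshold (0.92), the function names, and the memory dict shape are assumptions, and the real pipeline additionally verifies candidates with an LLM before ghosting a copy.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def find_duplicate_candidates(memories, threshold=0.92):
    """Pair up memories whose embeddings are near-identical.

    Only the cheap cosine pre-filter; candidates above the threshold
    would then go to LLM verification before one copy is ghosted.
    """
    pairs = []
    for i in range(len(memories)):
        for j in range(i + 1, len(memories)):
            sim = cosine(memories[i]["vec"], memories[j]["vec"])
            if sim >= threshold:
                pairs.append((memories[i]["id"], memories[j]["id"], sim))
    return pairs
```

The O(n²) scan here is fine for a sketch; clustering first (as the table says NIMA does) keeps the comparison count manageable at scale.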
One-command installer (install.sh) for zero-friction setup. Current release: 3.0.4.

OPENCLAW HOOKS
├── nima-memory/            Capture hook (3-layer, 4-phase noise filter)
│   ├── index.js            Hook entry point
│   ├── ladybug_store.py    LadybugDB storage backend
│   ├── embeddings.py       Multi-provider embedding (Voyage/OpenAI/Ollama/local)
│   ├── backfill.py         Historical transcript import
│   └── health_check.py     DB integrity checks
├── nima-recall-live/       Recall hook (before_agent_start)
│   ├── lazy_recall.py      Current recall engine
│   └── ladybug_recall.py   LadybugDB-native recall
├── nima-affect/            Affect hook (message_received)
│   ├── vader-affect.js     VADER sentiment analyzer
│   └── emotion-lexicon.js  Emotion keyword lexicon
└── shared/                 Resilient wrappers, error handling
PYTHON CORE (nima_core/)
├── cognition/
│   ├── dynamic_affect.py         Panksepp 7-affect system
│   ├── emotion_detection.py      Text emotion extraction
│   ├── affect_correlation.py     Cross-affect analysis
│   ├── affect_history.py         Temporal affect tracking
│   ├── affect_interactions.py    Affect coupling dynamics
│   ├── archetypes.py             Personality baselines (Guardian, Explorer, etc.)
│   ├── personality_profiles.py   JSON personality configs
│   └── response_modulator_v2.py  Affect → response modulation
├── dream_consolidation.py        Nightly memory synthesis engine
├── memory_pruner.py              Episodic distillation + suppression
├── hive_mind.py                  Multi-agent memory sharing
├── precognition.py               Temporal pattern mining
├── lucid_moments.py              Spontaneous memory surfacing
├── connection_pool.py            SQLite pool (WAL, thread-safe)
├── logging_config.py             Singleton logger
└── metrics.py                    Thread-safe counters/timings
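To illustrate what connection_pool.py covers, here is a minimal thread-safe SQLite pool with WAL enabled, using only the standard library. The class and method names are illustrative, not NIMA's actual API.

```python
import queue
import sqlite3

class SQLitePool:
    """Minimal thread-safe SQLite connection pool with WAL enabled."""

    def __init__(self, db_path, size=4):
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            conn = sqlite3.connect(db_path, check_same_thread=False)
            conn.execute("PRAGMA journal_mode=WAL")   # readers don't block the writer
            conn.execute("PRAGMA busy_timeout=5000")  # wait on locks instead of failing
            self._pool.put(conn)

    def execute(self, sql, params=()):
        conn = self._pool.get()          # borrow a connection (blocks if all busy)
        try:
            cur = conn.execute(sql, params)
            rows = cur.fetchall()
            conn.commit()
            return rows
        finally:
            self._pool.put(conn)         # always return it to the pool
```

WAL mode is what makes concurrent capture (writes) and recall (reads) coexist without "database is locked" errors.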
All data is stored locally under ~/.nima/. Network access is opt-in:

| Feature | Env Var | Network Calls To | Default |
|---------|----------|------------------|---------|
| Cloud embeddings | NIMA_EMBEDDER=voyage | voyage.ai | Off |
| Cloud embeddings | NIMA_EMBEDDER=openai | openai.com | Off |
| Memory pruner | ANTHROPIC_API_KEY set | anthropic.com | Off |
| Ollama embeddings | NIMA_EMBEDDER=ollama | localhost:11434 | Off |
| HiveMind | HIVE_ENABLED=true | Redis pub/sub | Off |
| Precognition | Using external LLM | Configured endpoint | Off |
| Component | Location | Purpose |
|-----------|----------|---------|
| Python core (nima_core/) | ~/.nima/ | Memory, affect, cognition |
| OpenClaw hooks | ~/.openclaw/extensions/nima-*/ | Capture, recall, affect |
| SQLite database | ~/.nima/memory/graph.sqlite | Persistent storage |
| Logs | ~/.nima/logs/ | Debug logs (optional) |
| Env Var | Required? | Network Calls? | Purpose |
|---------|-----------|----------------|---------|
| NIMA_EMBEDDER=local | No | ❌ | Default — offline embeddings |
| VOYAGE_API_KEY | Only if using Voyage | ✅ voyage.ai | Cloud embeddings |
| OPENAI_API_KEY | Only if using OpenAI | ✅ openai.com | Cloud embeddings |
| ANTHROPIC_API_KEY | Only if using pruner | ✅ anthropic.com | Memory distillation |
| NIMA_OLLAMA_MODEL | Only if using Ollama | ❌ (localhost) | Local GPU embeddings |
Recommendation: Start with NIMA_EMBEDDER=local (default). Only enable cloud providers when you need better embedding quality.
Security checklist:
- Review install.sh and hook files before running
- Back up ~/.openclaw/openclaw.json before adding hooks
- Monitor ~/.nima/logs/ for suspicious activity

~/.nima/
├── memory/
│   ├── graph.sqlite         # SQLite backend (default)
│   ├── ladybug.lbug         # LadybugDB backend (optional)
│   ├── embedding_cache.db   # Cached embeddings
│   └── embedding_index.npy  # Vector index
├── affect/
│   └── affect_state.json    # Current emotional state
└── logs/                    # Debug logs (if enabled)

~/.openclaw/extensions/
├── nima-memory/        # Capture hook
├── nima-recall-live/   # Recall hook
└── nima-affect/        # Affect hook
Controls:
{
"plugins": {
"entries": {
"nima-memory": {
"skip_subagents": true,
"skip_heartbeats": true,
"noise_filtering": { "filter_system_noise": true }
}
}
}
}
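A minimal sketch of the kind of filtering these controls gate. The heuristics here (heartbeats, trivially short lines, exact repeats) and the message shape are illustrative assumptions; the real 4-phase filter is configured via the JSON above.

```python
def filter_system_noise(messages):
    """Drop obvious non-memories before storage (illustrative heuristics)."""
    seen = set()
    kept = []
    for msg in messages:
        text = msg.get("text", "").strip()
        if not text or len(text) < 8:
            continue                              # too short to be worth remembering
        if msg.get("type") in {"heartbeat", "system"}:
            continue                              # agent plumbing, not conversation
        if text in seen:
            continue                              # exact duplicate
        seen.add(text)
        kept.append(msg)
    return kept
```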
| Provider | Setup | Dims | Cost |
|----------|-------|------|------|
| Local (default) | NIMA_EMBEDDER=local | 384 | Free |
| Voyage AI | NIMA_EMBEDDER=voyage + VOYAGE_API_KEY | 1024 | $0.12/1M tok |
| OpenAI | NIMA_EMBEDDER=openai + OPENAI_API_KEY | 1536 | $0.13/1M tok |
| Ollama | NIMA_EMBEDDER=ollama + NIMA_OLLAMA_MODEL | 768 | Free |
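Provider selection follows the NIMA_EMBEDDER env var from the tables above. A hedged sketch of the dispatch (function name and return shape are assumptions; a real implementation would also verify the matching API key and construct the client):

```python
import os

# Provider -> embedding dimension, per the table above.
EMBEDDER_DIMS = {
    "local": 384,    # offline model, no network
    "voyage": 1024,
    "openai": 1536,
    "ollama": 768,
}

def pick_embedder():
    """Resolve the active provider from NIMA_EMBEDDER (default: local)."""
    provider = os.environ.get("NIMA_EMBEDDER", "local").lower()
    if provider not in EMBEDDER_DIMS:
        raise ValueError(f"Unknown NIMA_EMBEDDER: {provider!r}")
    return provider, EMBEDDER_DIMS[provider]
```

Note that the dimension differs per provider, which is why switching embedders requires re-indexing existing vectors.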
| | SQLite (default) | LadybugDB (recommended) |
|--|-----------------|------------------------|
| Text Search | 31ms | 9ms (3.4x faster) |
| Vector Search | External | Native HNSW (18ms) |
| Graph Queries | SQL JOINs | Native Cypher |
| DB Size | ~91 MB | ~50 MB (44% smaller) |
Upgrade: pip install real-ladybug && python -c "from nima_core.storage import migrate; migrate()"
# Embedding (default: local)
NIMA_EMBEDDER=local|voyage|openai|ollama
VOYAGE_API_KEY=pa-xxx
OPENAI_API_KEY=sk-xxx
NIMA_OLLAMA_MODEL=nomic-embed-text
# Data paths
NIMA_DATA_DIR=~/.nima
NIMA_DB_PATH=~/.nima/memory/ladybug.lbug
# Memory pruner
NIMA_DISTILL_MODEL=claude-haiku-4-5
ANTHROPIC_API_KEY=sk-ant-xxx
# Logging
NIMA_LOG_LEVEL=INFO
NIMA_DEBUG_RECALL=1
| Hook | Fires | Does |
|------|-------|------|
| nima-memory | After save | Captures 3 layers → filters noise → stores in graph DB |
| nima-recall-live | Before LLM | Searches memories → scores by ecology → injects as context (3000-token budget) |
| nima-affect | On message | VADER sentiment → Panksepp 7-affect state → archetype modulation |
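The recall hook's final step (packing scored memories under the 3000-token budget) can be sketched greedily. The ~4-chars-per-token estimate and the function name are assumptions, not NIMA's actual tokenizer or API:

```python
def inject_memories(scored_memories, token_budget=3000):
    """Greedy token-budgeted context injection (sketch)."""
    def est_tokens(text):
        return max(1, len(text) // 4)  # rough heuristic, not a real tokenizer

    picked, used = [], 0
    # Highest ecology score first, so the best memories claim budget.
    for mem in sorted(scored_memories, key=lambda m: m["score"], reverse=True):
        cost = est_tokens(mem["text"])
        if used + cost > token_budget:
            continue  # skip memories that would blow the budget
        picked.append(mem["text"])
        used += cost
    return "\n".join(picked)
```

Skipping oversized memories (rather than stopping) lets a cheap lower-ranked memory still fill leftover budget.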
./install.sh
openclaw gateway restart
Or manual:
cp -r openclaw_hooks/nima-memory ~/.openclaw/extensions/
cp -r openclaw_hooks/nima-recall-live ~/.openclaw/extensions/
cp -r openclaw_hooks/nima-affect ~/.openclaw/extensions/
Nightly synthesis extracts insights and patterns from episodic memory:
python -m nima_core.dream_consolidation
# Or schedule via OpenClaw cron at 2 AM
Distills old conversations into semantic gists, suppresses raw noise:
python -m nima_core.memory_pruner --min-age 14 --live
python -m nima_core.memory_pruner --restore 12345 # undo within 30 days
Multi-agent memory sharing:
from nima_core import HiveMind
hive = HiveMind(db_path="~/.nima/memory/ladybug.lbug")
context = hive.build_agent_context("research task", max_memories=8)
hive.capture_agent_result("agent-1", "result summary", "model-name")
Temporal pattern mining → predictive memory pre-loading:
from nima_core import NimaPrecognition
precog = NimaPrecognition(db_path="~/.nima/memory/ladybug.lbug")
precog.run_mining_cycle()
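As a toy version of what a mining cycle might extract: hour-of-day topic frequency is the simplest temporal signal that supports pre-loading. NIMA's actual mining is presumably richer; the names and memory shape below are illustrative.

```python
from collections import Counter
from datetime import datetime

def mine_hourly_patterns(memories):
    """Count which topics recur at each hour of day."""
    by_hour = {}
    for mem in memories:
        hour = datetime.fromisoformat(mem["ts"]).hour
        by_hour.setdefault(hour, Counter())[mem["topic"]] += 1
    return by_hour

def predict_topics(by_hour, hour, top_n=3):
    """Pre-load candidates: the most frequent topics for this hour."""
    counts = by_hour.get(hour, Counter())
    return [topic for topic, _ in counts.most_common(top_n)]
```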
Spontaneous surfacing of emotionally-resonant memories (with safety: trauma filtering, quiet hours, daily caps):
from nima_core import LucidMoments
lucid = LucidMoments(db_path="~/.nima/memory/ladybug.lbug")
moment = lucid.surface_moment()
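Two of the safety guards mentioned above (quiet hours and a daily cap) reduce to a simple gate. The specific hours and cap below are illustrative defaults, not NIMA's shipped values; trauma filtering would be a separate, content-based check.

```python
def may_surface(now, surfaced_today, quiet_start=23, quiet_end=7, daily_cap=3):
    """Safety gate for spontaneous memory surfacing (sketch)."""
    if surfaced_today >= daily_cap:
        return False                                 # daily cap reached
    hour = now.hour
    in_quiet = hour >= quiet_start or hour < quiet_end  # window wraps midnight
    return not in_quiet
```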
Panksepp 7-affect emotional intelligence with personality archetypes:
from nima_core import DynamicAffectSystem
affect = DynamicAffectSystem(identity_name="my_bot", baseline="guardian")
state = affect.process_input("I'm excited about this!")
# Archetypes: guardian, explorer, trickster, empath, sage
from nima_core import (
DynamicAffectSystem,
get_affect_system,
HiveMind,
NimaPrecognition,
LucidMoments,
)
# Affect (thread-safe singleton)
affect = get_affect_system(identity_name="lilu")
state = affect.process_input("Hello!")
# Hive Mind
hive = HiveMind()
context = hive.build_agent_context("task description")
# Precognition
precog = NimaPrecognition()
precog.run_mining_cycle()
# Lucid Moments
lucid = LucidMoments()
moment = lucid.surface_moment()
See CHANGELOG.md for full version history.
MIT — free for any AI agent, commercial or personal.
Generated Mar 1, 2026
Deploy NIMA Core to enable a customer support AI agent with persistent memory and emotional intelligence. The agent can recall past interactions, adapt responses based on user sentiment, and consolidate insights from support tickets to improve future interactions, enhancing customer satisfaction and reducing repeat issues.
Use NIMA Core to create a therapeutic AI companion that tracks emotional states and memories over time. It can surface emotionally resonant memories for reflection, analyze patterns in user conversations, and provide personalized support, helping users manage mental well-being through consistent, empathetic interactions.
Implement NIMA Core in a research environment where multiple AI agents collaborate via the Hive Mind feature. They share memory databases to pool insights, perform precognitive pattern mining on data trends, and distill complex information into semantic gists, accelerating academic or business research projects.
Integrate NIMA Core into an educational AI tutor that uses persistent memory to track student progress and emotional engagement. It can recall previous lessons, adapt teaching styles based on affect analysis, and consolidate learning patterns to provide tailored feedback, improving educational outcomes.
Leverage NIMA Core for an enterprise AI that manages organizational knowledge with memory capture and pruning. It can filter noise from meetings, distill key insights into summaries, and enable semantic recall for employees, streamlining information retrieval and decision-making processes.
Offer NIMA Core as a free, open-source package with basic features, then generate revenue through paid premium support, customization services, and enterprise-level consulting. This model attracts developers and businesses seeking reliable assistance and tailored integrations.
Provide a SaaS platform where users subscribe to access enhanced cloud-based features like Hive Mind sharing, advanced embeddings via external APIs, and managed Redis pub/sub. This model targets organizations needing scalable, networked AI capabilities without local setup overhead.
License NIMA Core to large enterprises for integration into proprietary AI systems, with revenue from licensing fees, training workshops, and ongoing maintenance. This model caters to companies requiring secure, on-premises deployment with full control over data and customization.
💬 Integration Tip
Start with the default local embeddings to avoid external calls, then gradually enable optional features like Hive Mind or cloud embeddings based on specific use-case needs to minimize complexity.