memory-compression-system

Integrated memory management and extreme context compression for OpenClaw. Combines memory management, compression, search, and automation in one unified skill.
Install via ClawdBot CLI:
clawdbot install Hassffw/memory-compression-system

Combines the best features of memory management and extreme compression into a single, streamlined skill with automatic scheduling.
# Install skill
openclaw skill install memory-compression-system
# Enable automatic compression (runs every 6 hours)
scripts/enable.sh
# Check status
scripts/status.sh
# Manual compression
scripts/compress.sh --format ultra
# Search memory
scripts/search.sh "keyword"
openclaw skill search memory-compression-system
openclaw skill install memory-compression-system
cd /home/node/.openclaw/workspace/skills
git clone [repository-url] memory-compression-system
cd memory-compression-system
scripts/install.sh
# Enable automatic compression
scripts/enable.sh
# Disable automatic compression
scripts/disable.sh
# Check system status
scripts/status.sh
# Run health check
scripts/health.sh
# Compress with auto format selection
scripts/compress.sh
# Compress to specific format
scripts/compress.sh --format base64
scripts/compress.sh --format binary
scripts/compress.sh --format ultra
# Decompress files
scripts/decompress.sh [filename]
# List compressed files
scripts/list.sh
# Search across all memory
scripts/search.sh "keyword"
# Search with filters
scripts/search.sh "keyword" --format ultra --date 2026-02-15
# Export search results
scripts/search.sh "keyword" --export json
# View search history
scripts/search-history.sh
# Cleanup old files
scripts/cleanup.sh --days 30
# Backup system
scripts/backup.sh
# Restore from backup
scripts/restore.sh [backup-file]
# View logs
scripts/logs.sh
# Performance metrics
scripts/metrics.sh
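Under the hood, a retention cleanup like `scripts/cleanup.sh --days 30` above can be sketched with `find`. This is a sketch under assumptions: the `data/` directory and the `.ultra`/`.b64c` extensions are illustrative, not the skill's documented layout.

```shell
#!/bin/sh
# Sketch only: remove compressed files older than RETENTION_DAYS.
# data/ layout and file extensions are assumptions.
set -eu
RETENTION_DAYS=30
mkdir -p data
touch data/keep.ultra                  # fresh file: should survive
touch -d '40 days ago' data/old.ultra  # stale file: should be removed
find data -type f \( -name '*.ultra' -o -name '*.b64c' \) \
  -mtime +"$RETENTION_DAYS" -delete
ls data
```

`-mtime +30` matches files strictly older than 30 days, which is why the 40-day-old file is removed and the fresh one is kept.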
Edit config/default.conf:
# Compression settings
COMPRESSION_ENABLED=true
DEFAULT_FORMAT=ultra
RETENTION_DAYS=30
MAX_COMPRESSED_FILES=100
# Cron schedule (UTC)
CRON_SCHEDULE="0 */6 * * *" # Every 6 hours
CLEANUP_SCHEDULE="0 4 * * *" # Daily at 04:00
# Search settings
SEARCH_ENABLED=true
SEARCH_INDEX_AUTO_UPDATE=true
SEARCH_HISTORY_SIZE=1000
# Performance settings
MAX_MEMORY_MB=100
MAX_PROCESSING_TIME_SEC=300
export MEMORY_COMPRESSION_DEBUG=1 # Enable debug mode
export MEMORY_COMPRESSION_QUIET=0 # Disable quiet mode
export MEMORY_COMPRESSION_TEST=0 # Disable test mode
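`scripts/enable.sh` presumably turns `CRON_SCHEDULE` into a crontab entry. A minimal sketch of building that entry (the marker comment and the script path are assumptions; the actual install command is shown as a comment):

```shell
#!/bin/sh
# Sketch: construct the crontab line enable.sh might install.
# The marker comment and script path are assumptions.
set -eu
CRON_SCHEDULE="0 */6 * * *"
SCRIPT=/home/node/.openclaw/workspace/skills/memory-compression-system/scripts/compress.sh
echo "$CRON_SCHEDULE $SCRIPT # memory-compression-system" > cron.entry
# To install: ( crontab -l 2>/dev/null; cat cron.entry ) | crontab -
cat cron.entry
```

Tagging the entry with a marker comment lets a disable script remove exactly this job with `grep -v` without disturbing other crontab lines.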
Base64 format (B64C) header layout:
VERSION:3.0
FORMAT:B64C
TIMESTAMP:2026-02-15T14:55:00Z
SIZE:1024
CHECKSUM:crc32
DATA:<base64_encoded>
Binary format (CBIN) layout:
[Magic:CBIN][Version:3][Flags:1][Size:4][Dictionary:var][Data:var][Checksum:2]
Ultra format: targets ~150 tokens for a complete context.
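The B64C header layout can be sketched as a small framing script. This is a sketch only: the field semantics are inferred from the layout above, and POSIX `cksum` stands in for the spec's crc32 (an assumption, since the exact CRC variant is not documented here).

```shell
#!/bin/sh
# Sketch: frame a payload in the B64C layout shown above.
# Field semantics are inferred; cksum stands in for crc32.
set -eu
printf 'hello' > payload          # sample payload for demonstration
size=$(wc -c < payload | tr -d ' ')
crc=$(cksum payload | cut -d' ' -f1)
{
  echo "VERSION:3.0"
  echo "FORMAT:B64C"
  echo "TIMESTAMP:$(date -u +%Y-%m-%dT%H:%M:%SZ)"
  echo "SIZE:$size"
  echo "CHECKSUM:$crc"
  printf 'DATA:%s\n' "$(base64 < payload | tr -d '\n')"
} > payload.b64c
cat payload.b64c
```

Keeping the header line-oriented means any consumer can parse it with `sed`/`grep` before touching the base64 payload.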
Each automatic compression cycle:
1. Backup current memory
2. Compress with optimal format
3. Update search index
4. Cleanup old files
5. Log results
Each cleanup cycle:
1. Remove files older than RETENTION_DAYS
2. Archive logs
3. Optimize search index
4. Update statistics
Each health check:
1. Check compression status
2. Verify file integrity
3. Monitor resource usage
4. Report issues
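The "verify file integrity" step can be sketched against the B64C layout: decode the `DATA` field, recompute the checksum, and compare it with the `CHECKSUM` header. As above, POSIX `cksum` stands in for crc32, and the file layout is inferred; this is not the skill's actual implementation.

```shell
#!/bin/sh
# Sketch: integrity check for a B64C file (layout inferred from the spec
# section above; cksum stands in for crc32).
set -eu
# Build a sample .b64c file to check:
printf 'hello' > payload2
{
  echo "VERSION:3.0"; echo "FORMAT:B64C"
  echo "SIZE:$(wc -c < payload2 | tr -d ' ')"
  echo "CHECKSUM:$(cksum payload2 | cut -d' ' -f1)"
  printf 'DATA:%s\n' "$(base64 < payload2 | tr -d '\n')"
} > sample.b64c
# Verify: decode DATA, recompute the checksum, compare with the header.
sed -n 's/^DATA://p' sample.b64c | base64 -d > decoded
want=$(sed -n 's/^CHECKSUM://p' sample.b64c)
got=$(cksum decoded | cut -d' ' -f1)
if [ "$want" = "$got" ]; then echo "integrity OK"; else echo "CORRUPT"; fi
```

A mismatch here would be the trigger for the "report issues" step, e.g. a line in logs/error.log.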
# Basic search
scripts/search.sh "compression"
# Advanced search
scripts/search.sh "compression ratio" --format ultra --after 2026-02-01
# Export results
scripts/search.sh "test" --export csv --output results.csv
# Search history
scripts/search.sh --history
{
  "version": "3.0",
  "last_updated": "2026-02-15T18:00:00Z",
  "files": [
    {
      "filename": "memory_20260215_180000.ultra",
      "format": "ultra",
      "size": 256,
      "original_size": 1024,
      "ratio": 0.25,
      "keywords": ["compression", "memory", "test"],
      "timestamp": "2026-02-15T18:00:00Z"
    }
  ]
}
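Given the index layout above, a keyword search like `scripts/search.sh "compression"` could be a `jq` filter over the index file. A sketch only: the `index.json` filename and the use of `jq` are assumptions about the implementation.

```shell
#!/bin/sh
# Sketch: keyword lookup against the search index shown above.
# index.json filename and jq-based matching are assumptions.
set -eu
cat > index.json <<'EOF'
{"version":"3.0","files":[
  {"filename":"memory_20260215_180000.ultra","format":"ultra",
   "keywords":["compression","memory","test"],"ratio":0.25},
  {"filename":"memory_20260214_120000.b64c","format":"base64",
   "keywords":["notes"],"ratio":0.6}
]}
EOF
# Emit the filenames whose keyword list contains the query term:
jq -r --arg kw compression \
  '.files[] | select(.keywords | index($kw)) | .filename' \
  index.json > matches.txt
cat matches.txt
```

Filters like `--format ultra` would just add another `select(.format == $fmt)` stage to the same pipeline.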
# Check error logs
scripts/logs.sh --error
# Restore from backup
scripts/restore.sh latest
# Repair search index
scripts/repair-index.sh
# Reset system
scripts/reset.sh --safe
Issue: Compression fails
Solution: Check disk space and run scripts/repair.sh
Issue: Search not working
Solution: Rebuild index with scripts/rebuild-index.sh
Issue: Cron job not running
Solution: Check with scripts/status.sh --cron
Issue: Performance degradation
Solution: Run scripts/cleanup.sh --aggressive
logs/compression.log: Compression operations
logs/search.log: Search operations
logs/error.log: Error messages
logs/performance.log: Performance metrics
logs/cron.log: Cron job execution
# Basic status
scripts/status.sh
# Detailed health check
scripts/health.sh
# Performance metrics
scripts/metrics.sh
# System information
scripts/info.sh
cd test
./run-tests.sh
# Test compression
scripts/test-compression.sh
# Test search
scripts/test-search.sh
# Test cron job
scripts/test-cron.sh
# Test error handling
scripts/test-errors.sh
# Update via ClawHub
openclaw skill update memory-compression-system
# Manual update
scripts/update.sh
# Check for updates
scripts/check-updates.sh
data/ directory
config/ after changes
# Clone repository
git clone [repo-url]
# Install dependencies
npm install
# Run tests
npm test
# Build package
npm run build
logs/ directory
scripts/diagnose.sh
examples/troubleshooting.md
MIT License - See LICENSE file for details.
Note: This skill is designed for OpenClaw context optimization. Always maintain backups of important data and test in a safe environment before production use.
Generated Mar 1, 2026
Automatically compresses and indexes historical support conversations to reduce storage costs and enable fast retrieval of past issues. Agents can quickly search for similar cases to resolve new tickets efficiently, improving response times.
Compresses large volumes of legal documents and case files into ultra-compact formats for archival. Enables rapid search across compressed files for specific clauses or precedents, aiding in research and compliance.
Manages and compresses patient records and medical logs to meet data retention policies while ensuring quick access. Supports search functionality for retrieving historical data during diagnoses or audits.
Compresses transaction logs and user interaction data to optimize storage for analytics platforms. Allows efficient searching of past trends or customer behaviors to inform marketing strategies.
Archives large datasets from scientific experiments or surveys using extreme compression to save space. Facilitates search across compressed files for specific data points, supporting ongoing analysis.
Offers the skill as a cloud-based service with tiered pricing based on storage volume and compression frequency. Includes premium features like advanced search filters and priority support for enterprise clients.
Sells perpetual licenses for organizations to deploy the skill on their own infrastructure. Revenue comes from one-time license fees plus optional maintenance contracts for updates and support.
Provides a free basic version with limited compression formats and search capabilities. Generates revenue by upselling premium add-ons such as ultra compression, enhanced automation, or extended retention periods.
💬 Integration Tip
Ensure the host system has cron enabled for scheduling and sufficient storage for backups before enabling automatic compression to avoid data loss.
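The tip above can be automated as a small preflight script run before `scripts/enable.sh`. A sketch under assumptions: the 100 MB free-space threshold is illustrative, not a documented requirement.

```shell
#!/bin/sh
# Sketch: preflight checks before enabling automatic compression.
# The 100 MB threshold is an assumption.
set -eu
ok=1
# cron available for scheduling?
command -v crontab >/dev/null 2>&1 || { echo "WARN: crontab not found"; ok=0; }
# enough free space for backups?
free_kb=$(df -Pk . | awk 'NR==2 {print $4}')
if [ "$free_kb" -le 102400 ]; then echo "WARN: <100 MB free"; ok=0; fi
if [ "$ok" -eq 1 ]; then echo "preflight OK"; else echo "preflight FAILED"; fi
```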