topic-monitor: Monitor topics of interest and proactively alert when important developments occur. Use when the user wants automated monitoring of specific subjects (e.g., product releases, price changes, news topics, technology updates). Supports scheduled web searches, AI-powered importance scoring, smart alerts vs. weekly digests, and memory-aware contextual summaries.
Install via ClawdBot CLI:
clawdbot install robbyczgw-cla/topic-monitor

Monitor what matters. Get notified when it happens.
Topic Monitor transforms your assistant from reactive to proactive by continuously monitoring topics you care about and intelligently alerting you only when something truly matters.
Just want to monitor one topic? One command:
python3 scripts/quick.py "AI Model Releases"
That's it! This creates a topic with sensible defaults:
# Basic - just a topic name
python3 scripts/quick.py "Bitcoin Price"
# With keywords
python3 scripts/quick.py "Security CVEs" --keywords "CVE,vulnerability,critical"
# High priority, hourly checks
python3 scripts/quick.py "Production Alerts" --frequency hourly --importance high
# Custom query
python3 scripts/quick.py "Competitor News" --query "CompanyName product launch funding"
# Different channel
python3 scripts/quick.py "Team Updates" --channel discord
| Feature | Quick Start | Full Setup |
|---------|-------------|------------|
| Speed | One command | Guided wizard |
| Defaults | Smart | Customizable |
| Use case | Single topic | Multiple topics |
| Configuration | Minimal | Full control |
After Quick Start, you can always customize:
python3 scripts/manage_topics.py edit ai-model-releases --frequency hourly
For configuring multiple topics or advanced options:
python3 scripts/setup.py
The wizard will guide you through choosing topics, search queries, keywords, check frequency, and alert channels.
The wizard creates config.json with your preferences. You can always edit it later or use manage_topics.py to add/remove topics.
Example session:
Topic Monitor - Setup Wizard
What topics do you want to monitor?
> AI Model Releases
> Security Vulnerabilities
>
--- Topic 1/2: AI Model Releases ---
Search query for 'AI Model Releases' [AI Model Releases news updates]: new AI model release announcement
Keywords to watch for in 'AI Model Releases'?
> GPT, Claude, Llama, release
--- Topic 2/2: Security Vulnerabilities ---
Search query for 'Security Vulnerabilities' [Security Vulnerabilities news updates]: CVE critical vulnerability patch
Keywords to watch for in 'Security Vulnerabilities'?
> CVE, vulnerability, critical, patch
How often should I check for updates?
1. hourly
2. daily *
3. weekly
✓ Setup Complete!
Already know what you're doing? Here's the manual approach:
# Initialize config from template
cp config.example.json config.json
# Add a topic
python3 scripts/manage_topics.py add "Product Updates" \
--keywords "release,update,patch" \
--frequency daily \
--importance medium
# Test monitoring (dry run)
python3 scripts/monitor.py --dry-run
# Set up cron for automatic monitoring
python3 scripts/setup_cron.py
Each topic has:
- frequency: hourly, daily, or weekly
- importance threshold: high (alert immediately), medium (alert if important), or low (digest only)

Example config.json:

{
"topics": [
{
"id": "ai-models",
"name": "AI Model Releases",
"query": "new AI model release GPT Claude Llama",
"keywords": ["GPT", "Claude", "Llama", "release", "announcement"],
"frequency": "daily",
"importance_threshold": "high",
"channels": ["telegram"],
"context": "Following AI developments for work",
"alert_on": ["model_release", "major_update"]
},
{
"id": "tech-news",
"name": "Tech Industry News",
"query": "technology startup funding acquisition",
"keywords": ["startup", "funding", "Series A", "acquisition"],
"frequency": "daily",
"importance_threshold": "medium",
"channels": ["telegram"],
"context": "Staying informed on tech trends",
"alert_on": ["major_funding", "acquisition"]
},
{
"id": "security-alerts",
"name": "Security Vulnerabilities",
"query": "CVE critical vulnerability security patch",
"keywords": ["CVE", "vulnerability", "security", "patch", "critical"],
"frequency": "hourly",
"importance_threshold": "high",
"channels": ["telegram", "email"],
"context": "DevOps security monitoring",
"alert_on": ["critical_cve", "zero_day"]
}
],
"settings": {
"digest_day": "sunday",
"digest_time": "18:00",
"max_alerts_per_day": 5,
"deduplication_window_hours": 72,
"learning_enabled": true
}
}
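The frequency field drives scheduling. A minimal sketch of how a monitor might decide which topics are due for a check (illustrative only; `is_due` and `INTERVALS` are not names from the skill's actual code):

```python
from datetime import datetime, timedelta, timezone

# Map each configured frequency to a minimum gap between checks.
INTERVALS = {
    "hourly": timedelta(hours=1),
    "daily": timedelta(days=1),
    "weekly": timedelta(weeks=1),
}

def is_due(topic, last_check, now):
    """True if the topic should be checked again (or was never checked)."""
    if last_check is None:
        return True
    interval = INTERVALS[topic.get("frequency", "daily")]
    return now - last_check >= interval

topic = {"id": "ai-models", "frequency": "daily"}
now = datetime(2026, 1, 28, 22, 0, tzinfo=timezone.utc)
print(is_due(topic, now - timedelta(hours=25), now))  # over a day old: due
print(is_due(topic, now - timedelta(hours=2), now))   # checked recently: skip
```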
Manage research topics:
# Add topic
python3 scripts/manage_topics.py add "Topic Name" \
--query "search query" \
--keywords "word1,word2" \
--frequency daily \
--importance medium \
--channels telegram
# List topics
python3 scripts/manage_topics.py list
# Edit topic
python3 scripts/manage_topics.py edit eth-price --frequency hourly
# Remove topic
python3 scripts/manage_topics.py remove eth-price
# Test topic (preview results without saving)
python3 scripts/manage_topics.py test eth-price
Main monitoring script (run via cron):
# Normal run (alerts + saves state)
python3 scripts/monitor.py
# Dry run (no alerts, shows what would happen)
python3 scripts/monitor.py --dry-run
# Force check specific topic
python3 scripts/monitor.py --topic eth-price
# Verbose logging
python3 scripts/monitor.py --verbose
How it works: each run searches every due topic, scores new results for importance, deduplicates against recently alerted URLs, and then either alerts immediately or files the finding for the weekly digest.
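The check-score-alert loop can be sketched as follows. This is purely illustrative, not the skill's actual monitor.py; `search`, `score`, and `send_alert` are stand-in names:

```python
# Illustrative monitor-run pipeline with minimal stand-ins so it runs.
def run_topic(topic, state, dry_run=False):
    findings = []
    for result in search(topic["query"]):                 # web search
        if result["url"] in state.get("alerted_urls", []):
            continue                                      # deduplicate
        priority = score(result, topic["keywords"])       # importance score
        if priority == "high" and not dry_run:
            send_alert(topic, result)                     # immediate alert
            state.setdefault("alerted_urls", []).append(result["url"])
        elif priority == "medium":
            findings.append(result)                       # save for digest
    return findings

def search(query):
    return [{"url": "https://example.com/a", "title": "GPT release announced"}]

def score(result, keywords):
    hits = sum(k.lower() in result["title"].lower() for k in keywords)
    return "high" if hits >= 1 else "low"

def send_alert(topic, result):
    print(f"ALERT [{topic['name']}]: {result['title']}")

state = {}
run_topic({"name": "AI Models", "query": "ai", "keywords": ["GPT"]}, state)
```

With `--dry-run`, the high-priority branch is skipped, so nothing is sent and no state is saved.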
Generate weekly digest:
# Generate digest for current week
python3 scripts/digest.py
# Generate and send
python3 scripts/digest.py --send
# Preview without sending
python3 scripts/digest.py --preview
Output format:
# Weekly Research Digest - [Date Range]
## Highlights
- **AI Models**: Claude 4.5 released with improved reasoning
- **Security**: Critical CVE patched in popular framework
## By Topic
### AI Model Releases
- [3 findings this week]
### Security Vulnerabilities
- [1 finding this week]
## Recommendations
Based on your interests, you might want to monitor:
- "Kubernetes security" (mentioned 3x this week)
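A digest like the one above could be folded together from per-topic findings files. This is a sketch under assumed file layout and keys (`topic`, `findings`), not the skill's actual digest.py:

```python
import json
import pathlib
import tempfile

def build_digest(findings_dir):
    """Render stored findings files into a minimal markdown digest."""
    lines = ["# Weekly Research Digest"]
    for path in sorted(findings_dir.glob("*.json")):
        data = json.loads(path.read_text())
        lines.append(f"### {data['topic']}")
        lines.append(f"- [{len(data['findings'])} findings this week]")
    return "\n".join(lines)

with tempfile.TemporaryDirectory() as d:
    p = pathlib.Path(d)
    (p / "2026-01-22_eth-price.json").write_text(
        json.dumps({"topic": "ETH Price", "findings": [{"title": "x"}]}))
    print(build_digest(p))
```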
Configure automated monitoring:
# Interactive setup
python3 scripts/setup_cron.py
# Auto-setup with defaults
python3 scripts/setup_cron.py --auto
# Remove cron jobs
python3 scripts/setup_cron.py --remove
Creates cron entries:
# Topic Monitor - Hourly topics
0 * * * * cd /path/to/skills/topic-monitor && python3 scripts/monitor.py --frequency hourly
# Topic Monitor - Daily topics
0 9 * * * cd /path/to/skills/topic-monitor && python3 scripts/monitor.py --frequency daily
# Topic Monitor - Weekly digest
0 18 * * 0 cd /path/to/skills/topic-monitor && python3 scripts/digest.py --send
The scorer uses multiple signals to decide alert priority:
HIGH priority (immediate alert):
MEDIUM priority (digest-worthy):
LOW priority (ignore):
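The README does not show the AI scorer's internals, so the following is only a toy keyword-and-source heuristic illustrating how multiple signals could combine into a priority tier:

```python
def score_finding(title, keywords, source, boost_sources=()):
    """Toy scorer: count keyword hits, give trusted sources a bonus."""
    hits = sum(k.lower() in title.lower() for k in keywords)
    if any(s in source for s in boost_sources):
        hits += 1                      # boosted sources rank higher
    if hits >= 3:
        return "high"
    if hits >= 1:
        return "medium"
    return "low"

print(score_finding("Critical CVE patch released",
                    ["CVE", "critical", "patch"],
                    "github.com", ["github.com"]))  # high
print(score_finding("random headline", ["CVE"], "example.com"))  # low
```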
When enabled (learning_enabled: true), the system:
Learning data stored in .learning_data.json (privacy-safe, never shared).
Topic Monitor connects to your conversation history:
Example alert:
Dirac Live Update
Version 3.8 released with the room correction improvements you asked about last week.
Context: You mentioned struggling with bass response in your studio. This update includes new low-frequency optimization.
[Link] | [Full details]
How it works: before sending an alert, the monitor pulls related context from your conversation history and memory hints and weaves it into the summary.
Help the AI connect dots:
# Memory Hints for Topic Monitor
## AI Models
- Using Claude for coding assistance
- Interested in reasoning improvements
- Comparing models for different use cases
## Security
- Running production Kubernetes clusters
- Need to patch critical CVEs quickly
- Interested in zero-day disclosures
## Tech News
- Following startup ecosystem
- Interested in developer tools space
- Tracking potential acquisition targets
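A hints file like the one above is trivial to parse into a topic-to-hints mapping the summarizer could consume. A sketch (the parsing approach is an assumption, not the skill's documented behavior):

```python
def parse_hints(text):
    """Parse '## Section' headings and '- hint' bullets into a dict."""
    hints, section = {}, None
    for line in text.splitlines():
        if line.startswith("## "):
            section = line[3:].strip()
            hints[section] = []
        elif line.startswith("- ") and section:
            hints[section].append(line[2:].strip())
    return hints

sample = """# Memory Hints
## AI Models
- Using Claude for coding assistance
## Security
- Running production Kubernetes clusters
"""
print(parse_hints(sample))
```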
Requires OpenClaw message tool:
{
"channels": ["telegram"],
"telegram_config": {
"chat_id": "@your_username",
"silent": false,
"effects": {
"high_importance": "🔥",
"medium_importance": "📢"
}
}
}
Agent-delivered (no webhook in skill config):
monitor.py emits DISCORD_ALERT JSON payloads, and OpenClaw sends them via the message tool. This matches the Telegram alert flow (structured output, no direct HTTP in skill code).
{
"channels": ["discord"]
}
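The structured-output pattern described above (print a tagged JSON line, let the agent relay it) can be sketched like this; the exact payload schema is an assumption:

```python
import json

def emit_discord_alert(topic, title, url):
    """Print a DISCORD_ALERT-tagged JSON line for the agent to relay."""
    payload = {"topic": topic, "title": title, "url": url}
    line = "DISCORD_ALERT " + json.dumps(payload)
    print(line)          # the agent watches stdout for tagged lines
    return line

emit_discord_alert("Security Vulnerabilities",
                   "Critical CVE patched", "https://example.com/cve")
```

Keeping HTTP out of the skill means credentials and delivery stay with the agent, matching the Telegram flow.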
SMTP or API:
{
"channels": ["email"],
"email_config": {
"to": "you@example.com",
"from": "research@yourdomain.com",
"smtp_server": "smtp.gmail.com",
"smtp_port": 587
}
}
Fine-tune when to alert:
{
"alert_on": [
"price_change_10pct",
"keyword_exact_match",
"source_tier_1",
"high_engagement"
],
"ignore_sources": [
"spam-site.com",
"clickbait-news.io"
],
"boost_sources": [
"github.com",
"arxiv.org",
"official-site.com"
]
}
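The `ignore_sources` / `boost_sources` lists suggest per-domain filtering. A sketch of one plausible rule semantics (domain-suffix matching; the skill's actual matching rules are not documented here):

```python
from urllib.parse import urlparse

def classify_source(url, ignore, boost):
    """Drop ignored domains, flag boosted ones, pass the rest through."""
    host = urlparse(url).netloc
    if any(host.endswith(d) for d in ignore):
        return "drop"
    if any(host.endswith(d) for d in boost):
        return "boost"
    return "normal"

print(classify_source("https://github.com/org/repo",
                      ["spam-site.com"], ["github.com"]))   # boost
print(classify_source("https://spam-site.com/x",
                      ["spam-site.com"], ["github.com"]))   # drop
```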
Match specific patterns:
{
"patterns": [
"version \\d+\\.\\d+\\.\\d+",
"\\$\\d{1,3}(,\\d{3})*",
"CVE-\\d{4}-\\d+"
]
}
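Unescaped from JSON, the three patterns above match semantic-version strings, comma-grouped dollar amounts, and CVE identifiers respectively:

```python
import re

patterns = [
    r"version \d+\.\d+\.\d+",   # e.g. "version 3.8.1"
    r"\$\d{1,3}(,\d{3})*",      # e.g. "$12,500,000"
    r"CVE-\d{4}-\d+",           # e.g. "CVE-2026-1234"
]
text = "version 3.8.1 fixes CVE-2026-1234; funding hit $12,500,000"
for p in patterns:
    m = re.search(p, text)
    print(p, "->", m.group(0) if m else None)
```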
Prevent alert fatigue:
{
"settings": {
"max_alerts_per_day": 5,
"max_alerts_per_topic_per_day": 2,
"quiet_hours": {
"start": "22:00",
"end": "08:00"
}
}
}
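Note that the configured quiet window (22:00-08:00) wraps past midnight, which needs a two-branch check. A sketch of that logic (an assumption about how the skill evaluates quiet hours):

```python
from datetime import time

def in_quiet_hours(now, start, end):
    """True if 'now' falls inside the quiet window, including wrap-around."""
    if start <= end:
        return start <= now < end
    return now >= start or now < end   # window wraps midnight

print(in_quiet_hours(time(23, 30), time(22, 0), time(8, 0)))  # True
print(in_quiet_hours(time(12, 0), time(22, 0), time(8, 0)))   # False
```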
Configure these environment variables to customize topic-monitor:
| Variable | Default | Description |
|----------|---------|-------------|
| TOPIC_MONITOR_TELEGRAM_ID | (none) | Your Telegram chat ID for receiving alerts |
| TOPIC_MONITOR_DATA_DIR | .data/ in skill dir | Where to store state and findings |
| WEB_SEARCH_PLUS_PATH | Relative to skill | Path to web-search-plus search.py |
| SERPER_API_KEY / TAVILY_API_KEY / EXA_API_KEY / YOU_API_KEY / SEARXNG_INSTANCE_URL / WSP_CACHE_DIR | (none) | Optional search-provider vars passed via subprocess env allowlist |
Example setup:
# Add to ~/.bashrc or .env
export TOPIC_MONITOR_TELEGRAM_ID="123456789"
export TOPIC_MONITOR_DATA_DIR="/home/user/topic-monitor-data"
export WEB_SEARCH_PLUS_PATH="/path/to/skills/web-search-plus/scripts/search.py"
Stored in TOPIC_MONITOR_DATA_DIR (default: .data/ in skill directory).
Tracks per-topic last check and alert times, alerted URLs, finding counts, daily alert counts, and deduplication hashes.
Example:
{
"topics": {
"eth-price": {
"last_check": "2026-01-28T22:00:00Z",
"last_alert": "2026-01-28T15:30:00Z",
"alerted_urls": [
"https://example.com/eth-news-1"
],
"findings_count": 3,
"alerts_today": 1
}
},
"deduplication": {
"url_hash_map": {
"abc123": "2026-01-28T15:30:00Z"
}
}
}
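The `url_hash_map` above pairs a URL hash with the time it was last alerted, so repeats inside the deduplication window (72 h by default) are suppressed. A sketch of that check (hash truncation and function name are assumptions):

```python
import hashlib
from datetime import datetime, timedelta, timezone

def is_duplicate(url, url_hash_map, now, window_hours=72):
    """Suppress URLs already seen within the dedup window."""
    h = hashlib.sha256(url.encode()).hexdigest()[:12]
    seen = url_hash_map.get(h)
    if seen and now - seen < timedelta(hours=window_hours):
        return True
    url_hash_map[h] = now            # record this sighting
    return False

now = datetime(2026, 1, 28, 22, 0, tzinfo=timezone.utc)
m = {}
print(is_duplicate("https://example.com/eth-news-1", m, now))
print(is_duplicate("https://example.com/eth-news-1", m,
                   now + timedelta(hours=1)))   # within window: duplicate
```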
Stores digest-worthy findings:
.findings/
├── 2026-01-22_eth-price.json
├── 2026-01-24_fm26-patches.json
└── 2026-01-27_ai-breakthroughs.json
Tips:
- Start with importance_threshold: medium and adjust based on alert quality
- Use negative keywords to filter noise: "keywords": ["AI", "-clickbait", "-spam"]

Searches automatically use intelligent routing.
Suggests topics based on conversation patterns:
"You've asked about Rust 12 times this month. Want me to monitor 'Rust language updates'?"
No alerts being sent:
- Verify cron is running: crontab -l
- Run with --dry-run --verbose to see scoring
Too many alerts:
- Raise importance_threshold
Missing important news:
- Lower importance_threshold
- Inspect .research_state.json for deduplication issues
Digest not generating:
- Check that the .findings/ directory exists and has content
- Preview with python3 scripts/digest.py --preview

python3 scripts/manage_topics.py add "iPhone 17 Release" \
--query "iPhone 17 announcement release date" \
--keywords "iPhone 17,Apple event,September" \
--frequency daily \
--importance high \
--channels telegram \
--context "Planning to upgrade from iPhone 13"
python3 scripts/manage_topics.py add "Competitor Analysis" \
--query "CompetitorCo product launch funding" \
--keywords "CompetitorCo,product,launch,Series,funding" \
--frequency weekly \
--importance medium \
--channels discord,email
python3 scripts/manage_topics.py add "Quantum Computing Papers" \
--query "quantum computing arxiv" \
--keywords "quantum,qubit,arxiv" \
--frequency weekly \
--importance low \
--channels email
Built for ClawHub. Uses web-search-plus skill for intelligent search routing.
Generated Mar 1, 2026
A startup monitors competitor product releases, funding rounds, and technology updates to stay ahead in the market. The AI-powered importance scoring filters out noise, alerting the team immediately to critical developments like a rival's major launch or acquisition, while compiling less urgent findings into a weekly digest for strategic review.
An IT department tracks security vulnerabilities, CVEs, and patch announcements across software used in their organization. By setting high importance thresholds for critical vulnerabilities, the system sends real-time alerts via Telegram, enabling rapid response to mitigate risks, with weekly summaries for low-priority updates.
An investment firm monitors news topics, price changes, and industry developments related to specific stocks or sectors. The skill automates web searches at configurable intervals, using contextual summaries to provide insights on market trends, helping analysts make informed decisions based on AI-filtered alerts and digests.
A media agency uses the skill to track trending topics, product releases, and news updates relevant to their clients. It schedules daily searches, extracts keywords for relevance, and delivers smart alerts for high-importance events, ensuring timely content creation and campaign adjustments while maintaining a memory-aware context of past interests.
Offer the Topic Monitor skill as a SaaS platform where users pay a monthly fee for automated topic tracking and alerts. Revenue is generated through tiered subscriptions based on the number of topics monitored, frequency of searches, and integration channels like Telegram or Discord, with premium features like advanced AI scoring.
Provide the skill as part of a larger enterprise solution, including custom integration with existing systems, training, and support. Revenue comes from one-time setup fees, ongoing maintenance contracts, and consulting services to tailor the monitoring to specific business needs, such as competitive analysis or compliance tracking.
Deploy a free version with basic monitoring capabilities for individual users, while charging for advanced features like higher importance thresholds, more frequent searches, or additional alert channels. Revenue is generated through upselling premium plans, in-app purchases for extra topics, and partnerships with data providers for enhanced search results.
Integration Tip
Start with the quick-start script for a single topic to test functionality, then use the interactive wizard for a full setup with multiple topics. Make sure optional environment variables such as the Telegram chat ID are set for real-time alerts.