# x-timeline-digest

Build a deduplicated digest from the X (Twitter) For You and Following timelines using `bird`. Outputs a JSON payload for upstream delivery.
Install via the ClawdBot CLI:

```
clawdbot install seandong/x-timeline-digest
```

This skill uses `bird` to read X/Twitter timelines and build a high-signal digest.
Sources:

- For You timeline (`bird home`)
- Following timeline (`bird home --following`)

What it does:

- Fetches both timelines and drops tweets already sent in a previous run
- Removes exact and near-duplicates and heuristic noise ("gm", ads, short spam)
- Emits one deduplicated JSON payload, capped at `maxItemsPerDigest`
Delivery (Telegram, email, etc.) is NOT handled here.
Upstream OpenClaw workflows decide how to notify users.
All config is read from `skills.entries["x-timeline-digest"].config`:
| Name | Type | Default | Description |
|----|----|----|----|
| intervalHours | number | 6 | Interval window in hours |
| fetchLimitForYou | number | 100 | Tweets fetched from For You |
| fetchLimitFollowing | number | 60 | Tweets fetched from Following |
| maxItemsPerDigest | number | 25 | Max tweets in one digest |
| similarityThreshold | number | 0.9 | Near-duplicate similarity threshold |
| statePath | string | ~/.openclaw/state/x-timeline-digest.json | State file path |
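The defaults in the table above can be sketched as a config loader. This is an illustrative sketch, not the actual `digest.js` code; the `loadConfig` helper and the shape of the root config object are assumptions based on the `skills.entries[…].config` path.

```javascript
// Sketch (hypothetical helper): merge user config over the table's defaults.
const DEFAULTS = {
  intervalHours: 6,
  fetchLimitForYou: 100,
  fetchLimitFollowing: 60,
  maxItemsPerDigest: 25,
  similarityThreshold: 0.9,
  statePath: "~/.openclaw/state/x-timeline-digest.json",
};

function loadConfig(root) {
  // Assumed shape: skills.entries["x-timeline-digest"].config
  const entry = root?.skills?.entries?.["x-timeline-digest"]?.config ?? {};
  return { ...DEFAULTS, ...entry };
}

// Example: a user overriding only intervalHours keeps every other default.
const cfg = loadConfig({
  skills: { entries: { "x-timeline-digest": { config: { intervalHours: 12 } } } },
});
console.log(cfg.intervalHours, cfg.maxItemsPerDigest); // 12 25
```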
Run the digest generator to get a clean, deduplicated JSON payload:
```
node skills/x-timeline-digest/digest.js
```
To generate the "Smart Brief" (categorized, summarized, denoised):

1. Run `node skills/x-timeline-digest/digest.js > digest.json`
2. Read `skills/x-timeline-digest/PROMPT.md`
3. Insert the contents of `digest.json` where `{{JSON_DATA}}` is.

Note: the script automatically applies heuristic filtering (removes "gm", ads, short spam) before outputting JSON.
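The heuristic pre-filter mentioned in the note might look like the sketch below. The rules shown (greeting-only "gm" posts, ad-like keywords, very short text) are assumptions; the real rules live in `digest.js`.

```javascript
// Sketch of a heuristic noise filter (assumed rules, not the actual digest.js logic).
const AD_PATTERNS = [/\bsponsored\b/i, /\bairdrop\b/i, /\bpromo code\b/i];

function isNoise(tweet) {
  const text = tweet.text.trim();
  if (/^gm[!. ]*$/i.test(text)) return true;                 // greeting-only post
  if (text.length < 12) return true;                         // too short to carry signal
  if (AD_PATTERNS.some((re) => re.test(text))) return true;  // ad-like keywords
  return false;
}

const kept = [
  { id: "1", text: "gm" },
  { id: "2", text: "Sponsored: buy now" },
  { id: "3", text: "New paper on efficient attention kernels is out" },
].filter((t) => !isNoise(t));

console.log(kept.map((t) => t.id)); // [ '3' ]
```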
For You timeline:

```
bird home -n
```

Following timeline:

```
bird home --following -n
```
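Tweets that appear in both fetches are collapsed by id, and each kept item records which timeline(s) it came from. A minimal sketch of that merge step, assuming tweets are plain objects with an `id` field (the `mergeTimelines` helper is hypothetical):

```javascript
// Sketch: merge the two fetches, collapsing exact duplicates by id
// and tagging each tweet with the timeline(s) it appeared in.
function mergeTimelines(forYou, following) {
  const byId = new Map();
  const add = (tweet, source) => {
    const existing = byId.get(tweet.id);
    if (existing) {
      if (!existing.sources.includes(source)) existing.sources.push(source);
    } else {
      byId.set(tweet.id, { ...tweet, sources: [source] });
    }
  };
  forYou.forEach((t) => add(t, "forYou"));
  following.forEach((t) => add(t, "following"));
  return [...byId.values()];
}

const merged = mergeTimelines(
  [{ id: "1", text: "a" }, { id: "2", text: "b" }],
  [{ id: "2", text: "b" }, { id: "3", text: "c" }]
);
console.log(merged.length); // 3
console.log(merged.find((t) => t.id === "2").sources); // [ 'forYou', 'following' ]
```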
State is persisted to `statePath`:
```json
{
  "lastRunAt": "2026-02-01T00:00:00+08:00",
  "sentTweetIds": {
    "123456789": "2026-02-01T00:00:00+08:00"
  }
}
```
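Incremental filtering against that state can be sketched as follows: any tweet whose id already appears in `sentTweetIds` is dropped from the next digest. The `filterIncremental` helper is an assumption for illustration.

```javascript
// Sketch: drop tweets already recorded in the persisted state shown above.
function filterIncremental(tweets, state) {
  const sent = state.sentTweetIds ?? {};
  return tweets.filter((t) => !(t.id in sent));
}

const state = {
  lastRunAt: "2026-02-01T00:00:00+08:00",
  sentTweetIds: { "123456789": "2026-02-01T00:00:00+08:00" },
};

const fresh = filterIncremental(
  [{ id: "123456789", text: "already sent" }, { id: "987", text: "new" }],
  state
);
console.log(fresh.map((t) => t.id)); // [ '987' ]
```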
The skill returns one JSON object:
```json
{
  "window": {
    "start": "2026-02-01T00:00:00+08:00",
    "end": "2026-02-01T06:00:00+08:00",
    "intervalHours": 6
  },
  "counts": {
    "forYouFetched": 100,
    "followingFetched": 60,
    "afterIncremental": 34,
    "afterDedup": 26,
    "final": 20
  },
  "digestText": "Summary text (in Chinese)",
  "items": [
    {
      "id": "123456",
      "author": "@handle",
      "createdAt": "2026-02-01T02:15:00+08:00",
      "text": "tweet text",
      "url": "https://x.com/handle/status/123456",
      "sources": ["following"]
    }
  ]
}
```
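The drop from `afterIncremental` to `afterDedup` reflects near-duplicate removal governed by `similarityThreshold`. One plausible metric is token-set Jaccard similarity; this is a sketch under that assumption, not necessarily the metric `digest.js` uses.

```javascript
// Sketch: near-duplicate removal via token-set Jaccard similarity (assumed metric).
function jaccard(a, b) {
  const A = new Set(a.toLowerCase().split(/\s+/));
  const B = new Set(b.toLowerCase().split(/\s+/));
  const inter = [...A].filter((w) => B.has(w)).length;
  return inter / (A.size + B.size - inter);
}

// Keep a tweet only if it is not too similar to any tweet already kept.
function dedupNear(tweets, threshold = 0.9) {
  const kept = [];
  for (const t of tweets) {
    if (!kept.some((k) => jaccard(k.text, t.text) >= threshold)) kept.push(t);
  }
  return kept;
}

const out = dedupNear([
  { id: "1", text: "big model release today" },
  { id: "2", text: "big model release today" }, // exact repost, similarity 1.0
  { id: "3", text: "totally different topic" },
]);
console.log(out.map((t) => t.id)); // [ '1', '3' ]
```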