# remember-all-prompts-daily

Preserve conversation continuity across token compaction cycles by extracting and archiving all prompts with date-wise entries. Automatically triggers at 95% token usage (pre-compaction) and at 1% (new sprint start) to export session history, then ingests archived summaries on session restart to restore context.
Install via the ClawdBot CLI:

```bash
clawdbot install syedateebulislam/remember-all-prompts-daily
```

This skill maintains conversation continuity across token budget cycles by automatically archiving your session history before compaction and restoring it when a new session begins.
When token usage approaches 95%, the skill:

- Runs `export_prompts.py` to extract the current session history
- Appends it to `memory/remember-all-prompts-daily.md` as a date-wise entry

When a new session starts (fresh, ~1% token usage), the skill:

- Checks whether `memory/remember-all-prompts-daily.md` exists
- Ingests it to restore context

The archive uses a date-wise format:

```markdown
# Remember All Prompts Daily

## [DATE: 2026-01-26]

### Session 1 (09:00 - 09:47)
[All prompts and responses from session]

### Session 2 (10:15 - 11:30)
[All prompts and responses from session]
```
### scripts/export_prompts.py

Extracts all prompts/responses from the current session and archives them.
Usage:

```bash
python scripts/export_prompts.py
```
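The repository's actual implementation is not reproduced here; as a rough sketch, assuming a hypothetical `sessions_history()` helper that returns `(role, text)` pairs, the export step might look like this:

```python
from datetime import datetime
from pathlib import Path

ARCHIVE = Path("memory/remember-all-prompts-daily.md")


def sessions_history():
    """Placeholder for the real helper that returns the current
    session's messages as (role, text) pairs."""
    return [("user", "example prompt"), ("assistant", "example reply")]


def export_session():
    """Append the current session's messages to the archive
    under a date-wise entry."""
    ARCHIVE.parent.mkdir(parents=True, exist_ok=True)
    today = datetime.now().strftime("%Y-%m-%d")
    lines = [f"## [DATE: {today}]", ""]
    for role, text in sessions_history():
        lines.append(f"**{role}:** {text}")
    with ARCHIVE.open("a", encoding="utf-8") as f:
        f.write("\n".join(lines) + "\n")


if __name__ == "__main__":
    export_session()
```

Because the script appends rather than overwrites, multiple sessions on the same day simply accumulate as successive entries under that date.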
What it does:

- Calls `sessions_history()` to fetch all messages from the current session
- Appends them to `memory/remember-all-prompts-daily.md`

### scripts/ingest_prompts.py

Reads the daily archive and injects it as context on session start.
Usage:

```bash
python scripts/ingest_prompts.py
```
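The ingest side can be even simpler. A minimal sketch (the size cap is an assumption, added so the restored history does not consume the fresh session's token budget):

```python
from pathlib import Path

ARCHIVE = Path("memory/remember-all-prompts-daily.md")


def ingest_archive(max_chars=20_000):
    """Read the daily archive (if present) and return its tail
    for injection as context at the start of a new session."""
    if not ARCHIVE.exists():
        return ""
    text = ARCHIVE.read_text(encoding="utf-8")
    return text[-max_chars:]


if __name__ == "__main__":
    print(ingest_archive())
```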
What it does:

- Reads `memory/remember-all-prompts-daily.md` (if it exists)
- Injects its contents as context for the new session

Add the following to HEARTBEAT.md to monitor token usage:
```
Check token usage - if >95%, export session history
```
For automatic triggers:

```bash
# Check token usage at regular intervals
clawdbot cron add --text "Check token usage and export if needed" --schedule "*/15 * * * *"
```
Example workflow:

**Session 1:**

```bash
python skills/remember-all-prompts-daily/scripts/export_prompts.py
```

**Session 2 (New Sprint):**

```bash
cat memory/remember-all-prompts-daily.md | tail -100
python skills/remember-all-prompts-daily/scripts/ingest_prompts.py
```
Monitor token usage via:

```bash
session_status  # Shows current token usage %
```
When token usage approaches 95%, the skill can auto-trigger the export, or you can run it manually.
Generated Mar 1, 2026
## Use Cases

- **Customer support:** Maintains continuity in support conversations across token limits, ensuring agents have full context of previous interactions when resuming chats. This prevents repetitive questions and improves resolution times by automatically archiving and restoring session histories.
- **Mental health support:** Preserves detailed conversation history between therapy sessions to provide consistent context. Automatically archives prompts and responses at token thresholds, allowing seamless continuation of discussions in subsequent sessions without losing progress.
- **Tutoring:** Enables continuous learning sessions by archiving tutoring interactions before token compaction and restoring them in new sessions. This helps tutors recall previous topics and student queries, facilitating personalized and uninterrupted educational support over time.
- **Team collaboration:** Keeps project discussions intact across token cycles in collaboration tools, automatically saving and reloading conversation histories. Team members can pick up where they left off, maintaining context for ongoing tasks and decisions without manual note-taking.
## Monetization Ideas

- **Subscription:** Offer this skill as part of a subscription-based AI toolset for businesses, charging monthly fees per user or team. Revenue comes from tiered plans based on features like archive size, automation frequency, and integration support.
- **Freemium:** Provide basic archiving functionality for free, with premium features like advanced triggers, custom integrations, and priority support available for a one-time purchase or upgrade fee. This attracts users with essential needs and converts them to paid plans.
- **Enterprise licensing:** Sell customized versions of this skill to large enterprises for integration into their internal AI systems, with licensing based on usage volume or number of employees. Revenue is generated through upfront contracts and ongoing maintenance fees.
**Integration Tip**
Add token usage checks to your system's heartbeat or cron jobs to automate archiving triggers, ensuring seamless context restoration without manual intervention.
## Related Skills

- Monitor blogs and RSS/Atom feeds for updates using the blogwatcher CLI.
- Comprehensive news aggregator that fetches, filters, and deeply analyzes real-time content from 8 major sources: Hacker News, GitHub Trending, Product Hunt, 36Kr, Tencent News, WallStreetCN, V2EX, and Weibo. Best for "daily scans", "tech news briefings", "finance updates", and "deep interpretations" of hot topics.
- Used when the user asks for news updates, daily briefings, or what's happening in the world. Fetches news from trusted international RSS feeds and can create voice summaries.
- Aggregates and summarizes the latest AI news from multiple sources, including AI news websites and web search. Provides concise news briefs with direct links to original articles. Activates when the user asks for "today's AI news", "AI updates", "latest AI developments", or a "daily AI briefing".
- Generates a warm, compact daily briefing with weather, calendar, reminders, birthdays, and important emails for cron or chat delivery.
- Provides a personalized morning report with today's reminders, undone Notion tasks, and vault storage summary for daily planning.