cloud-backup — Back up and restore OpenClaw state. Creates local archives and uploads them to S3-compatible cloud storage (AWS S3, Cloudflare R2, Backblaze B2, MinIO, DigitalOcean Spaces).
Install via ClawdBot CLI:
clawdbot install obuchowski/cloud-backup

Backs up OpenClaw state to a local archive and uploads it to cloud storage.
Follow the steps below. After execution, inform the user about any implicit defaults that were applied.
bash "{baseDir}/scripts/cloud-backup.sh" backup full
Default mode is full. Use workspace, skills, or settings only when the user explicitly asks for a narrower backup scope.
Look at the script output from Step 1. If it contains:
WARN: Encryption is disabled — backup archive will be stored in plaintext.
ask the user:
"Your backups are not encrypted. Archives contain config, credentials, and API keys in cleartext. Want to set a passphrase? (AES-256, just the passphrase needed to restore — no key files.)"
If the user agrees, apply the settings via gateway config.patch and tell the user what was saved:

- skills.entries.cloud-backup.config.encrypt = true
- skills.entries.cloud-backup.env.GPG_PASSPHRASE = ""

Then re-run the backup so the archive is encrypted.
Always execute this step and report the result. Backups may contain secrets — prompt every time encryption is off.
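If the user opts in, the config.patch body could look like the following sketch (the key paths are taken from the step above; the passphrase value is an illustrative placeholder, not a real secret):

```json
{
  "skills": {
    "entries": {
      "cloud-backup": {
        "config": { "encrypt": true },
        "env": { "GPG_PASSPHRASE": "example-passphrase" }
      }
    }
  }
}
```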
Look at the script output. If it contains WARN: Cloud storage is not configured, go to Step 4. Otherwise go to Step 5.
Tell the user the backup was saved locally, then ask:
"Cloud storage isn't configured yet — backups are local only. Want to set up cloud upload? I support AWS S3, Cloudflare R2, Backblaze B2, MinIO, DigitalOcean Spaces, or any other S3-compatible service."
If the user declines, set config.upload = false via gateway config.patch and confirm the change to the user. Done.

Always execute this step — offer cloud setup when it's missing and report what was configured.
Report the backup paths from the script output to the user.
Then check if a daily cron job already exists for cloud-backup (cron action=list). If no cron job exists, create one with the default schedule and tell the user:
{
"schedule": { "kind": "cron", "expr": "0 2 * * *" },
"payload": { "kind": "agentTurn", "message": "Run cloud-backup: backup full" },
"sessionTarget": "isolated"
}
Default: daily at 2 AM. Use a different schedule only if the user explicitly asked for one.
Tell the user: "Scheduled daily backup at 2 AM." (or whatever time was set).
Execute this step by default unless the user asked to skip it. Report the result: cron job created, or creation failed.
When the user agrees to configure cloud storage:
1. Read the matching file in references/providers/ — it has exact config keys, endpoint format, and credential steps.
2. Set bucket, credentials, and endpoint (if non-AWS) via gateway config.patch.
3. Run status to verify connectivity, then re-run the backup.

All commands go through the script:

bash "{baseDir}/scripts/cloud-backup.sh" <command>
| Command | What it does |
|---------|-------------|
| backup [full\|workspace\|skills\|settings] | Create archive + upload if configured. Default: full |
| list | Show local + remote backups |
| restore | Restore from local or cloud. Always --dry-run first |
| cleanup | Prune old archives (local: capped at 7; cloud: count + age) |
| status | Show current config and dependency check |
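The local half of cleanup can be pictured with a short sketch. This is not the script's actual code — just an illustration of the "keep the newest 7 local archives" rule from the table, with prune_local as a hypothetical helper name:

```shell
#!/usr/bin/env bash
# Illustrative sketch: keep the newest $keep archives in $dir, delete the rest.
prune_local() {
  local dir="$1" keep="${2:-7}"
  # List newest first; everything past line $keep gets pruned.
  ls -1t "$dir"/*.tar.gz 2>/dev/null | tail -n +"$((keep + 1))" | while read -r f; do
    rm -f -- "$f"
  done
}

# Demo: 10 dummy archives with distinct timestamps, then prune to 7.
demo_dir=$(mktemp -d)
for i in $(seq 1 10); do
  touch -t "20200101$(printf '%02d' "$i")00" "$demo_dir/backup-$i.tar.gz"
done
prune_local "$demo_dir" 7
ls -1 "$demo_dir" | wc -l
```

The real script also applies count and age limits to cloud copies; this sketch covers only the local cap.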
All settings live under skills.entries.cloud-backup in the OpenClaw config. Don't write defaults — the script handles them.
config.*

| Key | Default | Description |
|-----|---------|-------------|
| bucket | — | Storage bucket name (required for cloud) |
| region | us-east-1 | Region hint |
| endpoint | (none) | S3-compatible endpoint (required for non-AWS) |
| profile | (none) | Named AWS CLI profile (alternative to keys) |
| upload | true | Upload to cloud after backup |
| encrypt | false | GPG-encrypt archives |
| retentionCount | 10 | Cloud: keep N backups. Local: capped at 7 |
| retentionDays | 30 | Cloud only: delete archives older than N days |
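Put together, a populated config might look like this (the bucket name and endpoint are illustrative placeholders; the key names come from the table above):

```json
{
  "skills": {
    "entries": {
      "cloud-backup": {
        "config": {
          "bucket": "my-openclaw-backups",
          "region": "us-east-1",
          "endpoint": "https://s3.example.com",
          "upload": true,
          "encrypt": false,
          "retentionCount": 10,
          "retentionDays": 30
        }
      }
    }
  }
}
```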
env.*

| Key | Description |
|-----|-------------|
| ACCESS_KEY_ID | S3-compatible access key |
| SECRET_ACCESS_KEY | S3-compatible secret key |
| SESSION_TOKEN | Optional temporary token |
| GPG_PASSPHRASE | For automated encryption/decryption |
Read only the relevant provider reference during setup:

- references/providers/aws-s3.md
- references/providers/cloudflare-r2.md
- references/providers/backblaze-b2.md
- references/providers/minio.md
- references/providers/digitalocean-spaces.md
- references/providers/other.md — any S3-compatible service

See references/security.md for credential handling and troubleshooting.
Generated Mar 1, 2026
A freelance developer uses OpenClaw for multiple client projects. This skill automatically backs up workspace configurations, skills, and settings daily to a cloud service like Backblaze B2, ensuring project states are preserved against local failures. It simplifies recovery if switching devices or after system crashes, with encryption options for sensitive client data.
A small business employs OpenClaw to monitor IT systems and automate tasks. The cloud backup skill schedules daily archives of monitoring configurations and logs to AWS S3, providing disaster recovery without manual intervention. It helps maintain operational continuity by enabling quick restoration after configuration errors or data loss.
Researchers in academia use OpenClaw for data analysis scripts and experimental setups. This skill backs up critical research environments and skill sets to cloud storage like DigitalOcean Spaces, with encryption for confidential data. It facilitates collaboration by allowing easy restoration on shared or new systems, safeguarding against accidental deletions.
A startup's DevOps team integrates OpenClaw into CI/CD pipelines for automation. The skill ensures backup of automation scripts and agent states to a MinIO instance, with configurable retention for compliance. It reduces downtime by enabling rapid recovery from misconfigurations during deployments, supporting agile development cycles.
An individual uses OpenClaw as a personal assistant for daily tasks and reminders. This skill backs up custom skills and settings to Cloudflare R2, offering a cost-effective cloud solution. It allows seamless migration between personal devices, ensuring personalized configurations are not lost during upgrades or replacements.
Offer this skill as part of a premium subscription for OpenClaw users, providing enhanced cloud storage integration, priority support, and advanced encryption features. Revenue is generated through monthly or annual fees, targeting users who require reliable, automated backups for critical workflows.
Provide basic local backup functionality for free, while charging for cloud uploads to services like AWS S3 or Backblaze B2 with higher retention limits and faster restore options. Upsell users based on storage capacity and additional security features, driving revenue from power users and businesses.
License this skill to enterprises for integration into their internal OpenClaw deployments, offering custom configurations, dedicated cloud setup assistance, and SLA-backed support. Revenue comes from one-time licensing fees and ongoing support contracts, catering to organizations with strict compliance and reliability needs.
💬 Integration Tip
Ensure the required binaries (bash, tar, jq, aws) are installed and accessible in the system PATH before use to avoid script failures.
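A minimal preflight sketch for that check, assuming only that the four binaries named above should be on PATH (check_deps is a hypothetical helper, not part of the skill):

```shell
#!/usr/bin/env bash
# Illustrative preflight: report any required binaries missing from PATH.
check_deps() {
  local missing=()
  for bin in "$@"; do
    command -v "$bin" >/dev/null 2>&1 || missing+=("$bin")
  done
  if ((${#missing[@]})); then
    echo "missing: ${missing[*]}"
    return 1
  fi
  echo "all dependencies present"
}

check_deps bash tar   # jq and aws omitted here so the demo passes on minimal systems
```

In practice you would call it as check_deps bash tar jq aws and abort the backup when it returns non-zero.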