gcloud
Manage Google Cloud Platform resources via gcloud CLI. Use for Compute Engine VMs, Cloud Run services, Firebase Hosting, Cloud Storage, and project management. Covers deployment, monitoring, logs, and SSH access.
Install via ClawdBot CLI:
clawdbot install jortega0033/gcloud
Manage GCP resources using gcloud, gsutil, and firebase CLIs.
# Download and extract
cd ~ && curl -O https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/google-cloud-cli-linux-x86_64.tar.gz
tar -xzf google-cloud-cli-linux-x86_64.tar.gz
# Install (adds to PATH via .bashrc)
./google-cloud-sdk/install.sh --quiet --path-update true
# Reload shell or source
source ~/.bashrc
# Authenticate
gcloud auth login
# Install Firebase CLI
npm install -g firebase-tools
firebase login
# List authenticated accounts
gcloud auth list
# Switch active account
gcloud config set account EMAIL
# List projects
gcloud projects list
# Set default project
gcloud config set project PROJECT_ID
# View current config
gcloud config list
# All instances in a project (across all zones)
gcloud compute instances list --project PROJECT_ID
# With specific fields
gcloud compute instances list --project PROJECT_ID \
--format="table(name,zone,status,networkInterfaces[0].accessConfigs[0].natIP)"
gcloud compute instances start INSTANCE_NAME --zone ZONE --project PROJECT_ID
gcloud compute instances stop INSTANCE_NAME --zone ZONE --project PROJECT_ID
gcloud compute instances reset INSTANCE_NAME --zone ZONE --project PROJECT_ID
# Interactive SSH
gcloud compute ssh INSTANCE_NAME --zone ZONE --project PROJECT_ID
# Run command remotely
gcloud compute ssh INSTANCE_NAME --zone ZONE --project PROJECT_ID --command "uptime"
# With tunneling (e.g., for local port forwarding)
gcloud compute ssh INSTANCE_NAME --zone ZONE --project PROJECT_ID -- -L 8080:localhost:8080
# Serial port output (boot logs)
gcloud compute instances get-serial-port-output INSTANCE_NAME --zone ZONE --project PROJECT_ID
# Tail logs via SSH
gcloud compute ssh INSTANCE_NAME --zone ZONE --project PROJECT_ID --command "journalctl -f"
# List all services in a region
gcloud run services list --region REGION --project PROJECT_ID
# All regions
gcloud run services list --project PROJECT_ID
# Deploy from source (builds container automatically)
gcloud run deploy SERVICE_NAME \
--source . \
--region REGION \
--project PROJECT_ID \
--allow-unauthenticated
# Deploy existing container image
gcloud run deploy SERVICE_NAME \
--image gcr.io/PROJECT_ID/IMAGE:TAG \
--region REGION \
--project PROJECT_ID
gcloud run services describe SERVICE_NAME --region REGION --project PROJECT_ID
# Read recent logs (use "gcloud beta run services logs tail" to stream)
gcloud run services logs read SERVICE_NAME --region REGION --project PROJECT_ID --limit 50
# Or use Cloud Logging
gcloud logging read "resource.type=cloud_run_revision AND resource.labels.service_name=SERVICE_NAME" \
--project PROJECT_ID --limit 20 --format="table(timestamp,textPayload)"
gcloud run services update SERVICE_NAME \
--region REGION \
--project PROJECT_ID \
--set-env-vars "KEY1=value1,KEY2=value2"
# Route 100% traffic to latest
gcloud run services update-traffic SERVICE_NAME --to-latest --region REGION --project PROJECT_ID
# Split traffic (canary)
gcloud run services update-traffic SERVICE_NAME \
--to-revisions=REVISION1=90,REVISION2=10 \
--region REGION --project PROJECT_ID
# List Firebase projects
firebase projects:list
# Deploy everything (hosting + functions + rules)
firebase deploy --project PROJECT_ID
# Hosting only
firebase deploy --only hosting --project PROJECT_ID
# Specific site (multi-site setup)
firebase deploy --only hosting:SITE_NAME --project PROJECT_ID
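For the multi-site case, hosting targets are typically declared in firebase.json. A minimal sketch (the target name and public directory are placeholders, not values from this skill):

```json
{
  "hosting": [
    {
      "target": "SITE_NAME",
      "public": "dist",
      "ignore": ["firebase.json", "**/.*", "**/node_modules/**"]
    }
  ]
}
```

Targets are linked to actual sites once per project with `firebase target:apply hosting SITE_NAME SITE_ID`.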
# Create preview channel
firebase hosting:channel:deploy CHANNEL_NAME --project PROJECT_ID
# List channels
firebase hosting:channel:list --project PROJECT_ID
# Delete channel
firebase hosting:channel:delete CHANNEL_NAME --project PROJECT_ID
# List recent deploys
firebase hosting:releases:list --project PROJECT_ID
# Roll back to the previous release
firebase hosting:rollback --project PROJECT_ID
# List buckets
gsutil ls
# List contents
gsutil ls gs://BUCKET_NAME/
# Copy file
gsutil cp LOCAL_FILE gs://BUCKET_NAME/path/
gsutil cp gs://BUCKET_NAME/path/file LOCAL_PATH
# Sync directory
gsutil -m rsync -r LOCAL_DIR gs://BUCKET_NAME/path/
# Make bucket public (anyone on the internet can read objects)
gsutil iam ch allUsers:objectViewer gs://BUCKET_NAME
# Read recent logs
gcloud logging read "resource.type=gce_instance" --project PROJECT_ID --limit 20
# Filter by severity
gcloud logging read "severity>=ERROR" --project PROJECT_ID --limit 20
# Specific resource
gcloud logging read "resource.type=cloud_run_revision AND resource.labels.service_name=my-service" \
--project PROJECT_ID --limit 20
# List available metrics
gcloud monitoring metrics list --project PROJECT_ID | head -50
# Describe the project's metrics scope
gcloud monitoring metrics-scopes describe projects/PROJECT_ID
# List billing accounts
gcloud billing accounts list
# Get billing account linked to project
gcloud billing projects describe PROJECT_ID
# View cost breakdown (requires billing export to BigQuery, or use the console)
# Quick proxy for cost drivers: list enabled APIs
gcloud services list --enabled --project PROJECT_ID
# Create a budget (older SDK versions may require "gcloud beta billing budgets")
gcloud billing budgets create \
--billing-account=BILLING_ACCOUNT_ID \
--display-name="Monthly Budget" \
--budget-amount=50EUR \
--threshold-rule=percent=50 \
--threshold-rule=percent=90 \
--threshold-rule=percent=100
# List budgets
gcloud billing budgets list --billing-account=BILLING_ACCOUNT_ID
# Describe budget
gcloud billing budgets describe BUDGET_ID --billing-account=BILLING_ACCOUNT_ID
# Stop unused VMs (saves $$)
gcloud compute instances stop INSTANCE_NAME --zone ZONE --project PROJECT_ID
# Schedule auto-start/stop (use Cloud Scheduler + Cloud Functions or cron)
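The cron option above can be sketched as a plain crontab fragment; INSTANCE_NAME, ZONE, and PROJECT_ID are placeholders, and it assumes gcloud is on the cron user's PATH with valid credentials:

```shell
# Crontab sketch: stop the VM at 19:00 and start it at 08:00 on weekdays
0 19 * * 1-5 gcloud compute instances stop INSTANCE_NAME --zone ZONE --project PROJECT_ID --quiet
0 8 * * 1-5 gcloud compute instances start INSTANCE_NAME --zone ZONE --project PROJECT_ID --quiet
```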
# Check for idle resources
gcloud recommender recommendations list \
--project=PROJECT_ID \
--location=global \
--recommender=google.compute.instance.IdleResourceRecommender
# Enable API
gcloud services enable secretmanager.googleapis.com --project PROJECT_ID
# Create a secret
echo -n "my-secret-value" | gcloud secrets create SECRET_NAME \
--data-file=- \
--project PROJECT_ID
# Or from file
gcloud secrets create SECRET_NAME --data-file=./secret.txt --project PROJECT_ID
# Get latest version
gcloud secrets versions access latest --secret=SECRET_NAME --project PROJECT_ID
# Get specific version
gcloud secrets versions access 1 --secret=SECRET_NAME --project PROJECT_ID
# List all secrets
gcloud secrets list --project PROJECT_ID
# List versions of a secret
gcloud secrets versions list SECRET_NAME --project PROJECT_ID
# Add new version
echo -n "new-value" | gcloud secrets versions add SECRET_NAME --data-file=- --project PROJECT_ID
# Disable old version
gcloud secrets versions disable VERSION_ID --secret=SECRET_NAME --project PROJECT_ID
# Delete version (permanent!)
gcloud secrets versions destroy VERSION_ID --secret=SECRET_NAME --project PROJECT_ID
# Deploy with secret as env var
gcloud run deploy SERVICE_NAME \
--image IMAGE \
--region REGION \
--project PROJECT_ID \
--set-secrets="ENV_VAR_NAME=SECRET_NAME:latest"
# Mount as file
gcloud run deploy SERVICE_NAME \
--image IMAGE \
--region REGION \
--project PROJECT_ID \
--set-secrets="/path/to/secret=SECRET_NAME:latest"
# Enable API
gcloud services enable artifactregistry.googleapis.com --project PROJECT_ID
# Create Docker repository
gcloud artifacts repositories create REPO_NAME \
--repository-format=docker \
--location=REGION \
--project PROJECT_ID \
--description="Docker images"
# Configure Docker to use gcloud credentials
gcloud auth configure-docker REGION-docker.pkg.dev
# Build with Cloud Build (no local Docker needed)
gcloud builds submit --tag REGION-docker.pkg.dev/PROJECT_ID/REPO_NAME/IMAGE:TAG
# Or with local Docker
docker build -t REGION-docker.pkg.dev/PROJECT_ID/REPO_NAME/IMAGE:TAG .
docker push REGION-docker.pkg.dev/PROJECT_ID/REPO_NAME/IMAGE:TAG
# List images
gcloud artifacts docker images list REGION-docker.pkg.dev/PROJECT_ID/REPO_NAME
# List tags for an image
gcloud artifacts docker tags list REGION-docker.pkg.dev/PROJECT_ID/REPO_NAME/IMAGE
# Delete image
gcloud artifacts docker images delete REGION-docker.pkg.dev/PROJECT_ID/REPO_NAME/IMAGE:TAG
# Enable API
gcloud services enable sqladmin.googleapis.com --project PROJECT_ID
# Create PostgreSQL instance
gcloud sql instances create INSTANCE_NAME \
--database-version=POSTGRES_15 \
--tier=db-f1-micro \
--region=REGION \
--project PROJECT_ID
# Create MySQL instance
gcloud sql instances create INSTANCE_NAME \
--database-version=MYSQL_8_0 \
--tier=db-f1-micro \
--region=REGION \
--project PROJECT_ID
# Create database
gcloud sql databases create DB_NAME --instance=INSTANCE_NAME --project PROJECT_ID
# List databases
gcloud sql databases list --instance=INSTANCE_NAME --project PROJECT_ID
# Create user
gcloud sql users create USERNAME \
--instance=INSTANCE_NAME \
--password=PASSWORD \
--project PROJECT_ID
# List users
gcloud sql users list --instance=INSTANCE_NAME --project PROJECT_ID
# Connect via Cloud SQL Proxy (recommended)
# First, download proxy: https://cloud.google.com/sql/docs/mysql/sql-proxy
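A minimal sketch of downloading and running the v2 Cloud SQL Auth Proxy on Linux; the version number and local port are assumptions, so check the docs page above for the current release:

```shell
# Download the proxy binary, make it executable, and listen on localhost:5432
curl -o cloud-sql-proxy https://storage.googleapis.com/cloud-sql-connectors/cloud-sql-proxy/v2.14.0/cloud-sql-proxy.linux.amd64
chmod +x cloud-sql-proxy
./cloud-sql-proxy PROJECT_ID:REGION:INSTANCE_NAME --port 5432
```

With the proxy running, point your database client at 127.0.0.1:5432 as if the instance were local.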
# Direct connection (requires public IP & authorized networks)
gcloud sql connect INSTANCE_NAME --user=USERNAME --project PROJECT_ID
# Get connection info
gcloud sql instances describe INSTANCE_NAME --project PROJECT_ID \
--format="value(connectionName)"
# Create on-demand backup
gcloud sql backups create --instance=INSTANCE_NAME --project PROJECT_ID
# List backups
gcloud sql backups list --instance=INSTANCE_NAME --project PROJECT_ID
# Restore from backup
gcloud sql backups restore BACKUP_ID --restore-instance=INSTANCE_NAME --project PROJECT_ID
# Deploy with Cloud SQL connection
gcloud run deploy SERVICE_NAME \
--image IMAGE \
--region REGION \
--project PROJECT_ID \
--add-cloudsql-instances=PROJECT_ID:REGION:INSTANCE_NAME \
--set-env-vars="DB_HOST=/cloudsql/PROJECT_ID:REGION:INSTANCE_NAME"
# Enable an API
gcloud services enable run.googleapis.com --project PROJECT_ID
gcloud services enable compute.googleapis.com --project PROJECT_ID
# Check IAM roles
gcloud projects get-iam-policy PROJECT_ID --flatten="bindings[].members" \
--format="table(bindings.role)" --filter="bindings.members:EMAIL"
# Re-authenticate when credentials expire
gcloud auth login
gcloud auth application-default login # For ADC (used by client libraries)
gcloud auth login --force # Force a fresh login flow
Generated Mar 1, 2026
A tech startup uses this skill to deploy a web application on Cloud Run, manage environment variables for staging and production, and monitor logs for debugging. They leverage Firebase Hosting for frontend deployment with preview channels for testing features before release.
An e-commerce company utilizes the skill to manage Compute Engine VMs for their backend servers, including starting and stopping instances based on traffic patterns. They use Cloud Storage for storing product images and sync data with gsutil commands for backups.
A media organization employs this skill to host videos and files on Cloud Storage, making them publicly accessible via gsutil. They deploy microservices on Cloud Run for content processing and use logging to track errors and performance metrics.
A software development team integrates the skill into CI/CD pipelines to automate deployments on Cloud Run, manage SSH access to VMs for debugging, and view serial port logs for instance health checks. They use traffic splitting for canary releases.
An educational institution uses the skill to scale Compute Engine VMs during peak usage times like exams, deploy learning tools on Cloud Run, and monitor costs via billing commands. They manage multiple projects for different departments with gcloud config.
Companies can deploy and manage their SaaS applications on GCP using this skill, offering scalable services via Cloud Run and Firebase Hosting. Revenue is generated through subscription fees based on usage tiers and feature access.
IT consulting firms leverage this skill to provide managed GCP services for clients, handling deployments, monitoring, and cost optimization. Revenue comes from retainer fees or project-based contracts for infrastructure management.
Businesses use the skill to host and distribute digital content via Cloud Storage and Firebase, charging clients for storage, bandwidth, and premium hosting features. Revenue is based on pay-as-you-go pricing or bundled packages.
💬 Integration Tip
Integrate this skill with CI/CD tools like GitHub Actions or Jenkins to automate deployments and monitoring, and use gcloud commands in scripts for batch operations to save time.
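As a sketch of the batch-operations idea, the loop below stops every running instance in a project; it assumes an authenticated gcloud and a placeholder PROJECT_ID:

```shell
# List running instances as "name<TAB>zone" pairs, then stop each one
gcloud compute instances list --project PROJECT_ID \
  --filter="status=RUNNING" --format="value(name,zone.basename())" \
| while read -r NAME ZONE; do
    gcloud compute instances stop "$NAME" --zone "$ZONE" --project PROJECT_ID --quiet
  done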