posthog

Interact with PostHog analytics via its REST API. Capture events, evaluate feature flags, query data with HogQL, and manage persons, insights, dashboards, experiments, and more.
Install via ClawdBot CLI:

```shell
clawdbot install simonfunk/posthog
```

The skill talks to two types of PostHog endpoints: public ingestion endpoints (project API key) and private REST endpoints (personal API key). Set the credentials first:
```shell
export POSTHOG_API_KEY="phx_..."
export POSTHOG_PROJECT_ID="12345"
export POSTHOG_PROJECT_API_KEY="phc_..." # optional, for capture/flags

# For EU Cloud:
# export POSTHOG_HOST="https://eu.posthog.com"
# export POSTHOG_INGEST_HOST="https://eu.i.posthog.com"

bash scripts/posthog.sh whoami
```

scripts/posthog.sh wraps common operations. Run `bash scripts/posthog.sh help` for full usage.
```shell
# Capture an event
bash scripts/posthog.sh capture "signup" "user_123" '{"plan":"pro"}'

# Evaluate feature flags
bash scripts/posthog.sh evaluate-flags "user_123"

# HogQL query — top events last 7 days
bash scripts/posthog.sh query "SELECT event, count() FROM events WHERE timestamp >= now() - INTERVAL 7 DAY GROUP BY event ORDER BY count() DESC LIMIT 20"

# List persons
bash scripts/posthog.sh list-persons 10 | jq '.results[] | {name, distinct_ids}'

# List feature flags
bash scripts/posthog.sh list-flags | jq '.results[] | {id, key, active}'

# Create a feature flag
echo '{"key":"new-dashboard","name":"New Dashboard","active":true,"filters":{"groups":[{"rollout_percentage":50}]}}' | \
  bash scripts/posthog.sh create-flag

# List dashboards
bash scripts/posthog.sh list-dashboards | jq '.results[] | {id, name}'
```
Two types of endpoints:

- Public (`/i/v0/e/`, `/batch/`, `/flags`): use the project API key in the request body. No auth header. No rate limits.
- Private (`/api/projects/:project_id/...`): use the personal API key via `Authorization: Bearer`. Rate limited.

The query endpoint (`POST /api/projects/:project_id/query/`) is the most powerful way to extract data. It accepts SQL-like HogQL against the tables `events`, `persons`, `sessions`, and `groups`, plus data warehouse tables.
Always include time ranges and LIMIT. Use timestamp-based pagination for large exports.
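Timestamp-based pagination can be sketched as below. `build_query` is a hypothetical helper, not part of scripts/posthog.sh, and the cursor-advance snippet assumes the exact SELECT column order shown.

```shell
# Sketch: page through events by advancing a timestamp cursor rather than
# an OFFSET, which gets expensive on large event tables.
page_size=1000

build_query() {
  # $1 is the cursor timestamp; fetch rows strictly after it, oldest first.
  printf "SELECT uuid, event, timestamp FROM events WHERE timestamp > toDateTime('%s') ORDER BY timestamp ASC LIMIT %d" "$1" "$page_size"
}

cursor="2026-01-01 00:00:00"
build_query "$cursor"
# Each page would then be fetched and the cursor advanced to the last
# row's timestamp (third column in this SELECT), e.g.:
#   rows=$(bash scripts/posthog.sh query "$(build_query "$cursor")")
#   cursor=$(echo "$rows" | jq -r '.results[-1][2]')
```

Stop when a page returns fewer than `page_size` rows.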
Rate limits:

| Type | Limit |
|------|-------|
| Analytics (insights, persons, recordings) | 240/min, 1200/hr |
| Query endpoint | 2400/hr |
| Feature flag local evaluation | 600/min |
| Other CRUD | 480/min, 4800/hr |
Limits apply per organization. On 429: back off and retry.
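A backoff-and-retry wrapper might look like the sketch below. `retry_429` is an illustrative name, and the wrapped command is expected to print only the HTTP status code (as `curl -w '%{http_code}'` does).

```shell
# Sketch: retry a command with exponential backoff while it returns HTTP 429.
retry_429() {
  attempt=0
  max_attempts=5
  while [ "$attempt" -lt "$max_attempts" ]; do
    status=$("$@")
    if [ "$status" != "429" ]; then
      echo "$status"          # success or a non-rate-limit error: stop retrying
      return 0
    fi
    attempt=$((attempt + 1))
    sleep $((1 << attempt))   # 2s, 4s, 8s, 16s, 32s
  done
  echo "still rate limited after $max_attempts attempts" >&2
  return 1
}

# Usage (prints only the status code):
#   retry_429 curl -s -o /dev/null -w '%{http_code}' \
#     -H "Authorization: Bearer $POSTHOG_API_KEY" \
#     "$POSTHOG_HOST/api/projects/$POSTHOG_PROJECT_ID/feature_flags/"
```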
API hosts:

| Cloud | Public (ingestion) | Private (API) |
|-------|--------------------|---------------|
| US | us.i.posthog.com | us.posthog.com |
| EU | eu.i.posthog.com | eu.posthog.com |
The /api/projects/:project_id/events/ endpoint is deprecated. Use HogQL queries or batch exports instead.
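For ad-hoc event listings, an equivalent HogQL query might look like this; the column list is illustrative, not exhaustive.

```shell
# Sketch: list recent raw events via HogQL instead of the deprecated
# events endpoint.
hogql="SELECT uuid, event, distinct_id, timestamp
FROM events
WHERE timestamp >= now() - INTERVAL 1 DAY
ORDER BY timestamp DESC
LIMIT 100"
echo "$hogql"
# bash scripts/posthog.sh query "$hogql"
```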
```shell
# Private endpoint
curl -H "Authorization: Bearer $POSTHOG_API_KEY" \
  "$POSTHOG_HOST/api/projects/$POSTHOG_PROJECT_ID/feature_flags/"

# HogQL query
curl -H "Authorization: Bearer $POSTHOG_API_KEY" \
  -H "Content-Type: application/json" \
  -X POST -d '{"query":{"kind":"HogQLQuery","query":"SELECT count() FROM events WHERE timestamp >= now() - INTERVAL 1 DAY"}}' \
  "$POSTHOG_HOST/api/projects/$POSTHOG_PROJECT_ID/query/"

# Capture event (public)
curl -H "Content-Type: application/json" \
  -X POST -d '{"api_key":"'"$POSTHOG_PROJECT_API_KEY"'","event":"test","distinct_id":"u1"}' \
  "$POSTHOG_INGEST_HOST/i/v0/e/"
```
See references/api-endpoints.md for complete endpoint listing with parameters, body schemas, scopes, and response formats.
Sections: Public Endpoints (Capture, Batch, Flags), Private Endpoints (Persons, Feature Flags, Insights, Dashboards, Annotations, Cohorts, Experiments, Surveys, Actions, Session Recordings, Users, Definitions), Query API (HogQL).
Generated Mar 1, 2026
An e-commerce platform uses this skill to capture events like product views, cart additions, and purchases via public endpoints, enabling real-time analytics without rate limits. It also employs HogQL queries to analyze sales trends and customer segments, helping optimize marketing campaigns and inventory management based on data-driven insights.
A SaaS company leverages the skill to manage feature flags, allowing gradual rollouts and A/B testing of new features using private endpoints. By evaluating flags for specific users, they can measure impact on engagement and retention, while using insights dashboards to monitor experiment results and make data-informed product decisions.
A marketing team uses this skill to create and manage cohorts based on user actions, such as sign-ups or in-app events, via private endpoints for persons and cohorts. They run HogQL queries to analyze cohort performance over time, enabling targeted campaigns and personalized messaging to improve conversion rates and customer loyalty.
A startup integrates the skill to build custom dashboards with insights and visualizations using private endpoints, tracking key metrics like user activation and retention. They capture events via public endpoints to monitor product usage in real-time, helping teams quickly iterate on features and align with business goals through data visualization.
A UX team utilizes the skill to access session recordings via private endpoints, analyzing user interactions to identify pain points and improve website or app usability. Combined with event capture and HogQL queries, they correlate recordings with behavioral data to enhance user experience and reduce churn through targeted optimizations.
A business offers analytics services by integrating this skill to provide clients with real-time event tracking, dashboards, and insights. Revenue is generated through tiered subscription plans based on data volume and feature access, such as advanced HogQL queries or cohort management, catering to SMEs needing scalable analytics solutions.
A consultancy uses this skill to help clients set up and optimize PostHog analytics, including custom event schemas, feature flag strategies, and data queries. Revenue comes from project-based fees or retainer models for ongoing support, enabling businesses to leverage analytics for growth without in-house expertise.
An agency incorporates this skill into product development workflows, using feature flags and A/B testing to validate hypotheses and capture user feedback. Revenue is generated from development contracts and performance-based pricing, where success metrics tied to analytics insights drive client outcomes and iterative improvements.
💬 Integration Tip
Ensure environment variables like POSTHOG_API_KEY and POSTHOG_PROJECT_ID are correctly set, and use the helper script for common operations to simplify API interactions and avoid rate limit issues.