# dual-stream-architecture

Dual-stream event publishing combining Kafka for durability with Redis Pub/Sub for real-time delivery. Use when building event-driven systems that need both guaranteed delivery and low-latency updates. Triggers on: dual stream, event publishing, Kafka Redis, real-time events, pub/sub, streaming architecture.
Install via the ClawdBot CLI:

```sh
clawdbot install wpank/dual-stream-architecture
```

or via clawhub:

```sh
npx clawhub@latest install dual-stream-architecture
```

Publish events to Kafka (durability) and Redis Pub/Sub (real-time) simultaneously for systems that need both guaranteed delivery and instant updates.
```go
type DualPublisher struct {
	kafka  *kafka.Writer
	redis  *redis.Client
	logger *slog.Logger
}

func (p *DualPublisher) Publish(ctx context.Context, event Event) error {
	// 1. Kafka: critical path - must succeed.
	payload, err := json.Marshal(event)
	if err != nil {
		return fmt.Errorf("marshal event: %w", err)
	}
	err = p.kafka.WriteMessages(ctx, kafka.Message{
		Key:   []byte(event.SourceID),
		Value: payload,
	})
	if err != nil {
		return fmt.Errorf("kafka publish failed: %w", err)
	}

	// 2. Redis: best-effort - don't fail the operation.
	p.publishToRedis(ctx, event)
	return nil
}
```
```go
func (p *DualPublisher) publishToRedis(ctx context.Context, event Event) {
	// Lightweight payload: the full event already lives in Kafka.
	notification := map[string]interface{}{
		"id":        event.ID,
		"type":      event.Type,
		"source_id": event.SourceID,
	}
	payload, err := json.Marshal(notification)
	if err != nil {
		p.logger.Warn("marshal notification failed", "error", err)
		return
	}
	channel := fmt.Sprintf("events:%s:%s", event.SourceType, event.SourceID)

	// Fire and forget: log errors but don't propagate them.
	if err := p.redis.Publish(ctx, channel, payload).Err(); err != nil {
		p.logger.Warn("redis publish failed", "error", err)
	}
}
```
```
┌──────────────┐     ┌─────────────────┐     ┌──────────────┐
│   Ingester   │────▶│  DualPublisher  │────▶│    Kafka     │───▶ Event Processor
└──────────────┘     │                 │     │  (durable)   │
                     │                 │     └──────────────┘
                     │                 │     ┌──────────────┐
                     │                 │────▶│ Redis PubSub │───▶ WebSocket Gateway
                     └─────────────────┘     │ (real-time)  │
                                             └──────────────┘
```
Channels follow the pattern:

```
events:{source_type}:{source_id}
```

Examples:
- `events:user:octocat` - events for user octocat
- `events:repo:owner/repo` - events for a repository
- `events:org:microsoft` - events for an organization
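Two small helpers for building channel names and the matching `PSUBSCRIBE`-style glob pattern keep the convention in one place. These helper names are illustrative, not part of any library:

```go
package main

import "fmt"

// channelFor builds the Pub/Sub channel name for a single event source.
func channelFor(sourceType, sourceID string) string {
	return fmt.Sprintf("events:%s:%s", sourceType, sourceID)
}

// patternFor builds a glob pattern covering all sources of one type,
// suitable for Redis PSUBSCRIBE.
func patternFor(sourceType string) string {
	return fmt.Sprintf("events:%s:*", sourceType)
}

func main() {
	fmt.Println(channelFor("user", "octocat"))    // events:user:octocat
	fmt.Println(channelFor("repo", "owner/repo")) // events:repo:owner/repo
	fmt.Println(patternFor("org"))                // events:org:*
}
```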
For high throughput:
```go
func (p *DualPublisher) PublishBatch(ctx context.Context, events []Event) error {
	// 1. Batch to Kafka.
	messages := make([]kafka.Message, len(events))
	for i, event := range events {
		payload, err := json.Marshal(event)
		if err != nil {
			return fmt.Errorf("marshal event %s: %w", event.ID, err)
		}
		messages[i] = kafka.Message{
			Key:   []byte(event.SourceID),
			Value: payload,
		}
	}
	if err := p.kafka.WriteMessages(ctx, messages...); err != nil {
		return fmt.Errorf("kafka batch failed: %w", err)
	}

	// 2. Redis: pipeline the notifications for efficiency.
	pipe := p.redis.Pipeline()
	for _, event := range events {
		channel := fmt.Sprintf("events:%s:%s", event.SourceType, event.SourceID)
		notification, _ := json.Marshal(map[string]interface{}{
			"id":   event.ID,
			"type": event.Type,
		})
		pipe.Publish(ctx, channel, notification)
	}
	if _, err := pipe.Exec(ctx); err != nil {
		p.logger.Warn("redis batch failed", "error", err)
	}
	return nil
}
```
| Requirement | Stream | Why |
|-------------|--------|-----|
| Must not lose event | Kafka only | Ack required, replicated |
| User sees immediately | Redis only | Sub-ms delivery |
| Both durability + real-time | Dual stream | This pattern |
| High volume (>10k/sec) | Kafka, batch Redis | Redis can bottleneck |
| Many subscribers per channel | Redis + local fan-out | Don't hammer Redis |
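The "Redis + local fan-out" row can be sketched as one upstream subscription per process, distributed to in-process subscribers over Go channels so Redis sees a single connection regardless of how many WebSocket clients attach. This is a minimal stdlib-only sketch; the `Fanout` type and its behavior (dropping messages for slow consumers rather than blocking) are illustrative design choices, not a prescribed API:

```go
package main

import (
	"fmt"
	"sync"
)

// Fanout distributes messages from one upstream subscription to many
// in-process subscribers.
type Fanout struct {
	mu   sync.Mutex
	subs []chan string
}

// Subscribe registers an in-process consumer with a small buffer.
func (f *Fanout) Subscribe() <-chan string {
	f.mu.Lock()
	defer f.mu.Unlock()
	ch := make(chan string, 16)
	f.subs = append(f.subs, ch)
	return ch
}

// Deliver is called once per message arriving on the single upstream
// Redis subscription.
func (f *Fanout) Deliver(msg string) {
	f.mu.Lock()
	defer f.mu.Unlock()
	for _, ch := range f.subs {
		select {
		case ch <- msg:
		default: // drop for a slow consumer rather than block the fan-out
		}
	}
}

func main() {
	f := &Fanout{}
	a, b := f.Subscribe(), f.Subscribe()
	f.Deliver(`{"id":"e1","type":"push"}`)
	fmt.Println(<-a) // both subscribers receive the message
	fmt.Println(<-b)
}
```

Dropping on a full buffer is acceptable here because Redis delivery is already best-effort; durable consumers read from Kafka.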
| Case | Solution |
|------|----------|
| Redis down | Log warning, continue with Kafka only |
| Client connects mid-stream | Query API for recent events, then subscribe |
| High channel cardinality | Use wildcard patterns or aggregate channels |
| Kafka backpressure | Buffer in memory with timeout, fail if full |
| Need event replay | Consume from Kafka from offset, not Redis |
Generated Mar 1, 2026
A trading platform uses dual-stream architecture to publish market data events. Kafka ensures no trade or price update is lost for compliance and auditing, while Redis Pub/Sub pushes live updates to user dashboards with sub-millisecond latency, enabling traders to react instantly to market changes.
An online retailer implements this skill to handle order status updates. Kafka durably stores all order events for inventory management and analytics, while Redis delivers real-time notifications to customers via web or mobile apps, showing order confirmation, shipping, and delivery status as they happen.
A logistics company uses dual-stream publishing for vehicle telemetry data. Kafka captures all sensor readings for long-term analysis and regulatory reporting, while Redis streams real-time location and health updates to a central dashboard, allowing dispatchers to monitor fleet movements and respond to incidents immediately.
A social networking app employs this architecture for user activity events. Kafka stores posts, likes, and comments for data processing and recommendations, while Redis pushes live updates to followers' feeds via WebSocket connections, ensuring users see new content without refresh delays.
A hospital integrates dual-stream publishing for patient vital signs. Kafka archives all medical data for electronic health records and compliance, while Redis broadcasts real-time alerts to nurses' stations and mobile devices, enabling immediate response to critical changes in patient conditions.
Offer a cloud-based event streaming platform with dual-stream capabilities as a subscription service. Clients pay monthly fees based on event volume and throughput, benefiting from managed infrastructure, scalability, and reduced operational overhead for real-time applications.
Provide expert consulting to design and deploy dual-stream architectures for enterprise clients. Revenue comes from project-based fees for system integration, custom development, and ongoing support, helping businesses optimize event-driven workflows for durability and low latency.
Distribute the dual-stream skill as open-source software to build community adoption. Generate revenue by offering premium enterprise features such as advanced monitoring, security enhancements, and dedicated support, targeting large organizations with high-scale needs.
💬 Integration Tip
Ensure Kafka is configured for high durability with replication, and monitor Redis Pub/Sub performance to avoid bottlenecks in high-volume scenarios.