reachy-mini

Control a Reachy Mini robot (by Pollen Robotics / Hugging Face) via its REST API and SSH. Use for any request involving the Reachy Mini robot — moving the head, body, or antennas; playing emotions or dances; capturing camera snapshots; adjusting volume; managing apps; checking robot status; or any physical robot interaction. The robot has a 6-DoF head, 360° body rotation, two animated antennas, a wide-angle camera (with non-disruptive WebRTC snapshot), a 4-mic array, and a speaker.
Install via ClawdBot CLI:
clawdbot install reachy-mini

Use the CLI script or curl to control the robot. The script lives at:
~/clawd/skills/reachy-mini/scripts/reachy.sh
Set the robot IP via REACHY_HOST env var or --host flag. Default: 192.168.8.17.
reachy.sh status # Daemon status, version, IP
reachy.sh state # Full robot state
reachy.sh wake-up # Wake the robot
reachy.sh sleep # Put to sleep
reachy.sh snap # Camera snapshot → /tmp/reachy_snap.jpg
reachy.sh snap /path/to/photo.jpg # Snapshot to custom path
reachy.sh play-emotion cheerful1 # Play an emotion
reachy.sh play-dance groovy_sway_and_roll # Play a dance
reachy.sh goto --head 0.2,0,0 --duration 1.5 # Nod down
reachy.sh volume-set 70 # Set speaker volume
reachy.sh emotions # List all emotions
reachy.sh dances # List all dances
| Variable | Default | Description |
|----------|---------|-------------|
| REACHY_HOST | 192.168.8.17 | Robot IP address |
| REACHY_PORT | 8000 | REST API port |
| REACHY_SSH_USER | pollen | SSH username (for snap command) |
| REACHY_SSH_PASS | root | SSH password (for snap command, uses sshpass) |
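For a robot on a non-default address, the variables above can be exported once per shell. The values below are illustrative, not real defaults beyond those in the table:

```shell
# Illustrative values — substitute your robot's actual address and credentials.
export REACHY_HOST=192.168.1.42
export REACHY_PORT=8000
export REACHY_SSH_USER=pollen
export REACHY_SSH_PASS=root
# Subsequent commands now target 192.168.1.42:8000, e.g.:
#   reachy.sh status
```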
The head accepts pitch, yaw, roll in radians:
# Look up
reachy.sh goto --head -0.3,0,0 --duration 1.0
# Look left
reachy.sh goto --head 0,0.4,0 --duration 1.0
# Tilt head right, look slightly up
reachy.sh goto --head -0.1,0,-0.3 --duration 1.5
# Return to neutral
reachy.sh goto --head 0,0,0 --duration 1.0
Body yaw in radians. 0 = forward, positive = left, negative = right.
reachy.sh goto --body 1.57 --duration 2.0 # Turn 90° left
reachy.sh goto --body -1.57 --duration 2.0 # Turn 90° right
reachy.sh goto --body 0 --duration 2.0 # Face forward
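Since goto takes radians, a small degree-to-radian helper makes body turns easier to read. This is a sketch using awk; the function name is ours, not part of the skill:

```shell
deg2rad() {
  # Convert degrees to radians for --head/--body/--antennas values.
  awk -v d="$1" 'BEGIN { printf "%.4f", d * 3.14159265358979 / 180 }'
}

deg2rad 90    # prints 1.5708
# e.g.: reachy.sh goto --body "$(deg2rad 90)" --duration 2.0
```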
Two antennas [left, right] in radians. Range ~-0.5 to 0.5.
reachy.sh goto --antennas 0.4,0.4 --duration 0.5 # Both up
reachy.sh goto --antennas -0.3,-0.3 --duration 0.5 # Both down
reachy.sh goto --antennas 0.4,-0.4 --duration 0.5 # Asymmetric
# Look left and turn body left with antennas up
reachy.sh goto --head 0,0.3,0 --body 0.5 --antennas 0.4,0.4 --duration 2.0
Use --interp with goto:
minjerk — Smooth, natural (default)
linear — Constant speed
ease — Ease in/out
cartoon — Bouncy, exaggerated

80+ pre-recorded expressive animations. Select contextually appropriate ones:
reachy.sh play-emotion curious1 # Curious look
reachy.sh play-emotion cheerful1 # Happy expression
reachy.sh play-emotion surprised1 # Surprise reaction
reachy.sh play-emotion thoughtful1 # Thinking pose
reachy.sh play-emotion welcoming1 # Greeting gesture
reachy.sh play-emotion yes1 # Nodding yes
reachy.sh play-emotion no1 # Shaking no
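For variety, a reaction can be drawn at random from a small pool. The pool below reuses emotion names listed above; `shuf` is from GNU coreutils:

```shell
# Pick one of several positive emotions at random.
pick=$(printf '%s\n' cheerful1 welcoming1 curious1 | shuf -n 1)
echo "$pick"
# then: reachy.sh play-emotion "$pick"
```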
19 dance moves, great for fun or celebration:
reachy.sh play-dance groovy_sway_and_roll
reachy.sh play-dance chicken_peck
reachy.sh play-dance dizzy_spin
Run reachy.sh emotions or reachy.sh dances to see all available moves.
Before movement, motors must be enabled. Check with reachy.sh motors.
reachy.sh motors-enable # Enable (needed for movement commands)
reachy.sh motors-disable # Disable (robot goes limp)
reachy.sh motors-gravity # Gravity compensation (manually pose the robot)
reachy.sh volume # Current speaker volume
reachy.sh volume-set 50 # Set speaker to 50%
reachy.sh volume-test # Play test sound
reachy.sh mic-volume # Microphone level
reachy.sh mic-volume-set 80 # Set microphone to 80%
Reachy Mini runs Hugging Face Space apps. Manage them via:
reachy.sh apps # List all available apps
reachy.sh apps-installed # Installed apps only
reachy.sh app-status # What's running now
reachy.sh app-start NAME # Start an app
reachy.sh app-stop # Stop current app
Important: Only one app runs at a time. Starting a new app stops the current one. Apps may take exclusive control of the robot — stop the running app before sending manual movement commands if the robot doesn't respond.
Capture JPEG photos from the robot's camera (IMX708 wide-angle) via WebRTC — non-disruptive to the running daemon.
reachy.sh snap # Save to /tmp/reachy_snap.jpg
reachy.sh snap /path/to/output.jpg # Custom output path
Requirements: SSH access to the robot (uses sshpass + REACHY_SSH_PASS env var, default: root).
How it works: Connects to the daemon's WebRTC signalling server (port 8443) using GStreamer's webrtcsrc plugin on the robot, captures one H264-decoded frame, and saves as JPEG. No daemon restart, no motor disruption.
Note: The robot must be awake (head up) for a useful image. If asleep, the camera faces into the body. Run reachy.sh wake-up first.
reachy.sh doa # Direction of Arrival from mic array
Returns angle in radians (0=left, π/2=front, π=right) and speech detection boolean.
Use reachy-react.sh to trigger contextual robot behaviors from heartbeats, cron jobs, or session responses.
~/clawd/skills/reachy-mini/scripts/reachy-react.sh
reachy-react.sh ack # Nod acknowledgment (received a request)
reachy-react.sh success # Cheerful emotion (task done)
reachy-react.sh alert # Surprised + antennas up (urgent email, alert)
reachy-react.sh remind # Welcoming/curious (meeting reminder, to-do)
reachy-react.sh idle # Subtle animation (heartbeat presence)
reachy-react.sh morning # Wake up + greeting (morning briefing)
reachy-react.sh goodnight # Sleepy emotion + sleep (night mode)
reachy-react.sh patrol # Camera snapshot, prints image path
reachy-react.sh doa-track # Turn head toward detected sound source
reachy-react.sh celebrate # Random dance (fun moments)
Pass --bg to run in background (non-blocking).
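For example, the wake and sleep reactions can be driven from cron. The times below are just examples; the path matches the script location given above:

```shell
# Illustrative crontab entries (edit with: crontab -e)
30 6  * * * ~/clawd/skills/reachy-mini/scripts/reachy-react.sh morning --bg
0  22 * * * ~/clawd/skills/reachy-mini/scripts/reachy-react.sh goodnight --bg
```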
morning, goodnight, and patrol are silently skipped.

| Trigger | Reaction | Notes |
|---------|----------|-------|
| Morning briefing cron (6:30 AM) | morning | Robot wakes up and greets |
| Goodnight cron (10:00 PM) | goodnight | Robot plays sleepy emotion, goes to sleep |
| Heartbeat (periodic) | idle | Subtle head tilt, antenna wave, or look-around |
| Heartbeat (~1 in 4) | doa-track | Checks for nearby speech, turns toward it |
| Heartbeat (~1 in 6) | patrol | Camera snapshot for room awareness |
| Important unread email | alert | Antennas up + surprised emotion |
| Meeting <2h away | remind | Welcoming/curious emotion |
| Request from Alexander | ack | Quick head nod |
| Task completed | success | Random cheerful/happy emotion |
| Good news or celebration | celebrate | Random dance move |
The doa-track reaction uses the robot's 4-mic array to detect speech direction and turn the head toward the speaker. The DOA angle (0=left, π/2=front, π=right) is mapped to head yaw. Only triggers when speech is actively detected.
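The angle-to-yaw mapping can be sketched as follows. The helper name is ours, and the ±0.8 rad clamp is an assumed safe limit rather than a documented one:

```shell
doa_to_yaw() {
  # Map a DOA angle (0 = left, pi/2 = front, pi = right) to a head yaw
  # (positive = look left), clamped to an assumed +/-0.8 rad range.
  awk -v a="$1" 'BEGIN {
    y = 1.5707963 - a
    if (y > 0.8)  y = 0.8
    if (y < -0.8) y = -0.8
    printf "%.4f", y
  }'
}

doa_to_yaw 1.5707963   # front -> prints 0.0000
# e.g.: reachy.sh goto --head "0,$(doa_to_yaw "$angle"),0" --duration 1.0
```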
The patrol reaction captures a snapshot and prints the image path. Use this during heartbeats to check the room periodically. Combine with image analysis to detect activity or changes.
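A heartbeat handler following the rough odds in the table above could be sketched like this. The function is ours; the ~1 in 4 and ~1 in 6 odds are encoded as 3 and 2 slots out of 12:

```shell
pick_reaction() {
  # $1 is a roll in 1..12; pass $((RANDOM % 12 + 1)) in live use.
  case "$1" in
    1|2|3) echo doa-track ;;  # 3 of 12, ~1 in 4
    4|5)   echo patrol ;;     # 2 of 12, 1 in 6
    *)     echo idle ;;       # otherwise stay subtle
  esac
}

pick_reaction 2    # prints doa-track
# e.g.: reachy-react.sh "$(pick_reaction $((RANDOM % 12 + 1)))" --bg
```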
For anything not covered by the CLI, use curl or the raw command:
# Via raw command
reachy.sh raw GET /api/state/full
reachy.sh raw POST /api/move/goto '{"duration":1.0,"head_pose":{"pitch":0.2,"yaw":0,"roll":0}}'
# Via curl directly
curl -s http://192.168.8.17:8000/api/state/full | jq
curl -s -X POST -H "Content-Type: application/json" \
-d '{"duration":1.5,"head_pose":{"pitch":0,"yaw":0.3,"roll":0}}' \
http://192.168.8.17:8000/api/move/goto
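A small URL helper keeps one-off curl calls consistent with the env vars above. The helper name is ours:

```shell
reachy_url() {
  # Build a full API URL from REACHY_HOST/REACHY_PORT (documented defaults).
  echo "http://${REACHY_HOST:-192.168.8.17}:${REACHY_PORT:-8000}$1"
}

reachy_url /api/state/full   # with defaults: http://192.168.8.17:8000/api/state/full
# e.g.: curl -s "$(reachy_url /api/state/full)" | jq
```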
For the complete API endpoint list, schemas (GotoModelRequest, FullBodyTarget, XYZRPYPose), and full emotion/dance catalogs, see references/api-reference.md.
Troubleshooting:

Robot not moving: motors must be enabled. Check reachy.sh motors, then run reachy.sh motors-enable.
Daemon misbehaving: check reachy.sh status. State should be running. If not, run reachy.sh reboot-daemon.
Manual commands ignored: a running app may have exclusive control. Run reachy.sh app-stop first.
Robot unreachable: ping $REACHY_HOST. Check reachy.sh wifi-status.
Snapshot shows the inside of the body: the robot is asleep. Run reachy.sh wake-up first.
Snapshot fails: verify sshpass is installed and REACHY_SSH_PASS is set correctly.

Generated Mar 1, 2026
Reachy Mini can greet customers, demonstrate products with head and antenna movements, and capture snapshots of customer interactions for feedback. Its emotional animations make it engaging for in-store promotions or information kiosks.
In classrooms or workshops, the robot can perform dances and emotions to teach programming concepts, with movement commands allowing hands-on control. Its camera enables live demonstrations or recording of activities for educational content.
Using the camera for non-disruptive snapshots and audio sensing, Reachy Mini can serve as a remote presence tool for monitoring environments like labs or offices. Movement commands allow adjusting the view, while apps enable custom monitoring interfaces.
At events or parties, the robot can play dances and emotions to entertain guests, with volume control for audio. Its ability to rotate 360 degrees and capture photos makes it ideal for interactive photo booths or social media content creation.
Researchers can use the robot's precise motor control and app management to test AI models or human-robot interaction scenarios. The REST API and SSH access facilitate integration with custom software for experiments in robotics or AI.
Offer Reachy Mini on a subscription basis for businesses in retail or education, providing access to the skill package with regular updates and support. Revenue comes from monthly fees, with tiers based on usage levels or premium features like custom app development.
Provide consulting services to integrate the robot into specific workflows, such as building custom apps or automating movement sequences for clients in entertainment or research. Revenue is generated through project-based contracts and ongoing maintenance fees.
License pre-recorded emotions and dances for use in media, games, or educational materials, leveraging the robot's expressive animations. Revenue streams include one-time licensing fees or royalties based on usage, targeting industries like entertainment and advertising.
💬 Integration Tip
Ensure motors are enabled before movement commands, and stop any running apps if the robot becomes unresponsive to manual controls.