azure-proxy

Enable Azure OpenAI integration with OpenClaw via a lightweight local proxy. Use when configuring Azure OpenAI as a model provider, when encountering 404 errors with Azure OpenAI in OpenClaw, or when you need to use Azure credits (e.g. from a Visual Studio subscription) with OpenClaw subagents. Solves the api-version query-parameter issue that prevents direct Azure OpenAI integration.
Install via ClawdBot CLI:
clawdbot install BenediktSchackenberg/azure-proxy

A lightweight Node.js proxy that bridges Azure OpenAI with OpenClaw.
OpenClaw constructs API URLs like this:
const endpoint = `${baseUrl}/chat/completions`;
Azure OpenAI requires:
https://{resource}.openai.azure.com/openai/deployments/{model}/chat/completions?api-version=2025-01-01-preview
If the api-version query string is embedded in the baseUrl, OpenClaw appends /chat/completions after it, producing a malformed URL that Azure rejects with a 404.
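The proxy sits in between and rewrites each request path. A minimal sketch of that rewrite (the function name and exact logic here are illustrative assumptions, not the actual contents of server.js):

```javascript
// Map OpenClaw's OpenAI-style path onto Azure's deployment-scoped path,
// re-attaching the api-version query string at the end where it belongs.
function toAzurePath(openaiPath, deployment, apiVersion) {
  // OpenClaw sends e.g. "/chat/completions"; Azure expects
  // "/openai/deployments/{deployment}/chat/completions?api-version=..."
  const suffix = openaiPath.replace(/^\/+/, "");
  return `/openai/deployments/${deployment}/${suffix}` +
    `?api-version=${encodeURIComponent(apiVersion)}`;
}

console.log(toAzurePath("/chat/completions", "gpt-4o", "2025-01-01-preview"));
// /openai/deployments/gpt-4o/chat/completions?api-version=2025-01-01-preview
```

Because the query string is appended after the full path, OpenClaw's own URL construction never has to know about api-version.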
# Set your Azure details
export AZURE_OPENAI_ENDPOINT="your-resource.openai.azure.com"
export AZURE_OPENAI_DEPLOYMENT="gpt-4o"
export AZURE_OPENAI_API_VERSION="2025-01-01-preview"
# Run the proxy
node scripts/server.js
Add to ~/.openclaw/openclaw.json:
{
  "models": {
    "providers": {
      "azure-gpt4o": {
        "baseUrl": "http://127.0.0.1:18790",
        "apiKey": "YOUR_AZURE_API_KEY",
        "api": "openai-completions",
        "authHeader": false,
        "headers": {
          "api-key": "YOUR_AZURE_API_KEY"
        },
        "models": [
          { "id": "gpt-4o", "name": "GPT-4o (Azure)" }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "models": {
        "azure-gpt4o/gpt-4o": {}
      }
    }
  }
}
Important: Set authHeader: false, because Azure authenticates with an api-key header, not Bearer tokens.
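To make the difference concrete, here is a sketch of the two header shapes (illustration only; no request is sent, and these helper names are not part of the proxy):

```javascript
// Azure OpenAI expects the key in a custom "api-key" header.
function azureHeaders(apiKey) {
  return { "api-key": apiKey, "Content-Type": "application/json" };
}

// The standard OpenAI API expects an Authorization Bearer token instead.
function openaiHeaders(apiKey) {
  return { "Authorization": `Bearer ${apiKey}`, "Content-Type": "application/json" };
}
```

With authHeader: false, OpenClaw skips the Bearer header and the headers block in the config supplies the api-key header Azure needs.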
Put your Azure credits to work by routing automated subagent tasks through Azure:
{
  "agents": {
    "defaults": {
      "subagents": {
        "model": "azure-gpt4o/gpt-4o"
      }
    }
  }
}
To run the proxy as a systemd user service, copy the template and configure it:
mkdir -p ~/.config/systemd/user
cp scripts/azure-proxy.service ~/.config/systemd/user/
# Edit the service file with your Azure details
nano ~/.config/systemd/user/azure-proxy.service
# Enable and start
systemctl --user daemon-reload
systemctl --user enable azure-proxy
systemctl --user start azure-proxy
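For orientation, a hypothetical sketch of what the unit file might contain after editing (the shipped scripts/azure-proxy.service template and the install path may differ):

```ini
# Hypothetical azure-proxy.service contents; adjust paths and values.
[Unit]
Description=Azure OpenAI proxy for OpenClaw
After=network-online.target

[Service]
Environment=AZURE_OPENAI_ENDPOINT=your-resource.openai.azure.com
Environment=AZURE_OPENAI_DEPLOYMENT=gpt-4o
Environment=AZURE_OPENAI_API_VERSION=2025-01-01-preview
ExecStart=/usr/bin/node %h/azure-proxy/scripts/server.js
Restart=on-failure

[Install]
WantedBy=default.target
```

Running it as a user service keeps the proxy alive across logins without requiring root.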
| Variable | Default | Description |
|----------|---------|-------------|
| AZURE_PROXY_PORT | 18790 | Local proxy port |
| AZURE_PROXY_BIND | 127.0.0.1 | Bind address |
| AZURE_OPENAI_ENDPOINT | (required) | Azure resource hostname |
| AZURE_OPENAI_DEPLOYMENT | gpt-4o | Deployment name |
| AZURE_OPENAI_API_VERSION | 2025-01-01-preview | API version |
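The defaults in the table resolve as in this sketch (variable names here are assumptions, not the actual server.js internals):

```javascript
// Resolve proxy settings from the environment, falling back to the
// documented defaults; AZURE_OPENAI_ENDPOINT has no default.
function resolveConfig(env) {
  return {
    port: Number(env.AZURE_PROXY_PORT || 18790),
    bind: env.AZURE_PROXY_BIND || "127.0.0.1",
    endpoint: env.AZURE_OPENAI_ENDPOINT, // required
    deployment: env.AZURE_OPENAI_DEPLOYMENT || "gpt-4o",
    apiVersion: env.AZURE_OPENAI_API_VERSION || "2025-01-01-preview",
  };
}

// Example: only the required endpoint is set; everything else defaults.
const cfg = resolveConfig({ AZURE_OPENAI_ENDPOINT: "my-res.openai.azure.com" });
```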
Verify the proxy is responding before pointing OpenClaw at it:
curl http://localhost:18790/health
# {"status":"ok","deployment":"gpt-4o"}
404 Resource not found: Check endpoint hostname and deployment name match Azure Portal.
401 Unauthorized: API key is wrong or expired.
Content filter errors: Azure applies aggressive content filtering, so some prompts that work on OpenAI may be blocked.
Generated Feb 23, 2026
Software development teams using Azure credits from Visual Studio subscriptions to build AI-powered applications with OpenClaw. This allows cost-effective prototyping and development while maintaining Azure's enterprise security and compliance standards.
University research teams leveraging Azure education grants to conduct AI experiments through OpenClaw. The proxy enables academic institutions to use their allocated Azure credits for research while avoiding API version compatibility issues.
Early-stage startups using Azure startup programs to build minimum viable products with AI capabilities. The proxy allows them to utilize free Azure credits while developing their OpenClaw-powered applications without infrastructure costs.
Business operations teams implementing automated workflows using OpenClaw subagents powered by Azure OpenAI. This enables cost-effective automation of routine tasks while leveraging existing Azure enterprise agreements and credits.
IT consulting firms delivering AI solutions to clients using their Azure subscriptions. The proxy allows consultants to demonstrate and deploy OpenClaw-based solutions while billing through existing Azure contracts.
Companies purchase Azure credits in bulk and resell them to clients with OpenClaw integration services. This creates a recurring revenue stream while providing clients with cost-effective AI development capabilities through the proxy solution.
Offering the proxy as a hosted service with monitoring, maintenance, and support. Businesses pay a monthly fee to avoid managing the proxy infrastructure themselves while ensuring reliable Azure OpenAI integration with OpenClaw.
Providing professional services to help organizations configure and optimize the Azure proxy for their specific OpenClaw use cases. This includes custom deployment, troubleshooting, and performance tuning services.
💬 Integration Tip
Always set authHeader to false in OpenClaw configuration since Azure uses api-key headers instead of Bearer tokens. Test with the health endpoint before full deployment.