local-researcher — a fully local deep-research assistant Skill. It uses a local LLM served by Ollama or LMStudio to perform iterative web research and generates Markdown reports with cited sources. Triggers when the user needs privacy-first research, local document analysis, or structured research reports.
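The cited-Markdown output described above might be assembled along these lines. This is an illustrative sketch, not the skill's actual API: `build_report` and the `(summary, source_url)` findings format are hypothetical names chosen for the example.

```python
def build_report(topic, findings):
    """Assemble a Markdown research report with numbered citations.

    `findings` is a list of (summary, source_url) pairs collected
    during the iterative search passes.
    """
    lines = [f"# Research Report: {topic}", ""]
    # Body: each finding gets an inline numbered citation marker.
    for i, (summary, _url) in enumerate(findings, start=1):
        lines.append(f"{summary} [{i}]")
    # Footer: one source line per citation number.
    lines += ["", "## Sources"]
    for i, (_summary, url) in enumerate(findings, start=1):
        lines.append(f"[{i}]: {url}")
    return "\n".join(lines)
```

For example, `build_report("local LLMs", [("Ollama runs models locally.", "https://ollama.com")])` yields a report whose body cites `[1]` and whose Sources section maps `[1]` to the URL.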
Install via ClawdBot CLI:
clawdbot install antonia-sz/local-researcher
Grade Fair — based on market validation, documentation quality, package completeness, maintenance status, and authenticity signals.
Calls external URL not in known-safe list:
https://ollama.com/install.sh
Audited Apr 17, 2026 · audit v1.0
Generated Mar 29, 2026
Researchers can use this skill to conduct preliminary literature reviews for academic papers, generating structured summaries with citations from multiple sources. It helps identify key studies and gaps in existing research without exposing sensitive data to cloud services.
Business analysts can perform competitive market research on topics like product trends or industry forecasts, using local models to ensure proprietary data remains confidential. The iterative search process builds comprehensive reports with cited sources for strategic planning.
Technology firms can assess emerging technologies or evaluate open-source projects by generating detailed reports on technical advancements and risks. The privacy-focused setup allows analysis of sensitive information without external data leaks.
Healthcare professionals can research medical topics or analyze clinical data locally to comply with privacy regulations like HIPAA. The skill produces summarized findings with references, aiding in evidence-based decision-making without cloud exposure.
Law firms can use this skill to gather and summarize legal precedents or regulatory changes, maintaining client confidentiality. The markdown output with citations supports drafting briefs or compliance documents efficiently.
Offer a hosted version with enhanced features like team collaboration and advanced analytics, charging monthly fees based on usage tiers. This model targets enterprises needing scalable, secure research tools without infrastructure management.
Provide custom integration and training services for organizations deploying the skill in-house, such as setting up private search APIs or optimizing workflows. Revenue comes from project-based contracts and ongoing support packages.
Monetize through premium support, enterprise licenses for proprietary extensions, or donations from users who value privacy-first tools. This model leverages community contributions while offering paid features for commercial use.
💬 Integration Tip
Start by configuring a local LLM runtime such as Ollama with a model like llama3.2, then test with DuckDuckGo search to avoid API-key dependencies before scaling to paid search services.
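A minimal sketch of that first test, assuming Ollama is running locally on its default port (11434) and exposing the standard `/api/generate` endpoint; the `ask` helper is illustrative and requires `ollama serve` with llama3.2 pulled.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_payload(prompt, model="llama3.2"):
    """Assemble a request body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(prompt, model="llama3.2"):
    """Send one prompt to the local model; needs `ollama serve` running."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Non-streaming responses carry the full answer in "response".
        return json.loads(resp.read())["response"]
```

Once a call like `ask("Summarize this page: ...")` works, swap in real search results (e.g. from DuckDuckGo) as prompt context before paying for a search API.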
Scored Apr 19, 2026