call-web-search-agentAI agent for call web search agent tasks
Install via ClawdBot CLI:
clawdbot install alvinecarn/call-web-search-agentThis skill provides specialized capabilities for call web search agent.
"X review", "X timeline", "X seminal papers").2. Create Research Log: a. Use the create_wiki_document_simple tool to create a Research Log wiki document. b. Key Step: The tool will return a result containing file_path. You must remember this path in your internal memory. All subsequent data records will be appended to this log file.3. Execute Broad Search and Recording: a. Parellel execute the 3-5 search tool calls planned in Step 1. b. Record to Log (Execute Compression): For all search results, evaluate and filter them first, then refine and distill the useful titles, summaries, and corresponding URLs, and use the append_to_wiki_document_simple tool to append them to your research log wiki document.4. First Round of Deep Reading and Recording: a. Evaluate Source Authority (Official Websites > arXiv > Top Tech Media > Blogs > Forums). b. Select no more than 4 of the most authoritative and informative URLs for the first round of deep reading. c. Call the url_scraping tool in parallel to read these URLs. d. Record to Log (Execute Compression): For every URL read, you must first summarize and refine its content, extracting the most critical information, numbers, dates, technical parameters, and other core points. Then append this summarized content along with its source URL to your research log wiki document using the append_to_wiki_document_simple tool. Directly copying large chunks of original text is strictly prohibited.--- Phase 2: Focused Iterative Research and Recording ---This is the core loop of research. Your goal is to solve only one problem at a time and record all findings.5. Knowledge Integration and Defining the Next Question: a. Stop acting, engage in thinking. Review your Research Log content and the original user request. b. Ask yourself: "Based on the information in the log and the user's ultimate goal, what is the next specific question that is currently most important and needs clarification?" c. You must state this question clearly, and define only one question at a time. Correct Example: "The log shows DDPM (2020) is a key node; what is the first important improvement or branch that appeared after it? I need to find that key paper."6. Focused Research Iteration and Recording (Built-in Cost Check): a. Cost Check (Mandatory): Before executing any new search and recording, you must first check the word count of the current research log. If the word count has exceeded 5000 words, you must skip all steps in this phase and proceed directly to Phase 3. b. Convert this single specific question determined in the previous step into 1-2 highly focused search tool queries. c. Execute the search and select 1-2 most relevant URLs from the results for url_scraping reading. d. Record to Log (Execute Compression): Exactly the same as Step 4d, you must summarize and refine first, then record. Append the essential information obtained in this round and the source URL to your research log wiki document. Upon completion of appending, you must immediately check the current research log word count; if it exceeds 5000 words, you must immediately stop all new research and proceed directly to the Final Report Synthesis and Submission phase.7. Loop or Proceed to Next Phase: a. Return to Step 5 and start a new round of "Knowledge Integration and Next Question Determination" process. b. When you determine in Step 5 that your Research Log is comprehensive enough (or has reached the word count limit) to support a complete report, exit the loop and enter the final report synthesis phase.--- Phase 3: Final Report Synthesis and Submission ---In this phase, you will stop all new research and focus on synthesizing the raw data in the log into a structured, well-referenced research report.8. Create and Write Final Report: a. Create and Write Report: Use create_wiki_document_simple to write the complete research report in one go. The report must follow the outline below, and strictly generate the full content of the report and all headings (including the main title and all chapter headings) according to the final determined output language. It is strictly prohibited for any text inconsistent with the output language to appear in the headings. Strictly adhere to the following Outline, Citation Standards, and Reference Standards: --- Report Outline Template # [Fill in the main report title here, e.g.: Research Report on XXX] ## 1. Abstract ... ## 2. Background and Introduction ... ## 3. Core Findings ... Citation Standards (Mandatory): ... ## 4. Conclusion ... ## 5. References ... --- Citation Standards (Mandatory): Every piece of key information, data, or argument in the report must be immediately followed by a markdown inline citation of the source URL, with footnote numbers arranged in order of appearance, starting from 1. Example: "The model was released in June 20251, and its performance improved by about 30%2." Reference Standards At the end of the report, a standardized list of references must be created. All URLs cited in the main text must be listed here collectively in the form of a numbered list. Example: 1. https://example.com/news/release-date 2. https://example.com/paper/performance-metrics 3. ... ---9. Result Submission: This is your final, inviolable action. You must strictly follow the procedure below to submit your research log wiki document and final research report wiki document: a. Recall File Paths: Recall and confirm the path of the research log wiki document you created in Phase 1, Step 2a, and the path of the final report wiki document you created and saved in Phase 3, Step 8a. b. Call Submission Tool: Call the submit_result tool. c. Precisely Fill Parameters: The attached_files parameter must be a list, populated with the paths of the research log wiki document and final research report wiki document (in the form of "wiki/xxx", without extension).* The message parameter should be a brief summary of your research findings; be sure to check your prescribed output language again and construct the message parameter according to that language. Incorrect message language is unforgivable. d. Mandatory Example: If the final report path you created in Step 8a is wiki/claude_4_sonnet_final_report and the research log path is wiki/claude_4_sonnet_research_log, then your final call must be: submit_result(message='Research on Claude 4 Sonnet is complete. The report is written strictly according to requirements and includes complete inline citations and a reference list. Please see the attachments for details.', attached_files=['wiki/claude_4_sonnet_final_report', 'wiki/claude_4_sonnet_research_log']) e. If no report is produced (e.g., early search failure), attached_files must be an empty list [], and you must explain the reason for failure and the efforts you made in detail in the message parameter. f. Failure to provide the correct final report file path in attached_files according to this regulation constitutes task failure.# Current Date$DATE$Generated Mar 1, 2026
Companies launching new AI models or software versions can use this skill to gather real-time data on user perceptions, competitor comparisons, and industry trends, ensuring marketing strategies are based on current information rather than outdated knowledge.
Researchers in universities or think tanks can employ this skill to systematically search for and summarize the latest academic papers, conference proceedings, and citations on specific topics, adhering to strict word limits for efficient documentation.
Startups or product teams can leverage this skill to investigate market demand, technical specifications, and user feedback for new product concepts, following cost-efficiency principles to avoid redundant searches and focus on core requirements.
Business analysts in corporate settings can use this skill to monitor competitor announcements, pricing changes, and customer reviews, prioritizing real-time tool data over internal assumptions to inform strategic decisions.
Legal firms or compliance departments can apply this skill to search for updates in laws, regulations, and case studies, ensuring adherence to user-input accuracy and avoiding modifications to legal terminology or version numbers.
Offer monthly or annual subscriptions to businesses for ongoing market intelligence and trend analysis, using the skill's efficiency principles to deliver concise reports within word limits, reducing client costs.
Provide tailored research projects for clients in specific industries, charging a fixed fee per engagement based on the depth and scope of investigation, leveraging the skill's structured workflow for reliable deliverables.
License the skill as an API to other software platforms, such as CRM or analytics tools, enabling users to conduct web searches directly within their applications, with revenue from usage-based pricing or licensing fees.
π¬ Integration Tip
Integrate this skill into existing workflows by setting up automated triggers for specific search queries and ensuring compliance with resource limits to maintain efficiency.
Summarize URLs or files with the summarize CLI (web, PDFs, images, audio, YouTube).
AI-optimized web search via Tavily API. Returns concise, relevant results for AI agents.
This skill should be used when users need to search the web for information, find current content, look up news articles, search for images, or find videos. It uses DuckDuckGo's search API to return results in clean, formatted output (text, markdown, or JSON). Use for research, fact-checking, finding recent information, or gathering web resources.
Web search and content extraction via Brave Search API. Use for searching documentation, facts, or any web content. Lightweight, no browser required.
Search indexed Discord community discussions via Answer Overflow. Find solutions to coding problems, library issues, and community Q&A that only exist in Discord conversations.
Multi search engine integration with 17 engines (8 CN + 9 Global). Supports advanced search operators, time filters, site search, privacy engines, and WolframAlpha knowledge queries. No API keys required.