nexus-data-profile: Statistical profiling and quality assessment of datasets
Install via ClawdBot CLI:
clawdbot install cyberforexblockchain/nexus-data-profile

Grade: Fair — based on market validation, documentation quality, package completeness, maintenance status, and authenticity signals.
Sends data to undocumented external endpoint (potential exfiltration):
POST → https://ai-service-hub-15.emergent.host/api/original-services/data-profile

Calls external URL not in known-safe list:
https://ai-service-hub-15.emergent.host/api/original-services/data-profile

AI Analysis
The skill sends user data to an external API endpoint for processing, which is consistent with its stated purpose of data profiling. While the endpoint is not on a known-safe list, the documentation transparently discloses this external call, uses HTTPS, and describes a payment verification mechanism. No hidden instructions, credential harvesting, or obfuscation are evident.
Audited Apr 18, 2026 · audit v1.0
Generated Mar 22, 2026
Medical researchers can use this skill to profile patient datasets for statistical anomalies and missing values before analysis, ensuring data integrity for clinical studies. It helps identify outliers or inconsistencies in electronic health records, improving the reliability of research outcomes.
Banks and fintech companies can assess transaction datasets for quality and completeness, enabling better preprocessing for machine learning models that detect fraudulent activities. This profiling step ensures data is clean and structured, reducing false positives in fraud alerts.
Retailers can analyze sales and inventory datasets to profile trends and data quality, helping optimize stock levels and predict demand. It identifies gaps or errors in historical data, supporting more accurate forecasting and supply chain decisions.
Universities and research institutions can use this skill to profile experimental or survey datasets, checking for statistical distributions and quality issues before publication. It ensures data meets academic standards, enhancing the credibility of research findings.
Manufacturing firms can assess production line datasets for quality metrics and anomalies, aiding in process optimization and defect reduction. Profiling helps maintain consistent data from sensors and logs, supporting predictive maintenance initiatives.
Charge users $0.20 per request for data profiling via the NEXUS API, leveraging the Masumi Protocol for secure, non-custodial payments on Cardano. This model suits small to medium businesses needing occasional data quality checks without subscription commitments.
Offer bulk pricing or monthly subscriptions for high-volume users like corporations, providing discounted rates and priority support for continuous data profiling needs. This model encourages long-term partnerships and stable revenue streams from large-scale deployments.
Provide free testing using 'sandbox_test' to attract new users, then upsell to paid plans for production use with full features and higher request limits. This model lowers adoption barriers and converts trial users into paying customers over time.
💬 Integration Tip
Ensure the NEXUS_PAYMENT_PROOF environment variable is set for authentication, and use the provided API endpoint with JSON input for seamless integration into existing data pipelines.
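The integration tip above can be sketched as a minimal request builder. This is an assumption-heavy sketch, not a documented client: the payload shape (`{"data": [...]}`) and the `X-Payment-Proof` header name are hypothetical, since the listing only says the API takes JSON input and that NEXUS_PAYMENT_PROOF must be set (with 'sandbox_test' available for free testing).

```python
import json
import os
import urllib.request

# Endpoint taken from the listing; payload and header names are assumptions.
ENDPOINT = "https://ai-service-hub-15.emergent.host/api/original-services/data-profile"

def build_request(dataset_rows):
    """Build a POST request profiling the given rows (list of dicts)."""
    # Fall back to the free 'sandbox_test' mode if no payment proof is set.
    proof = os.environ.get("NEXUS_PAYMENT_PROOF", "sandbox_test")
    payload = json.dumps({"data": dataset_rows}).encode("utf-8")
    return urllib.request.Request(
        ENDPOINT,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "X-Payment-Proof": proof,  # header name is an assumption
        },
        method="POST",
    )

req = build_request([{"age": 34, "bmi": 22.1}, {"age": 41, "bmi": None}])
print(req.get_method())  # POST
# Actually sending it would be: urllib.request.urlopen(req)
```

Building the request separately from sending it makes the authentication and payload handling easy to test without hitting the external endpoint — worth doing given the auditor's note that the URL is not on a known-safe list.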
Scored Apr 19, 2026
Related skills:
- Use the @steipete/oracle CLI to bundle a prompt plus the right files and get a second-model review (API or browser) for debugging, refactors, design checks, or cross-validation.
- Local search/indexing CLI (BM25 + vectors + rerank) with MCP mode.
- Use when designing database schemas, writing migrations, optimizing SQL queries, fixing N+1 problems, creating indexes, setting up PostgreSQL, configuring EF Core, implementing caching, partitioning tables, or any database performance question.
- Connect to Supabase for database operations, vector search, and storage. Use for storing data, running SQL queries, similarity search with pgvector, and managing tables. Triggers on requests involving databases, vector stores, embeddings, or Supabase specifically.
- Use SQLite correctly with proper concurrency, pragmas, and type handling.
- Write correct MySQL queries avoiding common pitfalls with character sets, indexes, and locking.