alicloud-data-lake-dlf
Manage Alibaba Cloud Data Lake Formation (DataLake) via OpenAPI/SDK. Use for listing resources, creating or updating configurations, querying status, and troubleshooting workflows for this product.
Install via ClawdBot CLI:
clawdbot install cinience/alicloud-data-lake-dlf
Category: service
Use Alibaba Cloud OpenAPI (RPC) with official SDKs or OpenAPI Explorer to manage resources for Data Lake Formation.
1) Confirm region, resource identifiers, and desired action.
2) Discover API list and required parameters (see references).
3) Call API with SDK or OpenAPI Explorer.
4) Verify results with describe/list APIs.
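The four steps above boil down to assembling an RPC-style request and sending it through an SDK or OpenAPI Explorer. Below is a minimal sketch of the parameter-assembly part only, in Python. The common parameter names (Action, Version, RegionId, Format) follow the documented Alibaba Cloud RPC style; the specific action name `ListDatabases` and the `CatalogId` parameter are hypothetical examples, and signing/transport are left to the official SDK.

```python
def build_rpc_params(action: str, region_id: str,
                     version: str = "2020-07-10", **params) -> dict:
    """Assemble the common parameters of an Alibaba Cloud RPC-style call.

    The official SDK (or OpenAPI Explorer) adds signing and transport on
    top of parameters shaped like this; only the request shape is shown.
    """
    common = {
        "Action": action,        # API name discovered in step 2
        "Version": version,      # product API version (DataLake 2020-07-10)
        "RegionId": region_id,   # region confirmed in step 1
        "Format": "JSON",
    }
    common.update(params)        # API-specific parameters
    return common

# Step 3 sketch: call a (hypothetical) inventory API, then verify with
# a describe/list API in step 4 using the same helper.
req = build_rpc_params("ListDatabases", "cn-hangzhou", CatalogId="default")
```

The same helper covers the verification step (4): build a second request with a `Describe*`/`List*` action and compare the response against the intended change.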
1) Environment variables: ALICLOUD_ACCESS_KEY_ID / ALICLOUD_ACCESS_KEY_SECRET / ALICLOUD_REGION_ID
Region policy: ALICLOUD_REGION_ID is an optional default. If unset, choose the most reasonable region for the task; if that is unclear, ask the user.
2) Shared config file: ~/.alibabacloud/credentials
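The two credential sources above imply a lookup order: environment variables first, then the shared config file. A minimal Python sketch of that resolution follows; the `[default]` profile name and the `access_key_id` / `access_key_secret` key names inside the shared file are assumptions, and `resolve_credentials` is a hypothetical helper, not part of any SDK.

```python
import configparser
import os
from pathlib import Path

def resolve_credentials(config_path: str = "~/.alibabacloud/credentials") -> dict:
    """Resolve AccessKey credentials: env vars first, then shared file."""
    ak = os.environ.get("ALICLOUD_ACCESS_KEY_ID")
    sk = os.environ.get("ALICLOUD_ACCESS_KEY_SECRET")
    if ak and sk:
        return {
            "access_key_id": ak,
            "access_key_secret": sk,
            # Optional default region, per the region policy above.
            "region_id": os.environ.get("ALICLOUD_REGION_ID"),
        }
    # Fall back to the shared config file; section/key names are assumed.
    cfg = configparser.ConfigParser()
    cfg.read(Path(config_path).expanduser())
    profile = cfg["default"] if "default" in cfg else {}
    return {
        "access_key_id": profile.get("access_key_id"),
        "access_key_secret": profile.get("access_key_secret"),
        "region_id": profile.get("region_id"),
    }
```

In practice the official SDK's own credential chain should be preferred; this sketch only makes the documented precedence explicit.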
DataLake (2020-07-10)
1) Inventory/list: prefer List / Describe APIs to get current resources.
2) Change/configure: prefer Create / Update / Modify / Set APIs for mutations.
3) Status/troubleshoot: prefer Get / Query / Describe*Status APIs for diagnosis.
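The three API categories above can be routed mechanically by verb prefix. A small sketch, assuming the prefix conventions stated in this section (the sample action names in the comments are hypothetical):

```python
# Prefix groups mirror the guidance above: List/Describe for inventory,
# Create/Update/Modify/Set for mutations, Get/Query (and *Status) for diagnosis.
READ_PREFIXES = ("List", "Describe")
WRITE_PREFIXES = ("Create", "Update", "Modify", "Set")
STATUS_PREFIXES = ("Get", "Query")

def classify_api(action: str) -> str:
    """Return 'inventory', 'mutation', or 'diagnosis' for an API name."""
    # Describe*Status APIs are diagnostic, so check the Status suffix first.
    if action.endswith("Status") or action.startswith(STATUS_PREFIXES):
        return "diagnosis"
    if action.startswith(WRITE_PREFIXES):
        return "mutation"
    if action.startswith(READ_PREFIXES):
        return "inventory"
    return "unknown"
```

This kind of routing is useful when filtering the API inventory produced by the discovery script below.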
Use metadata-first discovery before calling business APIs:
python scripts/list_openapi_meta_apis.py
Optional overrides:
python scripts/list_openapi_meta_apis.py --product-code <ProductCode> --version <Version>
The script writes API inventory artifacts under the skill output directory.
If you need to save responses or generated artifacts, write them under:
output/alicloud-data-lake-dlf/
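Saving responses under that directory can be sketched as below; `save_artifact` is a hypothetical helper (the timestamped-JSON naming is an assumption, only the base directory comes from this skill):

```python
import json
import time
from pathlib import Path

def save_artifact(data, name: str,
                  base_dir: str = "output/alicloud-data-lake-dlf") -> Path:
    """Write an API response (or other artifact) as timestamped JSON
    under the skill output directory."""
    out_dir = Path(base_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    path = out_dir / f"{name}-{time.strftime('%Y%m%dT%H%M%S')}.json"
    path.write_text(json.dumps(data, indent=2, ensure_ascii=False))
    return path
```

Timestamped names keep successive runs (e.g. before/after a mutation) side by side for comparison.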
references/sources.md
Generated Mar 1, 2026
IT administrators use this skill to list and describe existing Data Lake Formation resources like databases, tables, and catalogs across regions. This helps in auditing resource usage, ensuring compliance, and planning capacity by regularly querying inventory via List* and Describe* APIs.
DevOps teams automate the creation and updating of Data Lake configurations, such as setting up new databases or modifying table schemas, using Create* and Update* APIs. This streamlines deployment pipelines, reduces manual errors, and supports scalable data infrastructure management.
Data engineers monitor the status and health of Data Lake workflows by calling Get* and Query* APIs to diagnose issues like failed jobs or performance bottlenecks. This enables proactive maintenance, quick resolution of operational problems, and ensures data pipeline reliability.
Organizations in regulated industries use this skill to configure Data Lake resources across multiple regions to meet data residency requirements. By leveraging API calls with region-specific parameters, they ensure compliance while maintaining a unified data management strategy.
Companies offer managed services where they use this skill to automate Data Lake Formation management for clients, handling tasks like provisioning, monitoring, and optimization. This generates revenue through subscription fees based on usage or service tiers, targeting businesses lacking in-house expertise.
Consulting firms integrate this skill into their offerings to help clients set up and maintain Data Lake environments for analytics and data integration projects. Revenue comes from project-based contracts or retainer fees, focusing on industries needing tailored data solutions.
SaaS providers embed this skill into platforms that offer data governance and cataloging features for Data Lake Formation. Revenue is generated through software licensing or per-user pricing, catering to enterprises seeking centralized data management tools.
💬 Integration Tip
Prioritize setting environment variables for AccessKey credentials to streamline authentication, and use the provided scripts for API discovery to avoid manual parameter lookups.