Starlake Skills

Open-source Claude Code plugin providing 48 specialized skills for building, configuring, and operating Starlake data pipelines.

git clone https://github.com/starlake-ai/starlake-skills.git ~/.claude/skills/starlake-skills
48 Specialized Skills
6+ Data Warehouses
10 Skill Categories
2 Orchestrators

Why Starlake Skills?

Your AI-powered co-pilot for declarative data pipeline development.

🔓 Open source & auditable

Every skill, every prompt, every configuration pattern is inspectable and extensible. Apache-2.0 licensed for full transparency.

🔌 Cross-platform, not single-vendor

Covers BigQuery, Snowflake, DuckDB, PostgreSQL, Redshift, and Databricks. Write once, deploy anywhere.

🤖 AI-native workflow

Purpose-built for Claude Code. Ask questions in natural language and get expert Starlake guidance with ready-to-use configurations.

📦 Complete coverage

48 skills covering every CLI command, configuration pattern, write strategy, data quality expectation, and production best practice.

See It in Action

Natural language commands that generate production-ready configurations.

# Set up a new Starlake project with BigQuery
> /bootstrap a new project targeting BigQuery with Airflow orchestration

# Configure data ingestion for CSV files
> /load CSV files from GCS into the customers domain with OVERWRITE strategy

# Generate column-level lineage
> /col-lineage for the revenue_summary transform

# Create Airflow DAGs from your pipeline config
> /dag-generate for all domains using Airflow with daily schedule

# Validate your entire project configuration
> /validate the full project and fix any schema errors

# Extract schemas from an existing Snowflake database
> /extract-schema from Snowflake connection "prod" for the analytics schema
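To make the commands above concrete, here is a sketch of the kind of table configuration a command like the `/load` example might generate. The file name and field names (`pattern`, `metadata`, `writeStrategy`, `attributes`) follow our reading of Starlake's YAML schema; treat them as illustrative and verify the exact layout against the Starlake documentation for your version.

```yaml
# customers/customers.sl.yml — hypothetical table config for the customers domain
table:
  name: customers
  pattern: "customers.*\\.csv"   # regex matching incoming file names
  metadata:
    format: "DSV"                # delimiter-separated values (CSV)
    separator: ","
    withHeader: true
    writeStrategy:
      type: "OVERWRITE"          # replace target table contents on each load
  attributes:
    - name: customer_id
      type: string
      required: true
    - name: signup_date
      type: date
```

A `/validate` run would then check this file against the project schema before any DAG is generated.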

Multi-Platform Support

One plugin, every warehouse and orchestrator.

BigQuery
Snowflake
DuckDB
PostgreSQL
Redshift
Databricks
Airflow
Dagster