Prompt in plain English. Starflow AI ships.
The what, not the how.
Pipelines as configuration, not code.
Two Ways to Use the Skills
Greenfield project or migration? Start with Starflow for the full lifecycle. Quick targeted task? Use a CLI skill directly.
Starflow
Guided methodology layer. Five expert personas (Lea, Winston, Amelia, Quinn, Max) walk you through Discovery → Architecture → Pipeline Design → Implementation, with adversarial code review and end-of-epic retrospectives.
Open the Starflow guide →
Direct CLI Skills
One skill per Starlake command: load, transform, extract, dag-generate, and 45 more. Ask in natural language; get production-ready YAML, SQL, or shell.
Browse the catalog →
Starflow: Guided Methodology
Four phases, five expert personas, persistent step-file workflows that resume across sessions.
1. Discovery
Map data domains, sources, and ownership before writing any configuration.
- starflow-domain-discovery
- starflow-source-analysis
2. Architecture
Design the platform, layers, engines, and table schemas that will support your pipelines.
- starflow-create-data-architecture
- starflow-schema-design
3. Pipeline Design
Specify pipelines end-to-end (extract, load, transform, orchestrate) before implementation.
- starflow-create-pipeline-spec
- starflow-transform-design
- starflow-orchestration-design
4. Implementation
Build, review, deploy, and reflect. Adversarial parallel code review and end-of-epic retros.
- starflow-sprint-planning
- starflow-dev-pipeline
- starflow-code-review
- starflow-retrospective
Plus five agent personas (Lea, Winston, Amelia, Quinn, Max) covering data analysis, architecture, engineering, quality, and platform; and the cross-cutting data-quality-review, lineage-review, and adaptive starflow-help skills.
Skill Catalog
49 skills across 11 categories, one per Starlake CLI command, with the configuration patterns to match.
Ingestion & Loading
8 skills:
- autoload
- load
- cnxload
- esload
- kafkaload
- ingest
- preload
- stage
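As an illustration of what a load skill produces, here is a minimal sketch of a table configuration. The domain, file pattern, and column names are invented for this example, and the exact YAML keys may vary by Starlake version:

```yaml
# customers/orders.sl.yml — hypothetical table config a /load prompt could emit
# (names and keys are illustrative; check the generated output for your version)
table:
  name: "orders"
  pattern: "orders.*\\.csv"   # files to match in the landing area
  metadata:
    format: "DSV"             # delimiter-separated values
    separator: ","
    withHeader: true
  writeStrategy:
    type: "OVERWRITE"         # replace the target table on each load
  attributes:
    - name: "order_id"
      type: "long"
      required: true
    - name: "customer_id"
      type: "long"
    - name: "amount"
      type: "decimal"
```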
Transformation
2 skills:
- transform
- job
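A transform pairs a SQL SELECT with a small YAML sidecar describing how the result is written. A hedged sketch, with the task name and keys assumed rather than taken from the catalog:

```yaml
# revenue_summary.sl.yml — hypothetical sidecar for a revenue_summary.sql file
# that holds the SELECT computing the table (key names are assumptions)
task:
  name: "revenue_summary"
  domain: "analytics"
  writeStrategy:
    type: "OVERWRITE"   # rebuild the summary table on each run
```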
Extraction
7 skills:
- extract
- extract-schema
- extract-data
- extract-bq-schema
- extract-rest-schema
- extract-rest-data
- extract-script
Schema Management
6 skills:
- bootstrap
- infer-schema
- xls2yml
- xls2ymljob
- yml2ddl
- yml2xls
Data Quality
1 skill:
- expectations
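Expectations attach data-quality checks to a table configuration. The snippet below is only an illustrative sketch: the expectation macro, keys, and table name are assumptions, not the documented syntax:

```yaml
# orders.sl.yml excerpt — hypothetical expectations on a table
# (macro name and keys are illustrative; expectation macros are
# typically defined separately and referenced here)
table:
  name: "orders"
  expectations:
    - expect: "is_col_value_not_null('order_id')"
      failOnError: true   # abort the load if the check fails
```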
Lineage
4 skills:
- lineage
- col-lineage
- table-dependencies
- acl-dependencies
Orchestration
2 skills:
- dag-generate
- dag-deploy
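DAG generation is driven by a small config that points at a template. The sketch below is hypothetical: the file location, keys, and template name are assumptions to show the shape, not verified values:

```yaml
# metadata/dags/airflow_daily.sl.yml — hypothetical DAG generation config
# (template and key names are assumptions; consult the Starlake docs)
dag:
  comment: "Daily load of all domains"
  template: "airflow__scheduled_table__bash.py.j2"
  filename: "airflow_{{domain}}_daily.py"
```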
Operations
8 skills:
- validate
- metrics
- freshness
- gizmosql
- console
- serve
- settings
- migrate
Security
2 skills:
- secure
- iam-policies
Configuration
2 skills:
- config
- connection
Utilities
6 skills:
- bq-info
- compare
- parquet2csv
- site
- summarize
- test
See It in Action
Natural-language commands that produce production-ready configurations.
# Bootstrap a new project targeting BigQuery with Airflow
> /bootstrap a new project targeting BigQuery with Airflow orchestration
# Configure ingestion for CSV files
> /load CSV files from GCS into the customers domain with OVERWRITE strategy
# Generate column-level lineage
> /col-lineage for the revenue_summary transform
# Generate Airflow DAGs from your pipeline config
> /dag-generate for all domains using Airflow with daily schedule
# Or use Starflow for a guided lifecycle
# Talk to the data architect persona
> /starflow-data-architect Design a data platform for our e-commerce analytics
# Ask Starflow what to do next based on your project state
> /starflow-help What should I work on next?
The Starlake Stack
One bundle, every layer.