Orchestration

Two skills for generating and deploying pipeline orchestration DAGs with Apache Airflow and Dagster.

Skills

dag-generate

Generate orchestration DAGs from pipeline configuration. Automatically creates Airflow or Dagster definitions based on your domain and task dependencies.

You: /dag-generate Create Airflow DAGs for all domains with daily schedule and email alerts

Supported orchestrators:

  • Apache Airflow — Python DAG files with operators
  • Dagster — Asset-based pipeline definitions

Generated outputs:

  • DAG definition files with correct task dependencies
  • Schedule configurations (cron expressions)
  • Alerting and retry policies
  • Connection references
  • Environment-specific parameterization

Airflow Example

# Generated by Starlake dag-generate
from airflow import DAG
from airflow.operators.bash import BashOperator
from datetime import datetime, timedelta

default_args = {
    'owner': 'starlake',
    'depends_on_past': False,
    'email_on_failure': True,
    'retries': 2,
    'retry_delay': timedelta(minutes=5),
}

with DAG(
    'starlake_customers_load',
    default_args=default_args,
    schedule_interval='0 6 * * *',
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    stage = BashOperator(
        task_id='stage_customers',
        bash_command='starlake stage --domain customers',
    )
    load = BashOperator(
        task_id='load_customers',
        bash_command='starlake load --domain customers',
    )
    stage >> load
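Dagster Example

For the Dagster target, the generated output is asset-based rather than operator-based. The sketch below is illustrative only, assuming the same domain, schedule, and task dependency as the Airflow example; the exact code the skill emits may differ, and the asset and job names here are placeholders.

```python
# Illustrative Dagster equivalent (not verbatim generated output)
from dagster import Definitions, ScheduleDefinition, asset, define_asset_job


@asset
def stage_customers() -> None:
    # Hypothetical: shell out to the Starlake CLI to stage the domain, e.g.
    # subprocess.run(["starlake", "stage", "--domain", "customers"], check=True)
    ...


@asset
def load_customers(stage_customers) -> None:
    # Taking stage_customers as an argument declares the dependency,
    # mirroring `stage >> load` in the Airflow DAG.
    # subprocess.run(["starlake", "load", "--domain", "customers"], check=True)
    ...


# Job and schedule equivalent to the Airflow DAG's '0 6 * * *' cron.
customers_job = define_asset_job(
    "starlake_customers_load", selection="*load_customers"
)

defs = Definitions(
    assets=[stage_customers, load_customers],
    schedules=[
        ScheduleDefinition(job=customers_job, cron_schedule="0 6 * * *")
    ],
)
```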

dag-deploy

Deploy generated DAGs to your orchestration platform.

You: /dag-deploy Deploy the generated Airflow DAGs to our Cloud Composer environment

Deployment targets:

  • Local Airflow (file copy to dags folder)
  • Cloud Composer (GCS upload)
  • MWAA (S3 upload)
  • Dagster Cloud
  • Docker-based deployments
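For the local Airflow target, deployment amounts to copying the generated `.py` files into the scheduler's dags folder. A minimal sketch of that step (the function name and directory layout are assumptions, not the skill's actual internals):

```python
# Minimal sketch of the "local Airflow" deployment target:
# copy generated DAG files into the dags folder Airflow scans.
import shutil
from pathlib import Path


def deploy_dags(generated_dir: str, dags_folder: str) -> list[str]:
    """Copy every generated .py DAG file into the dags folder.

    Returns the deployed file names so the caller can report them.
    """
    dest = Path(dags_folder)
    dest.mkdir(parents=True, exist_ok=True)
    deployed = []
    for dag_file in sorted(Path(generated_dir).glob("*.py")):
        shutil.copy2(dag_file, dest / dag_file.name)
        deployed.append(dag_file.name)
    return deployed
```

The cloud targets follow the same shape, with the file copy replaced by an object-store upload (GCS for Cloud Composer, S3 for MWAA).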