Set Up Starlake with Airflow or Dagster
Starlake supports Airflow and Dagster as orchestration engines. The starlake-docker repository provides Docker Compose files that install and configure Starlake, the web UI, and the chosen orchestrator in a single command. This setup is designed for developers and data engineers who want a ready-to-use pipeline environment without manual orchestrator configuration.
The core project is open source: starlake-ai/starlake.
Deploy Starlake with Airflow or Dagster via Docker Compose
Prerequisites
- Docker and Docker Compose installed and running
- Git installed (to clone the repository)
Step-by-Step Setup
- Clone the starlake-docker repository
git clone https://github.com/starlake-ai/starlake-docker.git
- Navigate to the docker directory
cd starlake-docker/docker
- Start the stack
For Airflow:
docker compose up
For Dagster:
docker compose -f docker-compose-dagster.yml up
- Open the Starlake UI -- Navigate to
http://localhost in your browser.
To run on a different port, set the SL_UI_PORT environment variable:
SL_UI_PORT=8080 docker compose up
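The steps above can be wrapped in a small launcher script. This is a hypothetical convenience sketch, not part of the starlake-docker repository: the `ORCHESTRATOR` variable and the dry-run `echo` are assumptions, and the compose file names mirror the commands shown above.

```shell
#!/usr/bin/env sh
# Hypothetical launcher: pick the compose file from an ORCHESTRATOR variable.
# Defaults to Airflow, which uses the default docker-compose.yml.
ORCHESTRATOR="${ORCHESTRATOR:-airflow}"

case "$ORCHESTRATOR" in
  airflow) COMPOSE_FILE="docker-compose.yml" ;;
  dagster) COMPOSE_FILE="docker-compose-dagster.yml" ;;
  *) echo "unknown orchestrator: $ORCHESTRATOR" >&2; exit 1 ;;
esac

# Dry run: print the command that would start the stack.
# Replace the echo with the command itself to actually launch it,
# e.g. SL_UI_PORT=8080 docker compose -f "$COMPOSE_FILE" up
echo "docker compose -f $COMPOSE_FILE up"
```

Running it with `ORCHESTRATOR=dagster sh start.sh` prints the Dagster variant of the command; unset, it prints the Airflow one.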
Stop the Starlake Docker Compose Stack
To stop all services, run in the same directory:
docker compose down
If you started the Dagster stack with a custom compose file, pass the same file when stopping:
docker compose -f docker-compose-dagster.yml down
Mount External Starlake Projects in Docker
If you have existing Starlake projects and want to access them from the Docker setup, mount their parent folder as an NFS volume.
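A minimal Compose override can declare such a volume. The sketch below assumes an NFS server already exports the parent folder of your projects; the volume name `sl_projects`, the export path, the server address, and the `starlake-ui` service name are all placeholders to adapt to your environment and to the compose file you use.

```yaml
# docker-compose.override.yml -- hypothetical example
volumes:
  sl_projects:
    driver: local
    driver_opts:
      type: nfs
      # Address of the machine exporting the folder (placeholder).
      o: "addr=host.docker.internal,rw,nfsvers=4"
      # NFS export containing the parent folder of your Starlake projects.
      device: ":/exports/starlake-projects"

services:
  starlake-ui:            # service name is an assumption; match your compose file
    volumes:
      - sl_projects:/projects
```

Docker Compose merges this override with the base file automatically when both sit in the same directory, so `docker compose up` picks up the mount without editing the original stack definition.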