Welcome to Astronomer! This project contains the example DAGs covered in the webinar Writing Functional DAGs with Decorators. If you missed the webinar, you can watch the recording here (link to come shortly). This project was generated by running `astro dev init` with the Astronomer CLI. This README describes the contents of the project and explains how to run Apache Airflow on your local machine.
Your Astronomer project contains the following files and folders:
- `dags`: This folder contains the Python files for your Airflow DAGs. The DAGs in this repo are designed to highlight the use of decorators in Airflow 2. More info on the included DAGs is in the section below.
- `Dockerfile`: This file contains a versioned Astronomer Runtime Docker image that provides a differentiated Airflow experience. If you want to execute other commands or overrides at runtime, specify them here.
- `include`: This folder contains any additional files that you want to include as part of your project. It is empty by default.
- `packages.txt`: Install OS-level packages needed for your project by adding them to this file. It is empty by default.
- `requirements.txt`: Install Python packages needed for your project by adding them to this file. It is empty by default.
- `plugins`: Add custom or community plugins for your project to this folder. It is empty by default.
- `airflow_settings.yaml`: Use this local-only file to specify Airflow Connections, Variables, and Pools instead of entering them in the Airflow UI as you develop DAGs in this project.
The `dags/` directory in this project contains the following DAGs, which highlight the use of decorators. Minimal sketches of the main patterns follow the list.

- `classic-python-operator.py`: Shows a 'before' basic ETL example with three tasks using traditional Python operators.
- `taskflow.py`: Shows the same ETL example as `classic-python-operator.py` but implemented using the Python task decorator.
- `decorators-and-traditional.py`: Shows the same example as `taskflow.py` but with an additional `EmailOperator` at the end to highlight mixing decorators and traditional operators. Note that to run this DAG successfully you need to have SMTP set up in your Airflow environment.
- `task-groups-dynamic.py`: Shows an example of dynamically generating tasks in a task group using the task group decorator.
- `task-groups.py`: Shows an example of using the task group decorator for two tasks whose outputs are used by a downstream task.
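As a rough illustration of the TaskFlow pattern (not the exact webinar code), here is a minimal ETL sketch using the `@task` decorator, with a traditional `EmailOperator` mixed in at the end. The task names, sample data, and email address are hypothetical, and the `schedule` argument assumes Airflow 2.4+ (use `schedule_interval` on older 2.x versions):

```python
import json
from datetime import datetime

from airflow.decorators import dag, task
from airflow.operators.email import EmailOperator


@dag(schedule=None, start_date=datetime(2023, 1, 1), catchup=False)
def taskflow_etl():
    @task
    def extract() -> dict:
        # A real DAG would pull this from an API or database.
        return json.loads('{"a": 1, "b": 2, "c": 3}')

    @task
    def transform(order_data: dict) -> dict:
        # Return values move between decorated tasks via XCom automatically.
        return {"total": sum(order_data.values())}

    @task
    def load(summary: dict) -> None:
        print(f"Total is {summary['total']}")

    # Calling decorated tasks wires up extract -> transform -> load.
    loaded = load(transform(extract()))

    # Traditional operators mix freely with decorated tasks; this one
    # needs SMTP configured in your Airflow environment to succeed.
    notify = EmailOperator(
        task_id="notify",
        to="you@example.com",  # hypothetical recipient
        subject="ETL complete",
        html_content="The TaskFlow ETL example finished.",
    )
    loaded >> notify


taskflow_etl()
```

And a sketch in the spirit of `task-groups-dynamic.py`, again with illustrative names. Passing an explicit `group_id` per iteration keeps the generated group ids unique:

```python
from datetime import datetime

from airflow.decorators import dag, task, task_group


@dag(schedule=None, start_date=datetime(2023, 1, 1), catchup=False)
def dynamic_task_groups():
    @task
    def double(value: int) -> int:
        return value * 2

    @task
    def report(value: int) -> None:
        print(f"Result: {value}")

    # One task group per input value, generated in a loop. Task ids inside
    # each group are prefixed with the group_id, so they never collide.
    for i, value in enumerate([1, 2, 3]):

        @task_group(group_id=f"process_{i}")
        def process(v: int):
            report(double(v))

        process(value)


dynamic_task_groups()
```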
- Start Airflow on your local machine by running `astro dev start`.
This command will spin up 3 Docker containers on your machine, each for a different Airflow component:
- Postgres: Airflow's Metadata Database
- Webserver: The Airflow component responsible for rendering the Airflow UI
- Scheduler: The Airflow component responsible for monitoring and triggering tasks
- Verify that all 3 Docker containers were created by running `docker ps`.
Note: Running `astro dev start` will start your project with the Airflow Webserver exposed at port 8080 and Postgres exposed at port 5432. If you already have either of those ports allocated, you can either stop your existing Docker containers or change the port.
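For example, the Astro CLI's `astro config set` command can remap these ports from your project directory (e.g. `astro config set webserver.port 8081` or `astro config set postgres.port 5435`); see the Astronomer CLI docs for details.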
- Access the Airflow UI for your local Airflow project. To do so, go to http://localhost:8080/ and log in with `admin` for both your Username and Password.
You should also be able to access your Postgres Database at `localhost:5432/postgres`.
If you have an Astronomer account, pushing code to a Deployment on Astronomer is simple. For deployment instructions, refer to the Astronomer documentation: https://docs.astronomer.io/cloud/deploy-code/
The Astronomer CLI is maintained with love by the Astronomer team. To report a bug or suggest a change, reach out to our support team: https://support.astronomer.io/