
# Apache Airflow

In this HashiQube DevOps lab you will get hands-on experience with Apache Airflow. Not only that, but you will also learn how to install Airflow with Helm charts and run it on Kubernetes using Minikube.

We will configure Airflow and create a DAG in Python that runs DBT. That's a lot of valuable learning!

Be sure to check out the DBT section as well, and have fun!

Airflow is a platform created by the community to programmatically author, schedule, and monitor workflows.


## Provision

To provision Apache Airflow, you need basetools, docker, and minikube as dependencies:

```bash
vagrant up --provision-with basetools,docker,docsify,postgresql,minikube,dbt,apache-airflow
```

## Web UI Access

To access the web UI, visit http://localhost:18889. The default login is:

- Username: `admin`
- Password: `admin`
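
As a quick sanity check once provisioning finishes, you could poll the webserver's `/health` endpoint. This is a minimal sketch, assuming the UI is reachable on port 18889 as above and that the `requests` library is installed:

```python
# Minimal sketch: poll the Airflow webserver's health endpoint.
# Assumes the web UI is exposed on http://localhost:18889 as described above.
import requests

resp = requests.get("http://localhost:18889/health", timeout=10)
resp.raise_for_status()

# Airflow reports per-component health, e.g.
# {"metadatabase": {"status": "healthy"}, "scheduler": {"status": "healthy", ...}}
for component, info in resp.json().items():
    print(f"{component}: {info.get('status')}")
```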

## Further Info

Airflow is deployed on Minikube (Kubernetes) using Helm, and additional values are supplied in the `values.yaml` file.

Example DAGs are supplied in the `dags` folder and are mounted into the Airflow scheduler pod; see the details in the `values.yaml` file.
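
To confirm the scheduler has picked up the mounted DAGs, you could list them through Airflow's stable REST API. A minimal sketch, assuming the API is served on the same port as the web UI and that basic authentication with the default `admin`/`admin` credentials is enabled (the auth setup may differ in your deployment):

```python
# Minimal sketch: list registered DAGs via Airflow's stable REST API.
# Assumes the API shares the web UI port (18889) and accepts basic auth
# with the default admin/admin credentials shown above.
import requests

resp = requests.get(
    "http://localhost:18889/api/v1/dags",
    auth=("admin", "admin"),
    timeout=10,
)
resp.raise_for_status()

for dag in resp.json()["dags"]:
    print(dag["dag_id"], "(paused)" if dag["is_paused"] else "(active)")
```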

## Airflow Information

In the `dags` folder you will find two DAGs:

- `example-dag.py`
- `test-ssh.py`

`example-dag.py` runs dbt commands by using the SSHOperator to SSH into HashiQube, while `test-ssh.py` simply SSHes into HashiQube to test the connection.
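
For illustration, here is a minimal sketch of that SSHOperator pattern rather than the repo's actual DAG. The connection id `hashiqube_ssh`, the project path, and the dbt command are placeholders; the real values live in `dags/example-dag.py` and in the Airflow Connections screen:

```python
# Minimal sketch of an SSHOperator-based DAG, in the spirit of example-dag.py.
# The connection id "hashiqube_ssh" and the command below are placeholders;
# the real DAG and SSH connection details live in the repo's dags folder.
from datetime import datetime

from airflow import DAG
from airflow.providers.ssh.operators.ssh import SSHOperator

with DAG(
    dag_id="dbt_over_ssh_sketch",
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,  # trigger manually from the web UI
    catchup=False,
) as dag:
    run_dbt = SSHOperator(
        task_id="run_dbt",
        ssh_conn_id="hashiqube_ssh",  # an Airflow SSH connection to HashiQube
        command="cd /path/to/dbt/project && dbt run",
    )
```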

## Screenshots

*(Screenshots: Airflow DAGs, Airflow Connections, Airflow DAG run, Airflow Task Instance, and Airflow Task Instance Result.)*


## Links and further reading

- Apache Airflow Vagrant Provisioner

