- A running instance of Airflow. There are a few ways to get one; the easiest is to use Docker Compose to run a local instance. See the docs for more information.
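If you go the Docker Compose route, the Airflow documentation publishes a ready-made compose file you can pull down and run. A minimal sketch, assuming Airflow 2.9.2 (substitute whichever version you want in the URL):

```shell
# Fetch the official Airflow docker-compose file (version in the URL is illustrative).
curl -LfO 'https://airflow.apache.org/docs/apache-airflow/2.9.2/docker-compose.yaml'

# One-time bootstrap: initialize the metadata database and create the default user.
docker compose up airflow-init

# Start the webserver, scheduler, and workers.
docker compose up
```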
- Install `dbt-af` if you are not using the Docker Compose method, via pip:

  ```shell
  pip install dbt-af[tests,examples]
  ```
- Build the dbt manifest. You can use the provided script:

  ```shell
  cd examples/dags
  ./build_manifest.sh
  ```
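The build script compiles the dbt project and writes `target/manifest.json`, which dbt-af reads to construct the Airflow DAGs. As a quick sanity check that the manifest built correctly, you can list its model nodes. This is a hedged sketch assuming the standard dbt manifest layout (a top-level `nodes` mapping keyed by unique IDs such as `model.<project>.<name>`); the sample dict stands in for the real file:

```python
import json  # in practice: manifest = json.load(open("target/manifest.json"))


def list_models(manifest: dict) -> list[str]:
    """Return the unique IDs of all model nodes in a dbt manifest."""
    return [
        unique_id
        for unique_id, node in manifest.get("nodes", {}).items()
        if node.get("resource_type") == "model"
    ]


# Inline sample standing in for an actual target/manifest.json.
sample_manifest = {
    "nodes": {
        "model.demo.stg_orders": {"resource_type": "model"},
        "model.demo.orders": {"resource_type": "model"},
        "test.demo.not_null_orders_id": {"resource_type": "test"},
    }
}

print(list_models(sample_manifest))
```

An empty result here usually means the project did not compile, so it is worth checking before wiring the manifest into Airflow.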
- Add `dbt_dev` and `dbt_sensor_pool` pools to Airflow. Start with a small number of open slots in each pool: on a local machine, a large number of concurrent tasks can exhaust your resources.
- Basic Project: a single domain, small tests, and a single target.
- Advanced Project: several domains, medium and large tests, and different targets.
- Dependencies management: how to manage dependencies between models in different domains.
- Manual scheduling: domains with manual scheduling.
- Maintenance and source freshness: how to manage maintenance tasks and source freshness.
- Kubernetes tasks: how to run dbt models in Kubernetes.
- Integration with other tools: how to integrate dbt-af with other tools.
- [Preview] Extras and scripts: available extras and scripts.