Skip to content

Commit

Permalink
[CI] Reduce the amount taking to run tests in the CI from 5h to 11min (
Browse files Browse the repository at this point in the history
…#1297)

**Context**

Closes: #1299

Some CI jobs took an outrageous amount of time to run.

One of our CI test pipeline jobs was taking over five hours to run:

<img width="1624" alt="Screenshot 2024-11-05 at 14 52 58"
src="https://github.com/user-attachments/assets/8d8264c9-70f7-4381-b2dd-847244e58a98">

https://github.com/astronomer/astronomer-cosmos/actions/runs/11505558596

More recently, this same job started taking over 6 hours to run and
started timing out in the CI, making Cosmos' main branch red for both
unit and integration tests for Airflow 2.6. The underlying reason is
that it took a long time to resolve dependencies. This seems to have
happened since October 29, as seen on the commit
84c5fbd to main.

Example:
https://github.com/astronomer/astronomer-cosmos/actions/runs/11594745046/job/32330864858

**About this change**

This PR solves the original issue by changing where and how we install
Airflow dependencies, simplifying the previous setup. We manage Airflow
test dependencies in the `pre-install-airflow.sh` file, not in
`pyproject.toml`, other `sh`, or the Github action. We are also being
strict as we can dependent on the Airflow version. Where possible, we
use constraints. Where different providers' dependencies conflict with
previous versions of Airflow, we just ensure the expected version
remains being used after the installation.

Example of a successful run:

<img width="1624" alt="Screenshot 2024-11-05 at 14 52 48"
src="https://github.com/user-attachments/assets/7e16c8fc-0357-43e1-9ad9-e033d9d1a3b5">

https://github.com/astronomer/astronomer-cosmos/actions/runs/11685652312

**Bonus**

I realised with this change that users who use K8s and want to define
different paths for their dbt projects in Airflow and K8s were facing an
issue. This problem was evident when running the k8s example DAG. I've
fixed the problem as part of this PR.

**Follow-up actions**

Since this has been taking a long time to solve and our main branch is
still red, I commented out two tasks that were failing tests - and I've
logged a follow-up issue for us to address this:

#1304
  • Loading branch information
tatiana authored Nov 6, 2024
1 parent 739db95 commit 091e62b
Show file tree
Hide file tree
Showing 9 changed files with 109 additions and 78 deletions.
21 changes: 8 additions & 13 deletions .github/workflows/test.yml
Original file line number Diff line number Diff line change
Expand Up @@ -309,6 +309,7 @@ jobs:
AIRFLOW_CONN_AZURE_ABFS_CONN: ${{ secrets.AIRFLOW_CONN_AZURE_ABFS_CONN }}
AIRFLOW__CORE__DAGBAG_IMPORT_TIMEOUT: 90.0
PYTHONPATH: /home/runner/work/astronomer-cosmos/astronomer-cosmos/:$PYTHONPATH
AIRFLOW__COSMOS__ENABLE_CACHE: 0
COSMOS_CONN_POSTGRES_PASSWORD: ${{ secrets.COSMOS_CONN_POSTGRES_PASSWORD }}
DATABRICKS_CLUSTER_ID: mock
DATABRICKS_HOST: mock
Expand Down Expand Up @@ -388,6 +389,7 @@ jobs:
AIRFLOW_CONN_AZURE_ABFS_CONN: ${{ secrets.AIRFLOW_CONN_AZURE_ABFS_CONN }}
AIRFLOW__CORE__DAGBAG_IMPORT_TIMEOUT: 90.0
PYTHONPATH: /home/runner/work/astronomer-cosmos/astronomer-cosmos/:$PYTHONPATH
AIRFLOW__COSMOS__ENABLE_CACHE: 0
COSMOS_CONN_POSTGRES_PASSWORD: ${{ secrets.COSMOS_CONN_POSTGRES_PASSWORD }}
DATABRICKS_CLUSTER_ID: mock
DATABRICKS_HOST: mock
Expand Down Expand Up @@ -489,7 +491,7 @@ jobs:
strategy:
matrix:
python-version: [ "3.11" ]
airflow-version: [ "2.9" ]
airflow-version: [ "2.10" ]
steps:
- uses: actions/checkout@v3
with:
Expand All @@ -511,21 +513,14 @@ jobs:

- name: Install packages and dependencies
run: |
python -m venv venv
source venv/bin/activate
pip install --upgrade pip
pip install -e ".[tests]"
pip install apache-airflow-providers-cncf-kubernetes
pip install dbt-postgres==1.8.2 psycopg2==2.9.3 pytz
pip install apache-airflow==${{ matrix.airflow-version }}
# hatch -e tests.py${{ matrix.python-version }}-${{ matrix.airflow-version }} run pip freeze
python -m pip install uv
uv pip install --system hatch
hatch -e tests.py${{ matrix.python-version }}-${{ matrix.airflow-version }} run pip freeze
- name: Run kubernetes tests
run: |
source venv/bin/activate
sh ./scripts/test/kubernetes-setup.sh
cd dev && sh ../scripts/test/integration-kubernetes.sh
# hatch run tests.py${{ matrix.python-version }}-${{ matrix.airflow-version }}:test-kubernetes
hatch run tests.py${{ matrix.python-version }}-${{ matrix.airflow-version }}:test-kubernetes-setup
hatch run tests.py${{ matrix.python-version }}-${{ matrix.airflow-version }}:test-kubernetes
env:
AIRFLOW_HOME: /home/runner/work/astronomer-cosmos/astronomer-cosmos/
AIRFLOW_CONN_EXAMPLE_CONN: postgres://postgres:[email protected]:5432/postgres
Expand Down
1 change: 1 addition & 0 deletions cosmos/converter.py
Original file line number Diff line number Diff line change
Expand Up @@ -210,6 +210,7 @@ def __init__(
*args: Any,
**kwargs: Any,
) -> None:

project_config.validate_project()

execution_config = execution_config or ExecutionConfig()
Expand Down
56 changes: 29 additions & 27 deletions dev/dags/cosmos_manifest_example.py
Original file line number Diff line number Diff line change
Expand Up @@ -71,40 +71,42 @@ def cosmos_manifest_example() -> None:
# [END aws_s3_example]

# [START gcp_gs_example]
gcp_gs_example = DbtTaskGroup(
group_id="gcp_gs_example",
project_config=ProjectConfig(
manifest_path="gs://cosmos_remote_target/manifest.json",
manifest_conn_id="gcp_gs_conn",
# `manifest_conn_id` is optional. If not provided, the default connection ID `google_cloud_default` is used.
project_name="jaffle_shop",
),
profile_config=profile_config,
render_config=render_config,
execution_config=execution_config,
operator_args={"install_deps": True},
)
# gcp_gs_example = DbtTaskGroup(
# group_id="gcp_gs_example",
# project_config=ProjectConfig(
# manifest_path="gs://cosmos_remote_target/manifest.json",
# manifest_conn_id="gcp_gs_conn",
# # `manifest_conn_id` is optional. If not provided, the default connection ID `google_cloud_default` is used.
# project_name="jaffle_shop",
# ),
# profile_config=profile_config,
# render_config=render_config,
# execution_config=execution_config,
# operator_args={"install_deps": True},
# )
# [END gcp_gs_example]

# [START azure_abfs_example]
azure_abfs_example = DbtTaskGroup(
group_id="azure_abfs_example",
project_config=ProjectConfig(
manifest_path="abfs://cosmos-manifest-test/manifest.json",
manifest_conn_id="azure_abfs_conn",
# `manifest_conn_id` is optional. If not provided, the default connection ID `wasb_default` is used.
project_name="jaffle_shop",
),
profile_config=profile_config,
render_config=render_config,
execution_config=execution_config,
operator_args={"install_deps": True},
)
# azure_abfs_example = DbtTaskGroup(
# group_id="azure_abfs_example",
# project_config=ProjectConfig(
# manifest_path="abfs://cosmos-manifest-test/manifest.json",
# manifest_conn_id="azure_abfs_conn",
# # `manifest_conn_id` is optional. If not provided, the default connection ID `wasb_default` is used.
# project_name="jaffle_shop",
# ),
# profile_config=profile_config,
# render_config=render_config,
# execution_config=execution_config,
# operator_args={"install_deps": True},
# )
# [END azure_abfs_example]

post_dbt = EmptyOperator(task_id="post_dbt")

(pre_dbt >> local_example >> aws_s3_example >> gcp_gs_example >> azure_abfs_example >> post_dbt)
(pre_dbt >> local_example >> aws_s3_example >> post_dbt)
# TODO: re-enable the following
# (pre_dbt >> local_example >> aws_s3_example >> gcp_gs_example >> azure_abfs_example >> post_dbt)


cosmos_manifest_example()
20 changes: 15 additions & 5 deletions dev/dags/jaffle_shop_kubernetes.py
Original file line number Diff line number Diff line change
Expand Up @@ -10,6 +10,9 @@
"""

import os
from pathlib import Path

from airflow import DAG
from airflow.providers.cncf.kubernetes.secret import Secret
from pendulum import datetime
Expand All @@ -21,9 +24,17 @@
ExecutionMode,
ProfileConfig,
ProjectConfig,
RenderConfig,
)
from cosmos.profiles import PostgresUserPasswordProfileMapping

DEFAULT_DBT_ROOT_PATH = Path(__file__).parent / "dbt"

DBT_ROOT_PATH = Path(os.getenv("DBT_ROOT_PATH", DEFAULT_DBT_ROOT_PATH))
AIRFLOW_PROJECT_DIR = DBT_ROOT_PATH / "jaffle_shop"

K8S_PROJECT_DIR = "dags/dbt/jaffle_shop"

DBT_IMAGE = "dbt-jaffle-shop:1.0.0"

project_seeds = [{"project": "jaffle_shop", "seeds": ["raw_customers", "raw_payments", "raw_orders"]}]
Expand Down Expand Up @@ -51,7 +62,7 @@
# [START kubernetes_seed_example]
load_seeds = DbtSeedKubernetesOperator(
task_id="load_seeds",
project_dir="dags/dbt/jaffle_shop",
project_dir=K8S_PROJECT_DIR,
get_logs=True,
schema="public",
image=DBT_IMAGE,
Expand All @@ -72,6 +83,7 @@

# [START kubernetes_tg_example]
run_models = DbtTaskGroup(
project_config=ProjectConfig(),
profile_config=ProfileConfig(
profile_name="postgres_profile",
target_name="dev",
Expand All @@ -82,10 +94,8 @@
},
),
),
project_config=ProjectConfig(dbt_project_path="dags/dbt/jaffle_shop"),
execution_config=ExecutionConfig(
execution_mode=ExecutionMode.KUBERNETES,
),
render_config=RenderConfig(dbt_project_path=AIRFLOW_PROJECT_DIR),
execution_config=ExecutionConfig(execution_mode=ExecutionMode.KUBERNETES, dbt_project_path=K8S_PROJECT_DIR),
operator_args={
"image": DBT_IMAGE,
"get_logs": True,
Expand Down
39 changes: 14 additions & 25 deletions pyproject.toml
Original file line number Diff line number Diff line change
Expand Up @@ -85,20 +85,6 @@ docs = [
"sphinx-autoapi",
"apache-airflow-providers-cncf-kubernetes>=5.1.1",
]
tests = [
"packaging>=22.0",
"pytest>=6.0",
"pytest-split",
"pytest-dotenv",
"requests-mock",
"pytest-cov",
"pytest-describe",
"sqlalchemy-stubs", # Change when sqlalchemy is upgraded https://docs.sqlalchemy.org/en/14/orm/extensions/mypy.html
"types-pytz",
"types-requests",
"sqlalchemy-stubs", # Change when sqlalchemy is upgraded https://docs.sqlalchemy.org/en/14/orm/extensions/mypy.html
"pre-commit",
]
docker = [
"apache-airflow-providers-docker>=3.5.0",
]
Expand Down Expand Up @@ -141,19 +127,23 @@ packages = ["/cosmos"]

[tool.hatch.envs.tests]
dependencies = [
"astronomer-cosmos[tests]",
"apache-airflow-providers-cncf-kubernetes>=5.1.1",
"apache-airflow-providers-amazon[s3fs]>=3.0.0",
"apache-airflow-providers-docker>=3.5.0",
"apache-airflow-providers-google",
"apache-airflow-providers-microsoft-azure",
"apache-airflow-providers-postgres",
# do not install Airflow dependencies here!
# we need to use constrantis and we're doing this in the `pre-install-commands`
"packaging>=22.0",
"pytest>=6.0",
"pytest-split",
"pytest-dotenv",
"requests-mock",
"pytest-cov",
"pytest-describe",
"sqlalchemy-stubs", # Change when sqlalchemy is upgraded https://docs.sqlalchemy.org/en/14/orm/extensions/mypy.html
"pre-commit",
"types-PyYAML",
"types-attrs",
"types-pytz",
"types-requests",
"types-python-dateutil",
"Werkzeug<3.0.0",
"apache-airflow~={matrix:airflow}.0,!=2.9.0,!=2.9.1", # https://github.com/apache/airflow/pull/39670
]
pre-install-commands = ["sh scripts/test/pre-install-airflow.sh {matrix:airflow} {matrix:python}"]

Expand All @@ -164,8 +154,6 @@ airflow = ["2.4", "2.5", "2.6", "2.7", "2.8", "2.9", "2.10"]
[tool.hatch.envs.tests.overrides]
matrix.airflow.dependencies = [
{ value = "typing_extensions<4.6", if = ["2.6"] },
{ value = "apache-airflow-providers-google<10.11.0", if = ["2.4"] },
{ value = "apache-airflow-providers-google>=10.11.0", if = ["2.5", "2.6", "2.7", "2.8", "2.9", "2.10"] },
]

[tool.hatch.envs.tests.scripts]
Expand All @@ -174,14 +162,15 @@ test = 'sh scripts/test/unit.sh'
test-cov = 'sh scripts/test/unit-cov.sh'
test-integration = 'sh scripts/test/integration.sh'
test-kubernetes = "sh scripts/test/integration-kubernetes.sh"
test-kubernetes-setup = "sh scripts/test/kubernetes-setup.sh"
test-integration-dbt-1-5-4 = 'sh scripts/test/integration-dbt-1-5-4.sh'
test-integration-expensive = 'sh scripts/test/integration-expensive.sh'
test-integration-setup = 'sh scripts/test/integration-setup.sh'
test-integration-sqlite = 'sh scripts/test/integration-sqlite.sh'
test-integration-sqlite-setup = 'sh scripts/test/integration-sqlite-setup.sh'
test-performance = 'sh scripts/test/performance.sh'
test-performance-setup = 'sh scripts/test/performance-setup.sh'
type-check = " pre-commit run mypy --files cosmos/**/*"
type-check = "pre-commit run mypy --files cosmos/**/*"

[tool.pytest.ini_options]
filterwarnings = ["ignore::DeprecationWarning"]
Expand Down
2 changes: 1 addition & 1 deletion scripts/test/integration-kubernetes.sh
Original file line number Diff line number Diff line change
Expand Up @@ -13,4 +13,4 @@ pytest -vv \
--cov-report=xml \
--durations=0 \
-m integration \
../tests/test_example_k8s_dags.py
tests/test_example_k8s_dags.py
3 changes: 3 additions & 0 deletions scripts/test/kubernetes-setup.sh
Original file line number Diff line number Diff line change
Expand Up @@ -5,6 +5,9 @@
set -x
set -e


pip install dbt-postgres==1.8.2 psycopg2==2.9.3 pytz

# Create a Kubernetes secret named 'postgres-secrets' with the specified literals for host and password
kubectl create secret generic postgres-secrets \
--from-literal=host=postgres-postgresql.default.svc.cluster.local \
Expand Down
38 changes: 31 additions & 7 deletions scripts/test/pre-install-airflow.sh
Original file line number Diff line number Diff line change
@@ -1,5 +1,8 @@
#!/bin/bash

set -x
set -v

AIRFLOW_VERSION="$1"
PYTHON_VERSION="$2"

Expand All @@ -18,19 +21,40 @@ curl -sSL $CONSTRAINT_URL -o /tmp/constraint.txt
# Workaround to remove PyYAML constraint that will work on both Linux and MacOS
sed '/PyYAML==/d' /tmp/constraint.txt > /tmp/constraint.txt.tmp
mv /tmp/constraint.txt.tmp /tmp/constraint.txt

# Install Airflow with constraints
pip install uv
uv pip install "apache-airflow==$AIRFLOW_VERSION" --constraint /tmp/constraint.txt
uv pip install pydantic --constraint /tmp/constraint.txt
uv pip install pip --upgrade

uv pip install "apache-airflow==$AIRFLOW_VERSION" --constraint /tmp/constraint.txt
uv pip install apache-airflow-providers-docker --constraint /tmp/constraint.txt
uv pip install apache-airflow-providers-postgres --constraint /tmp/constraint.txt

if [ "$AIRFLOW_VERSION" = "2.7" ]; then
if [ "$AIRFLOW_VERSION" = "2.4" ] || [ "$AIRFLOW_VERSION" = "2.5" ] || [ "$AIRFLOW_VERSION" = "2.6" ] ; then
uv pip install "apache-airflow-providers-amazon" "apache-airflow==$AIRFLOW_VERSION" "urllib3<2"
uv pip install "apache-airflow-providers-cncf-kubernetes" "apache-airflow==$AIRFLOW_VERSION"
uv pip install "apache-airflow-providers-google<10.11" "apache-airflow==$AIRFLOW_VERSION"
uv pip install "apache-airflow-providers-microsoft-azure" "apache-airflow==$AIRFLOW_VERSION"
uv pip install pyopenssl --upgrade
elif [ "$AIRFLOW_VERSION" = "2.7" ] ; then
uv pip install "apache-airflow-providers-amazon" --constraint /tmp/constraint.txt
uv pip install "apache-airflow-providers-cncf-kubernetes" --constraint /tmp/constraint.txt
uv pip install "apache-airflow-providers-google>10.11" "apache-airflow==$AIRFLOW_VERSION"
uv pip install apache-airflow-providers-microsoft-azure --constraint /tmp/constraint.txt
else
uv pip install "apache-airflow-providers-amazon[s3fs]" --constraint /tmp/constraint.txt
uv pip install "apache-airflow-providers-cncf-kubernetes" --constraint /tmp/constraint.txt
uv pip install "apache-airflow-providers-docker" --constraint /tmp/constraint.txt
uv pip install "apache-airflow-providers-google" --constraint /tmp/constraint.txt
uv pip install "apache-airflow-providers-microsoft-azure" --constraint /tmp/constraint.txt
uv pip install "apache-airflow-providers-postgres" --constraint /tmp/constraint.txt
uv pip install "apache-airflow-providers-google>=10.11.0" --constraint /tmp/constraint.txt
uv pip install apache-airflow-providers-microsoft-azure --constraint /tmp/constraint.txt
fi

rm /tmp/constraint.txt

actual_version=$(airflow version | cut -d. -f1,2)

if [ "$actual_version" = $AIRFLOW_VERSION ]; then
echo "Version is as expected: $AIRFLOW_VERSION"
else
echo "Version does not match. Expected: $AIRFLOW_VERSION, but got: $actual_version"
exit 1
fi
7 changes: 7 additions & 0 deletions tests/operators/test_kubernetes.py
Original file line number Diff line number Diff line change
Expand Up @@ -2,8 +2,10 @@
from unittest.mock import MagicMock, patch

import pytest
from airflow import __version__ as airflow_version
from airflow.models import TaskInstance
from airflow.utils.context import Context, context_merge
from packaging import version
from pendulum import datetime

from cosmos.operators.kubernetes import (
Expand Down Expand Up @@ -115,6 +117,11 @@ def test_dbt_kubernetes_operator_get_env(p_context_to_airflow_vars: MagicMock, b
"no_version_check": True,
}


if version.parse(airflow_version) == version.parse("2.4"):
base_kwargs["name"] = "some-pod-name"


result_map = {
"ls": DbtLSKubernetesOperator(**base_kwargs),
"run": DbtRunKubernetesOperator(**base_kwargs),
Expand Down

0 comments on commit 091e62b

Please sign in to comment.