
Add functional tests for unit testing #976

Merged (6 commits) on Feb 21, 2024
6 changes: 6 additions & 0 deletions .changes/unreleased/Features-20240220-195925.yaml
@@ -0,0 +1,6 @@
kind: Features
body: Implement spark__safe_cast and add functional tests for unit testing
time: 2024-02-20T19:59:25.907821-05:00
custom:
Author: michelleark
Issue: "987"
1 change: 1 addition & 0 deletions dbt/include/spark/macros/adapters.sql
Expand Up @@ -387,6 +387,7 @@
"identifier": tmp_identifier
}) -%}

{%- set tmp_relation = tmp_relation.include(database=false, schema=false) -%}
Review comment from the PR author (Contributor):
For posterity: I believe this change was made to avoid including the database and schema when rendering the create statement for a view, which the unit testing framework required.

I believe it could be implemented more precisely by the changes here: https://github.com/dbt-labs/dbt-spark/pull/978/files#diff-786bb6587e86e50a2d01888eb2d4a5257e9a0025f75379214fa93cd5a033c9fbR141

{% do return(tmp_relation) %}
{% endmacro %}
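The effect of `tmp_relation.include(database=false, schema=false)` can be sketched in plain Python. This is a simplified stand-in for dbt's relation object, not the real `BaseRelation` class: excluded components are dropped when the relation name is rendered, so the temporary view ends up with an unqualified name.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass(frozen=True)
class Relation:
    """Minimal sketch of a dbt-style relation with an include policy."""

    database: Optional[str]
    schema: Optional[str]
    identifier: str

    def include(self, database: bool = True, schema: bool = True) -> "Relation":
        # Return a copy with the excluded components cleared, mirroring
        # the idea behind BaseRelation.include in dbt.
        return Relation(
            self.database if database else None,
            self.schema if schema else None,
            self.identifier,
        )

    def render(self) -> str:
        # Join only the components that are present.
        return ".".join(p for p in (self.database, self.schema, self.identifier) if p)
```

With this sketch, `Relation("db", "sch", "tmp_view").include(database=False, schema=False).render()` yields just `tmp_view`, which is what an unqualified `CREATE TEMPORARY VIEW` statement needs.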

8 changes: 8 additions & 0 deletions dbt/include/spark/macros/utils/safe_cast.sql
@@ -0,0 +1,8 @@
{% macro spark__safe_cast(field, type) %}
{%- set field_clean = field.strip('"').strip("'") if (cast_from_string_unsupported_for(type) and field is string) else field -%}
cast({{field_clean}} as {{type}})
{% endmacro %}

{% macro cast_from_string_unsupported_for(type) %}
{{ return(type.lower().startswith('struct') or type.lower().startswith('array') or type.lower().startswith('map')) }}
{% endmacro %}
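The control flow of the macro above can be sketched in plain Python (an approximation of the Jinja logic, ignoring the macro's surrounding whitespace): quotes are stripped from the field only when the target type is a complex Spark type (`struct`, `array`, `map`), because Spark cannot cast a quoted string literal to those types.

```python
def cast_from_string_unsupported_for(type_: str) -> bool:
    # Complex Spark types cannot be cast from a quoted string literal.
    t = type_.lower()
    return t.startswith("struct") or t.startswith("array") or t.startswith("map")


def spark_safe_cast(field: str, type_: str) -> str:
    # Mirror of the Jinja macro: strip surrounding quotes only when the
    # target type is complex and the field is a string.
    if cast_from_string_unsupported_for(type_) and isinstance(field, str):
        field = field.strip('"').strip("'")
    return f"cast({field} as {type_})"
```

For example, `spark_safe_cast("'array(1, 2, 3)'", "array<int>")` renders `cast(array(1, 2, 3) as array<int>)`, while a scalar target like `bigint` leaves the quoted literal intact.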
34 changes: 34 additions & 0 deletions tests/functional/adapter/unit_testing/test_unit_testing.py
@@ -0,0 +1,34 @@
import pytest

from dbt.tests.adapter.unit_testing.test_types import BaseUnitTestingTypes
from dbt.tests.adapter.unit_testing.test_case_insensitivity import BaseUnitTestCaseInsensivity
from dbt.tests.adapter.unit_testing.test_invalid_input import BaseUnitTestInvalidInput


class TestSparkUnitTestingTypes(BaseUnitTestingTypes):
@pytest.fixture
def data_types(self):
# sql_value, yaml_value
return [
["1", "1"],
["2.0", "2.0"],
["'12345'", "12345"],
["'string'", "string"],
["true", "true"],
["date '2011-11-11'", "2011-11-11"],
["timestamp '2013-11-03 00:00:00-0'", "2013-11-03 00:00:00-0"],
["array(1, 2, 3)", "'array(1, 2, 3)'"],
[
"map('10', 't', '15', 'f', '20', NULL)",
"""'map("10", "t", "15", "f", "20", NULL)'""",
],
['named_struct("a", 1, "b", 2, "c", 3)', """'named_struct("a", 1, "b", 2, "c", 3)'"""],
]


class TestSparkUnitTestCaseInsensitivity(BaseUnitTestCaseInsensivity):
pass


class TestSparkUnitTestInvalidInput(BaseUnitTestInvalidInput):
pass
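The `data_types` fixture pairs each SQL literal with the YAML value a unit test would supply for it; note that the complex-type YAML values are quoted strings, which `spark__safe_cast` unwraps. A hedged sketch of how such values might appear in a dbt unit test definition (the model, input, and column names here are illustrative, not from this PR):

```yaml
unit_tests:
  - name: test_complex_types
    model: my_model
    given:
      - input: ref('my_input')
        rows:
          - {id: 1, tags: "array(1, 2, 3)"}
    expect:
      rows:
        - {id: 1, tags: "array(1, 2, 3)"}
```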