TRY_CAST SQL compilation error #26
@brown5628 Thanks for the issue! This is stemming from a fix in 0.3.2, whereby we realized that the min/max aggregations were not working properly on every column except the timestamp column.

The _min_or_max.sql aggregation macro uses a trick: it prepends the timestamp to the aggregated column. Otherwise, a lot of complexity would have to be introduced to handle pure aggregations vs. "aggregations" that actually require some complex window functions and nested CTEs.

All that being said, the _min_or_max.sql macro uses the dbt cross-database macros to perform the cast. Would you be able to look at that macro and investigate whether it can be tweaked to solve the issue for the Snowflake adapter? Either via some correction, or a dispatch to a Snowflake-specific implementation?
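(Aside: for readers unfamiliar with the "prepend the timestamp" trick mentioned above, here is a minimal Python sketch of the idea, with invented data rather than the package's actual code: prefix each value with a sortable timestamp, take a plain min/max over the prefixed strings, then strip the prefix from the winner.)

```python
# Hypothetical illustration of emulating a min/max-by-timestamp with a
# plain min/max aggregation, by prepending a sortable timestamp.
rows = [
    ("2023-01-02", "banana"),
    ("2023-01-01", "apple"),
    ("2023-01-03", "cherry"),
]

# Prepend a fixed-width, lexically sortable timestamp to each value...
prefixed = [ts + "|" + value for ts, value in rows]

# ...so a plain max() picks the entry from the latest row...
latest = max(prefixed)

# ...and stripping the prefix recovers the value at the max timestamp.
value_at_max_ts = latest.split("|", 1)[1]
print(value_at_max_ts)  # cherry
```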
@tnightengale Took a spin at this and am noting the following:

Given all of that, I'm leaning towards needing a Snowflake-specific implementation over a correction to the existing code. From a maintainability perspective, do you have a preference between adapting the prepend approach vs. Snowflake's min_by/max_by functions? For example:

{% macro _min_or_max(min_or_max, qualified_col) %}
{% set aggregation = "min_by" if min_or_max == "min" else "max_by" %}
{% set qualified_ts_col = "{}.{}".format(
dbt_activity_schema.appended(), dbt_activity_schema.columns().ts
) %}
{# Apply min or max by to the selected column. #}
{% set aggregated_col %}
{{ aggregation }}({{ qualified_col }}, {{ qualified_ts_col }})
{% endset %}
{# Return output. #}
{% do return(aggregated_col) %}
{% endmacro %}
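(Aside: Snowflake's built-in MIN_BY/MAX_BY, which the macro above dispatches to, return the value of one column from the row where another column is minimal or maximal. A rough Python analogue of that semantics, with invented data:)

```python
# Rough Python analogue of Snowflake's MIN_BY(col, ts) / MAX_BY(col, ts):
# return the value of `col` from the row where `ts` is smallest / largest.
rows = [
    {"ts": "2023-01-02", "col": "banana"},
    {"ts": "2023-01-01", "col": "apple"},
    {"ts": "2023-01-03", "col": "cherry"},
]

def min_by(rows, col, ts):
    # Row with the smallest `ts`, then project out `col`.
    return min(rows, key=lambda r: r[ts])[col]

def max_by(rows, col, ts):
    # Row with the largest `ts`, then project out `col`.
    return max(rows, key=lambda r: r[ts])[col]

print(min_by(rows, "col", "ts"))  # apple
print(max_by(rows, "col", "ts"))  # cherry
```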
Based on your thoughts, I'm happy to take a first cut on code, but am unfamiliar with the process of contributing to a project like this. Would the best next step be to fork & tee up a pull request with the changes?
Hey @brown5628 - I've been collaborating on this project with Teghan as well. Thanks for working on this! Teghan may have some opinions around how to best organize the code, but the Snowflake-specific implementation you proposed using the built-in min_by/max_by functions seems like the right approach to me. Nice job spotting the edge case with casting non-string inputs.

For reference, this question is more project-management-y in nature - I'm wondering if specifying the feature_json column as a whole is even necessary, or if users only ever parse out individual features.
Thanks @bcodell. Think your intuition on use is spot on - thus far, 100% of the time I parse out individual features, and I cannot think of a pattern where I would want to retain the full feature_json column.
FYI @brown5628 - while you work on the PR for this issue, I believe a workaround is now in place (from #34) which allows users to specify json columns by their key names.
Hey @brown5628, @bcodell, @tnightengale - I can see that there is an open PR #32 that might help here. Is it planned to get merged soon? Thanks!
Hey @InbarShirizly, thanks for the ping - based on my previous comment, I think there's a workaround to explicitly declare the columns you want.
@InbarShirizly The read @bcodell shared matches my own. The PR got stuck on how the contribution was structured, but the code itself works and produces expected results per the documented testing approach. I don't have an expectation that it will be merged, since this project hasn't seen activity in months. If you're interested, I'd recommend trying either the workaround above or the fork.
I've been using the forked version in my Snowflake environment since April with no issues.
Hello!
Appreciate the awesome work on this package and am excited to get into using it.
I'm getting a SQL compilation error when attempting to make use of the package on Snowflake. dbt version is 1.3; dbt-activity-schema version is 0.3.2.
My stream is generated using the narrator activity schema dbt package, and the macro I'm using is:
Variables from the project.yml are:

Error is:
Am I missing something that could get this working as expected? From my own troubleshooting, I see that:

- If I remove the feature_json column from the variable list, I get code that executes.
- try_cast needs a string as an input, so I would expect that this also will fail for numbers and the feature_json object on Snowflake. So select try_cast(1 as text) fails, but select try_cast('1' as text) works.
- It looks like the dbt safe_cast macro doesn't work the way I would expect it to here.

If I am not missing something and this is a DB-specific issue, I'd be interested in understanding how I can help ensure the package is Snowflake compatible.