Add support for 3.12 in workflow files and configs. #1081
Conversation
Thank you for your pull request! We could not find a changelog entry for this change. For details on how to document a change, see the dbt-spark contributing guide.
Temporarily set integration test parameter to 3.12 to show integration tests working.

Before merge I'm going to revert the test version to

@cla-bot check

The cla-bot has been summoned, and re-checked this pull request!
resolves #981

Problem

READ BEFORE REVIEWING

Adding Python 3.12 should pass a series of tests:

- ✅ `pytest tests/unit` local
- ⏩ `pytest tests/functional` local (we do this with Dagger now)
- ⏩ `dbt seed && dbt run` in a local jaffle shop using plain `pip install dbt-spark` with 3.12 active (covered by smoke testing run)
- ✅ smoke testing run using test-bundle on snowflake against the jaffle-shop-base scenario with 3.12.3 active (had to `brew install unixodbc` on Always Sunny for it to work with `pyodbc 5.*`, which is needed for spark 🐢 ♾️)
- ✅ GHA workflow involving integration tests (see below)
- ✅ release workflow still works (I'm not changing the 3.11 version here)
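The GHA workflow side of a change like this is usually just adding the new interpreter to the test matrix. A minimal sketch of what that looks like — the file name, job name, and matrix values here are assumptions for illustration, not this repository's actual workflow:

```yaml
# Hypothetical excerpt of a GitHub Actions workflow adding 3.12 to the matrix.
jobs:
  unit:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # "3.12" is appended alongside the versions already tested
        python-version: ["3.8", "3.9", "3.10", "3.11", "3.12"]
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
```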
Solution

We needed to update `pyodbc` to get this working! It had no wheels for 3.12. See this thread for one such documentation of this fact.
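The `setup.py` side of the fix amounts to raising the `pyodbc` floor to a major version that ships 3.12 wheels and advertising the new classifier. A minimal sketch, where the package name, pin, and extras layout are assumptions for illustration rather than the adapter's actual metadata:

```python
# Hypothetical excerpt of a setup.py for an adapter that needs pyodbc on 3.12.
# pyodbc 4.x published no wheels for Python 3.12; 5.x does, so the floor moves up.
ODBC_REQUIRES = ["pyodbc>=5.0,<6.0"]

CLASSIFIERS = [
    "Programming Language :: Python :: 3.8",
    "Programming Language :: Python :: 3.9",
    "Programming Language :: Python :: 3.10",
    "Programming Language :: Python :: 3.11",
    "Programming Language :: Python :: 3.12",  # newly supported
]

if __name__ == "__main__":
    from setuptools import find_namespace_packages, setup

    setup(
        name="example-adapter",  # placeholder, not the real distribution name
        python_requires=">=3.8",
        classifiers=CLASSIFIERS,
        extras_require={"ODBC": ODBC_REQUIRES},
        packages=find_namespace_packages(include=["example_adapter*"]),
    )
```

Because the build backend resolves `extras_require` at install time, no source changes are needed for the new interpreter — only the metadata and the dependency floor move.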
No other changes beyond GHA workflows and `setup.py` are needed for this adapter.

Checklist