How to support SQL Standard Catalogs? #281
Comments
@superdupershant Love it! Thanks for opening the issue, and for your work advocating for this change. Option (3) makes the most sense to me. With a minor inversion: we should make … Questions from me: …

This issue has been marked as Stale because it has been open for 180 days with no activity. If you would like the issue to remain open, please remove the stale label or comment on the issue, or it will be closed in 7 days.
Spark is switching over to a three-level catalog hierarchy based on the official ISO/ANSI SQL Standard. This means that, instead of `db_name.table_name`, there will be an additional layer of organization above what currently exists. This introduces a couple of changes:

- a fully qualified, three-part name: `catalog_name`.`schema_name`.`table_name`
- a `USE CATALOG foo` statement for switching the current catalog
- an `INFORMATION_SCHEMA` in each catalog
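To make the new hierarchy concrete, here is a short Spark SQL sketch; the catalog, schema, and table names (`main`, `analytics`, `orders`) are made up for illustration:

```sql
-- Set the session's current catalog (hypothetical catalog name).
USE CATALOG main;

-- Two-part names now resolve within the current catalog...
SELECT * FROM analytics.orders;

-- ...and the same table can also be addressed by its fully qualified three-part name.
SELECT * FROM main.analytics.orders;

-- Per the SQL Standard, each catalog exposes metadata views under INFORMATION_SCHEMA.
SELECT table_catalog, table_schema, table_name
FROM main.information_schema.tables;
```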
How should we support the catalog namespace in dbt?
There exists a connection configuration option named `database` that acts similarly to the SQL Standard's definition of a catalog. Historically, though, the term database was synonymous with schema, and has often been a source of confusion. Some potential options:

1. Add a new config option, `catalog`, that specifies the CATALOG to use for the project, and leave the `database` option as it currently is, functioning as an alias for `schema`.
2. Don't add a `catalog` config option, and just use the value in `database` to refer to a catalog within the system.
3. Add a `catalog` config option and make `database` an alias for `catalog`.

With option (3) above we would want some intermediate period that warns anyone using the `database` config option without specifying a `schema` config that its meaning is in the process of changing.
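As a rough sketch of what option (3) would mean for the SQL dbt ends up emitting (the catalog, schema, and model names below are hypothetical, and the exact DDL dbt generates varies by materialization):

```sql
-- Today, with a two-level namespace, `database`/`schema` point at the same layer,
-- so a model is built as schema.table:
CREATE TABLE analytics.my_model AS
SELECT 1 AS id;

-- Under option (3), `catalog` (with `database` as its alias) would supply the top
-- level and `schema` the second level of the fully qualified name:
CREATE TABLE main.analytics.my_model AS
SELECT 1 AS id;
```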