From 2d1bcf55ce6a6572a8932fe03b5f7fb1d386a81f Mon Sep 17 00:00:00 2001
From: Talla
Date: Tue, 5 Dec 2023 07:27:32 +0530
Subject: [PATCH 001/143] Updated as per dbt-teradata 1.7.0

---
 .../docs/docs/core/connect-data-platform/teradata-setup.md | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/website/docs/docs/core/connect-data-platform/teradata-setup.md b/website/docs/docs/core/connect-data-platform/teradata-setup.md
index 1a30a1a4a54..4f467968716 100644
--- a/website/docs/docs/core/connect-data-platform/teradata-setup.md
+++ b/website/docs/docs/core/connect-data-platform/teradata-setup.md
@@ -38,6 +38,7 @@ import SetUpPages from '/snippets/_setup-pages-intro.md';
 |1.4.x.x | ❌ | ✅ | ✅ | ✅ | ✅ | ✅
 |1.5.x | ❌ | ✅ | ✅ | ✅ | ✅ | ✅
 |1.6.x | ❌ | ❌ | ✅ | ✅ | ✅ | ✅
+|1.7.x | ❌ | ❌ | ✅ | ✅ | ✅ | ✅
 
 ## dbt dependent packages version compatibility
 
@@ -45,6 +46,7 @@
 |--------------|------------|-------------------|----------------|
 | 1.2.x | 1.2.x | 0.1.0 | 0.9.x or below |
 | 1.6.7 | 1.6.7 | 1.1.1 | 1.1.1 |
+| 1.7.0 | 1.7.3 | 1.1.1 | 1.1.1 |
 
 ### Connecting to Teradata
 
@@ -172,6 +174,8 @@ For using cross DB macros, teradata-utils as a macro namespace will not be used,
 | Cross-database macros | type_string | :white_check_mark: | custom macro provided |
 | Cross-database macros | last_day | :white_check_mark: | no customization needed, see [compatibility note](#last_day) |
 | Cross-database macros | width_bucket | :white_check_mark: | no customization
+| Cross-database macros | generate_series | :white_check_mark: | custom macro provided
+| Cross-database macros | date_spine | :white_check_mark: | no customization
 
 #### examples for cross DB macros

From 54b430a9f32f21139de64d9b1cb2a5da10abda4c Mon Sep 17 00:00:00 2001
From: Przemek Denkiewicz
Date: Tue, 5 Dec 2023 11:25:52 +0100
Subject: [PATCH 002/143] Add oauth_console authentication to Starburst/Trino

---
 .../core/connect-data-platform/trino-setup.md | 33 +++++++++++++++++--
 1 file changed, 31 insertions(+), 2 deletions(-)

diff --git a/website/docs/docs/core/connect-data-platform/trino-setup.md b/website/docs/docs/core/connect-data-platform/trino-setup.md
index a7dc658358f..354e95ef03d 100644
--- a/website/docs/docs/core/connect-data-platform/trino-setup.md
+++ b/website/docs/docs/core/connect-data-platform/trino-setup.md
@@ -30,7 +30,7 @@ The parameters for setting up a connection are for Starburst Enterprise, Starbur
 
 ## Host parameters
 
-The following profile fields are always required except for `user`, which is also required unless you're using the `oauth`, `cert`, or `jwt` authentication methods.
+The following profile fields are always required except for `user`, which is also required unless you're using the `oauth`, `oauth_console`, `cert`, or `jwt` authentication methods.
 
 | Field | Example | Description |
 | --------- | ------- | ----------- |
@@ -71,6 +71,7 @@ The authentication methods that dbt Core supports are:
 - `jwt` — JSON Web Token (JWT)
 - `certificate` — Certificate-based authentication
 - `oauth` — Open Authentication (OAuth)
+- `oauth_console` — Open Authentication (OAuth) with authentication URL printed to the console
 - `none` — None, no authentication
 
 Set the `method` field to the authentication method you intend to use for the connection. For a high-level introduction to authentication in Trino, see [Trino Security: Authentication types](https://trino.io/docs/current/security/authentication-types.html).
@@ -85,6 +86,7 @@ Click on one of these authentication methods for further details on how to confi
     {label: 'JWT', value: 'jwt'},
     {label: 'Certificate', value: 'certificate'},
     {label: 'OAuth', value: 'oauth'},
+    {label: 'OAuth (console)', value: 'oauth_console'},
     {label: 'None', value: 'none'},
   ]}
 >
@@ -269,7 +271,34 @@ sandbox-galaxy:
       host: bunbundersders.trino.galaxy-dev.io
       catalog: dbt_target
       schema: dataders
-      port: 433
+      port: 443
+```
+
+
+
+
+
+The only authentication parameter to set for OAuth 2.0 is `method: oauth_console`. If you're using Starburst Enterprise or Starburst Galaxy, you must enable OAuth 2.0 in Starburst before you can use this authentication method.
+
+For more information, refer to both [OAuth 2.0 authentication](https://trino.io/docs/current/security/oauth2.html) in the Trino docs and the [README](https://github.com/trinodb/trino-python-client#oauth2-authentication) for the Trino Python client.
+
+The only difference between `oauth_console` and `oauth` is that in the latter a browser is automatically opened with authentication URL and in `oauth_console` URL is printed to the console.
+
+It's recommended that you install `keyring` to cache the OAuth 2.0 token over multiple dbt invocations by running `python -m pip install 'trino[external-authentication-token-cache]'`. The `keyring` package is not installed by default.
+
+#### Example profiles.yml for OAuth
+
+```yaml
+sandbox-galaxy:
+  target: oauth_console
+  outputs:
+    oauth:
+      type: trino
+      method: oauth_console
+      host: bunbundersders.trino.galaxy-dev.io
+      catalog: dbt_target
+      schema: dataders
+      port: 443
 ```

From f4693b9813897634602b27fdea54d8089f509633 Mon Sep 17 00:00:00 2001
From: mirnawong1
Date: Mon, 18 Dec 2023 13:57:28 -0500
Subject: [PATCH 003/143] add rn

---
 .../74-Dec-2023/dec-sl-updates.md | 19 +++++++++++++++++++
 1 file changed, 19 insertions(+)
 create mode 100644 website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md

diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
new file mode 100644
index 00000000000..3f43222685a
--- /dev/null
+++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
@@ -0,0 +1,19 @@
+---
+title: “Updates and fixes: dbt Semantic Layer and MetricFlow updates for the month of December 2023.”
+description: “December 2023: Enhanced Tableau integration, BIGINT support, LookML to MetricFlow conversion, and deprecation of legacy features.”
+sidebar_label: “Update ad fixes: dbt Semantic Layer and MetricFlow.”
+sidebar_position: 08
+date: 2023-12-22
+---
+The dbt Labs team continues to work on adding new features, fixing bugs, and increasing reliability for the dbt Semantic Layer and MetricFlow. Here are the updates and fixes for the month of December 2023.
+
+## Bug fixes
+- The dbt Semantic Layer integration with Tableau now supports using exclude in its user interface. Previously it wasn’t supported.
+- The dbt Semantic Layer can support `BIGINT` with over 18 digits. Previously it would return an error.
+- The [dbt converter tool](https://github.com/dbt-labs/dbt-converter) can now convert data definitions from LookML to MetricFlow and help users upgrade. Previously this wasn’t available. (converts from lookml to metricflow specs). ROXI TO CLARIFY WITH NICK TO DETERMINE IF WE WANT TO TALK ABOUT THIS NOW OR LATER ON WHEN IT HAS MORE FEATURES.
+
+## Improvements
+- dbt Labs deprecated [dbt Metrics and the legacy dbt Semantic Layer](/docs/dbt-versions/release-notes/Dec-2023/legacy-sl), both supported on dbt version 1.5 or lower. This change came into effect on December 15th, 2023.
+
+## New features
+- Test

From 8117a3cdee24c10b139daacc8289ae7204428c41 Mon Sep 17 00:00:00 2001
From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com>
Date: Mon, 18 Dec 2023 14:10:03 -0500
Subject: [PATCH 004/143] Update dec-sl-updates.md

---
 .../release-notes/74-Dec-2023/dec-sl-updates.md | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
index 3f43222685a..b2db3ef7adb 100644
--- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
+++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
@@ -1,11 +1,11 @@
 ---
-title: “Updates and fixes: dbt Semantic Layer and MetricFlow updates for the month of December 2023.”
-description: “December 2023: Enhanced Tableau integration, BIGINT support, LookML to MetricFlow conversion, and deprecation of legacy features.”
-sidebar_label: “Update ad fixes: dbt Semantic Layer and MetricFlow.”
+title: "Updates and fixes: dbt Semantic Layer and MetricFlow updates for December 2023."
+description: "December 2023: Enhanced Tableau integration, BIGINT support, LookML to MetricFlow conversion, and deprecation of legacy features."
+sidebar_label: "Update ad fixes: dbt Semantic Layer and MetricFlow."
 sidebar_position: 08
 date: 2023-12-22
 ---
-The dbt Labs team continues to work on adding new features, fixing bugs, and increasing reliability for the dbt Semantic Layer and MetricFlow. Here are the updates and fixes for the month of December 2023.
+The dbt Labs team continues to work on adding new features, fixing bugs, and increasing reliability for the dbt Semantic Layer and MetricFlow. Here are the updates and fixes for December 2023.
 
 ## Bug fixes
 - The dbt Semantic Layer integration with Tableau now supports using exclude in its user interface. Previously it wasn’t supported.

From 17dcbc1cc80a08f19e2dbd274d3fd2cce230afb8 Mon Sep 17 00:00:00 2001
From: rpourzand
Date: Mon, 18 Dec 2023 11:50:02 -0800
Subject: [PATCH 005/143] Update dec-sl-updates.md

A couple of edits after talking to the team!
---
 .../dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)

diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
index b2db3ef7adb..cfdc8dd8b69 100644
--- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
+++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
@@ -8,12 +8,13 @@ date: 2023-12-22
 The dbt Labs team continues to work on adding new features, fixing bugs, and increasing reliability for the dbt Semantic Layer and MetricFlow. Here are the updates and fixes for December 2023.
 
 ## Bug fixes
-- The dbt Semantic Layer integration with Tableau now supports using exclude in its user interface. Previously it wasn’t supported.
+- The dbt Semantic Layer integration with Tableau now supports queries that resolve to a "NOT IN" clause (for example: using "exclude" in the filtering user interface). Previously it wasn’t supported.
 - The dbt Semantic Layer can support `BIGINT` with over 18 digits. Previously it would return an error.
-- The [dbt converter tool](https://github.com/dbt-labs/dbt-converter) can now convert data definitions from LookML to MetricFlow and help users upgrade. Previously this wasn’t available. (converts from lookml to metricflow specs). ROXI TO CLARIFY WITH NICK TO DETERMINE IF WE WANT TO TALK ABOUT THIS NOW OR LATER ON WHEN IT HAS MORE FEATURES.
+- We fixed a memory leak that would amount in intermittent errors when querying our JDBC API.
 
 ## Improvements
 - dbt Labs deprecated [dbt Metrics and the legacy dbt Semantic Layer](/docs/dbt-versions/release-notes/Dec-2023/legacy-sl), both supported on dbt version 1.5 or lower. This change came into effect on December 15th, 2023.
+- The [dbt converter tool](https://github.com/dbt-labs/dbt-converter) can now help automate some of the work in converting from LookML (Looker's modeling language) for those who are migrating. Previously this wasn’t available.
 
 ## New features
 - Test

From 6edcf6509865a31359a6a4590d76adbdf9390f9c Mon Sep 17 00:00:00 2001
From: rpourzand
Date: Mon, 18 Dec 2023 12:55:02 -0800
Subject: [PATCH 006/143] Update dec-sl-updates.md

Diego recommendation
---
 .../dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
index cfdc8dd8b69..6757a59b86d 100644
--- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
+++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
@@ -9,7 +9,7 @@ The dbt Labs team continues to work on adding new features, fixing bugs, and inc
 
 ## Bug fixes
 - The dbt Semantic Layer integration with Tableau now supports queries that resolve to a "NOT IN" clause (for example: using "exclude" in the filtering user interface). Previously it wasn’t supported.
-- The dbt Semantic Layer can support `BIGINT` with over 18 digits. Previously it would return an error.
+- The dbt Semantic Layer can support `BIGINT` with precision greater than 18. Previously it would return an error.
 - We fixed a memory leak that would amount in intermittent errors when querying our JDBC API.
 
 ## Improvements

From 2eadd7126b895110bd816849d8f85b36850ae29d Mon Sep 17 00:00:00 2001
From: rpourzand
Date: Mon, 18 Dec 2023 12:57:42 -0800
Subject: [PATCH 007/143] Update dec-sl-updates.md

more recommendations from diego
---
 .../dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
index 6757a59b86d..8f0bdd593c7 100644
--- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
+++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
@@ -11,6 +11,7 @@ The dbt Labs team continues to work on adding new features, fixing bugs, and inc
 - The dbt Semantic Layer integration with Tableau now supports queries that resolve to a "NOT IN" clause (for example: using "exclude" in the filtering user interface). Previously it wasn’t supported.
 - The dbt Semantic Layer can support `BIGINT` with precision greater than 18. Previously it would return an error.
 - We fixed a memory leak that would amount in intermittent errors when querying our JDBC API.
+- Added support for converting various Redshift and Postgres specific data types. Previously, the driver would throw an error when encountering columns with those types.
 
 ## Improvements
 - dbt Labs deprecated [dbt Metrics and the legacy dbt Semantic Layer](/docs/dbt-versions/release-notes/Dec-2023/legacy-sl), both supported on dbt version 1.5 or lower. This change came into effect on December 15th, 2023.
From 87caeaaef510a9b4f950638dd1acb8662f284059 Mon Sep 17 00:00:00 2001
From: sachinthakur96
Date: Tue, 19 Dec 2023 14:49:56 +0530
Subject: [PATCH 008/143] Adding

---
 website/docs/docs/core/connect-data-platform/vertica-setup.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/docs/core/connect-data-platform/vertica-setup.md b/website/docs/docs/core/connect-data-platform/vertica-setup.md
index 9274c22ebbe..525e1be86fc 100644
--- a/website/docs/docs/core/connect-data-platform/vertica-setup.md
+++ b/website/docs/docs/core/connect-data-platform/vertica-setup.md
@@ -6,7 +6,7 @@ meta:
   authors: 'Vertica (Former authors: Matthew Carter, Andy Regan, Andrew Hedengren)'
   github_repo: 'vertica/dbt-vertica'
   pypi_package: 'dbt-vertica'
-  min_core_version: 'v1.6.0 and newer'
+  min_core_version: 'v1.7.0 and newer'
   cloud_support: 'Not Supported'
   min_supported_version: 'Vertica 23.4.0'
   slack_channel_name: 'n/a'

From 5748454b5bfb9f52b65c5bbfbebe1a5f152f4768 Mon Sep 17 00:00:00 2001
From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com>
Date: Tue, 19 Dec 2023 08:10:01 -0500
Subject: [PATCH 009/143] Update website/docs/docs/core/connect-data-platform/trino-setup.md

---
 website/docs/docs/core/connect-data-platform/trino-setup.md | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/website/docs/docs/core/connect-data-platform/trino-setup.md b/website/docs/docs/core/connect-data-platform/trino-setup.md
index 354e95ef03d..28d158758e3 100644
--- a/website/docs/docs/core/connect-data-platform/trino-setup.md
+++ b/website/docs/docs/core/connect-data-platform/trino-setup.md
@@ -282,7 +282,9 @@ The only authentication parameter to set for OAuth 2.0 is `method: oauth_console
 
 For more information, refer to both [OAuth 2.0 authentication](https://trino.io/docs/current/security/oauth2.html) in the Trino docs and the [README](https://github.com/trinodb/trino-python-client#oauth2-authentication) for the Trino Python client.
-The only difference between `oauth_console` and `oauth` is that in the latter a browser is automatically opened with authentication URL and in `oauth_console` URL is printed to the console.
+The only difference between `oauth_console` and `oauth` is:
+- `oauth` — An authentication URL automatically opens in a browser.
+- `oauth_console` — A URL is printed to the console.
 
 It's recommended that you install `keyring` to cache the OAuth 2.0 token over multiple dbt invocations by running `python -m pip install 'trino[external-authentication-token-cache]'`. The `keyring` package is not installed by default.

From 326af16079cb867dbc9eb9a99f76df8dc9a4da4c Mon Sep 17 00:00:00 2001
From: Amy Chen
Date: Tue, 19 Dec 2023 10:01:05 -0500
Subject: [PATCH 010/143] upload the guide

---
 .../2023-12-20-partner-integration-guide.md | 102 ++++++++++++++++++
 1 file changed, 102 insertions(+)
 create mode 100644 website/blog/2023-12-20-partner-integration-guide.md

diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md
new file mode 100644
index 00000000000..0eed3302716
--- /dev/null
+++ b/website/blog/2023-12-20-partner-integration-guide.md
@@ -0,0 +1,102 @@
+---
+title: "How to integrate with dbt"
+description: "This guide will cover the ways to integrate with dbt Cloud"
+slug: integrating-with-dbtcloud
+
+authors: [amy_chen]
+
+tags: [dbt Cloud, Integrations, APIs]
+hide_table_of_contents: false
+
+date: 2023-12-20
+is_featured: false
+---
+
+
+## Overview
+
+Over the course of my 3 years running the Partner Engineering team at dbt Labs, the most common question I have been asked is “How do we integrate with dbt?”. Because those conversations often start out at the same place, I decided to create this guide so I’m no longer the blocker to fundamental information. This also allows us to skip the intro and get to the fun conversations like what a joint solution for our customers would look like so much faster.
+
+Now this guide does not include how to integrate with dbt Core. If you’re interested in creating an dbt Adapter, **[please check out this documentation instead.](/guides/dbt-ecosystem/adapter-development/1-what-are-adapters)**
+
+Instead we are going to focus on integrating with dbt Cloud. Integrating with dbt Cloud is a key requirement to become a dbt Labs technology partner, opening the door to a variety of collaborative commercial opportunities.
+
+Here I will cover how to get started, potential use cases you want to solve for, and points of integrations to do so.
+
+## New to dbt Cloud?
+
+If you're new to dbt and dbt Cloud, we recommend you and your software developers try our [Getting Started Quickstarts](quickstarts) after reading [What is dbt?](/docs/introduction). The documentation will help you familiarize yourself with how our users interact with dbt. By going through this, you will also create a sample dbt project to test your integration.
+
+If you require a partner dbt Cloud account to test on, we can upgrade an existing account or a trial account. **This account may only be used for development, training, and demonstration purposes.** Please speak to your partner manager if you're interested and provide the account id (provided in the URL). Our partner account has all of the enterprise level functionality and can be provided with a signed partnerships agreement.
+
+## Integration Points
+
+- [Discovery API (formerly referred to as Metadata API)](/docs/dbt-cloud-apis/discovery-api)
+  - **Overview**: This GraphQL API allows you to query the metadata that dbt Cloud generates every time you run a dbt Project. We have two schemas available (environment and job level). By default, we always recommend that you integrate with the environment level schema because it contains the latest state and historical run results of all the jobs run on the dbt Cloud project. The job level will only provide you the metadata of one job, giving you only a small snapshot of part of the project.
+- [Administrative API (also referred to as the Admin API)](/docs/dbt-cloud-apis/admin-cloud-api)
+  - **Overview:** This REST API allows you to orchestrate dbt Cloud jobs runs and help you administer a dbt Cloud account. For metadata retrieval, we recommend integrating with the Discovery API instead.
+- Webhooks
+  - **Overview:** Outbound webhooks can send notifications about your dbt Cloud jobs to other systems. These webhooks allow you to get the latest information on your dbt jobs in real time.
+  - [Link to documentation](/docs/deploy/webhooks)
+- Semantic Layers/Metrics
+  - **Overview: Our Semantic Layer is made up of two parts: metrics definitions and the ability to interactively query the dbt metrics. For more details, here is a [basic overview](docs/use-dbt-semantic-layer/dbt-sl) and [our best practices](/guides/dbt-ecosystem/sl-partner-integration-guide).**
+  - Metrics definitions can be pulled from the Discovery API (linked above) or the Semantic Layer Driver/GraphQL API. The key difference is the Discovery API is not able to pull the semantic graph which provides the list of available dimensions that one can query per metric. That is only available via the SL Driver/APIs. The tradeoff is the SL Driver/APIs does not have access to the lineage of the entire dbt project (i.e how the dbt metrics dependencies on dbt models)
+  - [We have three available integration points for the Semantic Layer API.](/docs/dbt-cloud-apis/sl-api-overview)
+
+## dbt Cloud Hosting and Authentication
+
+To use the dbt Cloud APIs, you will need access to the customer’s access urls. Depending on their dbt Cloud setup, they will have a different access url. To find out more, here is the [documentation](/docs/cloud/about-cloud/regions-ip-addresses) to understand all the possible configurations. My recommendation is to allow the customer to provide their own url to simplify support.
+
+If the customer is on an Azure Single Tenant instance, they do not currently have access to the Discovery API or the Semantic Layer APIs.
+
+For authentication, we highly recommend that your integration uses account service tokens. You can read more about how to create a service token and what permission sets to provide it [here](/docs/dbt-cloud-apis/service-tokens). Please note depending on their plan type, they will have access to different permission sets. We **do not** recommend that users supply their user bearer tokens for authentication. This can cause issues if the user leaves the organization and provides you access to all the dbt Cloud accounts associated to the user rather than just the account (and related projects) that they want to integrate with.
+
+## Potential Use Cases
+
+- Event-based orchestration
+  - **Desired Action:** You wish to receive information that a scheduled dbt Cloud Job has been completed or kick off a dbt Cloud job. You can align your product schedule to the dbt Cloud run schedule.
+  - **Examples:** Kicking off a dbt Job after the ETL job of extracting and loading the data is completed. Or receiving a webhook after the job has been completed to kick off your reverse ETL job.
+  - **Integration Points:** Webhooks and/or Admin API
+- dbt Lineage
+  - **Desired Action:** You wish to interpolate the dbt lineage metadata into your tool.
+  - **Example: In your tool, you wish to pull in the dbt DAG into your lineage diagram. [This is what you could pull and how to do this.](/docs/dbt-cloud-apis/discovery-use-cases-and-examples#whats-the-full-data-lineage)**
+  - **Integration Points:** Discovery API
+- dbt Environment/Job metadata
+  - **Desired Action:** You wish to interpolate dbt Cloud job information into your tool, including the status of the jobs, the status of the tables executed in the run, what tests passed, etc.
+  - **Example:** In your Business Intelligence tool, stakeholders select from tables that a dbt model created. You show the last time the model passed its tests/last run to show that the tables are current and can be trusted. [This is what you could pull and how to do this.](/docs/dbt-cloud-apis/discovery-use-cases-and-examples#whats-the-latest-state-of-each-model)
+  - **Integration Points:** Discovery API
+- dbt Model Documentation
+  - **Desired Action:** You wish to interpolate dbt Project Information, including model descriptions, column descriptions, etc.
+  - **Example:** You want to extract out the dbt model description so that you can display and help the stakeholder understand what they are selecting from. This way, the creators can easily pass on the information without updating another system. [This is what you could pull and how to do this.](/docs/dbt-cloud-apis/discovery-use-cases-and-examples#what-does-this-dataset-and-its-columns-mean)
+  - **Integration Points:** Discovery API
+
+**dbt Core only users will have no access to the above integration points.** For dbt metadata, oftentimes our partners will create a dbt core integration by using the [dbt artifacts](/product/semantic-layer/) files generated by each run and provided by the user. With our Discovery API, we are providing a dynamic way to get the latest up to date information, parsed out for you.
+
+## dbt Cloud Plans & Permissions
+
+[The dbt Cloud plan type](https://www.getdbt.com/pricing) will change what the user has access to. There are four different types of plans:
+
+- **Developer**: This is free and available to one user with a limited amount of successful models built. This plan cannot access the APIs, Webhooks, or Semantic Layer. Limited to 1 project.
+- **Team:** This plan has access to the APIs, Webhooks, and Semantic Layer. You may have up to 8 users on the account and one dbt Cloud Project. This is limited to 15,000 successful models built.
+- **Enterprise** (Multi-tenant/Multi-cell): This plan has access to the APIs, Webhooks, and Semantic Layer. They may have more than one dbt Cloud Project based on how many dbt projects/domains they have using dbt. Majority of our enterprise customers are on multi-tenant dbt Cloud instances.
+- **Enterprise** (Single-tenant): This plan may have access to the APIs, Webhooks, and Semantic Layer. If you are working with a specific customer, let us know, and we can confirm if their instance has access.
+
+## Frequently Asked Questions
+
+- What is a dbt Cloud Project?
+  - A dbt Cloud project is made up of two connections: one to the git repository and one to the data warehouse/platform. Most customers will have only one dbt Cloud Project in their account but there are enterprise clients who might have more depending on their use cases.The project also encapsulates two types of environments at minimal: a development environment and deployment environment.
+  - Oftentimes folks refer to the [dbt Project](/docs/build/projects) as the code hosted in their git repository.
+- What is a dbt Cloud Environment?
+  - [For an overview, check out this documentation.](/docs/environments-in-dbt) At minimal an project will have one deployment type environment that they will be executing jobs on. The development environment powers the dbt Cloud IDE and Cloud CLI.
+- Can we write back to the dbt project?
+  - At this moment, we do not have a Write API. A dbt project is hosted in a git repository, so if you have a git provider integration, you can manually open up a Pull Request on the project to maintain the version control process.
+- Can you provide column-level information in the lineage?
+  - Column-level lineage is currently in beta release with more information to come.
+- How do I get a Partner Account?
+  - Contact your Partner Manager with your account id (in your URL)
+- Why should I not use the Admin API to pull out the dbt artifacts for metadata?
+  - We recommend not integrating with the Admin API to extract the dbt artifacts documentation. This is because the Discovery API provides more extensive information, a user-friendly structure and more reliable integration point.
+- How do I get access to the dbt Brand assets?
+  - Check out this [page](https://www.getdbt.com/brand-guidelines/). Please make sure you’re not using our old logo(hint: there should only be one hole in the logo). Please also note that the name dbt and the dbt logo are trademarked by dbt Labs, and that use is governed by our brand guidelines - which are fairly specific for commercial uses. If you have any questions about proper use of our marks, please ask for your partner manager.
+- How do I engage with the partnerships team?
+  - Email partnerships@dbtlabs.com.
\ No newline at end of file

From 4b7af55488c0fe2b5d2a710f744b8052028664a4 Mon Sep 17 00:00:00 2001
From: Amy Chen
Date: Tue, 19 Dec 2023 12:40:33 -0500
Subject: [PATCH 011/143] fix broken links

---
 .../2023-12-20-partner-integration-guide.md | 32 +++++++++----------
 website/blog/authors.yml | 2 +-
 2 files changed, 16 insertions(+), 18 deletions(-)

diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md
index 0eed3302716..f51181bf588 100644
--- a/website/blog/2023-12-20-partner-integration-guide.md
+++ b/website/blog/2023-12-20-partner-integration-guide.md
@@ -11,13 +11,11 @@ hide_table_of_contents: false
 date: 2023-12-20
 is_featured: false
 ---
-
-
 ## Overview
 
 Over the course of my 3 years running the Partner Engineering team at dbt Labs, the most common question I have been asked is “How do we integrate with dbt?”. Because those conversations often start out at the same place, I decided to create this guide so I’m no longer the blocker to fundamental information. This also allows us to skip the intro and get to the fun conversations like what a joint solution for our customers would look like so much faster.
-Now this guide does not include how to integrate with dbt Core. If you’re interested in creating an dbt Adapter, **[please check out this documentation instead.](/guides/dbt-ecosystem/adapter-development/1-what-are-adapters)** +Now this guide does not include how to integrate with dbt Core. If you’re interested in creating an dbt Adapter, **[please check out this documentation instead.](https://docs.getdbt.com/guides/dbt-ecosystem/adapter-development/1-what-are-adapters)** Instead we are going to focus on integrating with dbt Cloud. Integrating with dbt Cloud is a key requirement to become a dbt Labs technology partner, opening the door to a variety of collaborative commercial opportunities. @@ -25,31 +23,31 @@ Here I will cover how to get started, potential use cases you want to solve for, ## New to dbt Cloud? -If you're new to dbt and dbt Cloud, we recommend you and your software developers try our [Getting Started Quickstarts](quickstarts) after reading [What is dbt?](/docs/introduction). The documentation will help you familiarize yourself with how our users interact with dbt. By going through this, you will also create a sample dbt project to test your integration. +If you're new to dbt and dbt Cloud, we recommend you and your software developers try our [Getting Started Quickstarts](https://docs.getdbt.com/quickstarts) after reading [What is dbt?](https://docs.getdbt.com/docs/introduction). The documentation will help you familiarize yourself with how our users interact with dbt. By going through this, you will also create a sample dbt project to test your integration. If you require a partner dbt Cloud account to test on, we can upgrade an existing account or a trial account. **This account may only be used for development, training, and demonstration purposes.** Please speak to your partner manager if you're interested and provide the account id (provided in the URL). 
Our partner account has all of the enterprise level functionality and can be provided with a signed partnerships agreement. ## Integration Points -- [Discovery API (formerly referred to as Metadata API)](/docs/dbt-cloud-apis/discovery-api) +- [Discovery API (formerly referred to as Metadata API)](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-api) - **Overview**: This GraphQL API allows you to query the metadata that dbt Cloud generates every time you run a dbt Project. We have two schemas available (environment and job level). By default, we always recommend that you integrate with the environment level schema because it contains the latest state and historical run results of all the jobs run on the dbt Cloud project. The job level will only provide you the metadata of one job, giving you only a small snapshot of part of the project. -- [Administrative API (also referred to as the Admin API)](/docs/dbt-cloud-apis/admin-cloud-api) +- [Administrative API (also referred to as the Admin API)](https://docs.getdbt.com/docs/dbt-cloud-apis/admin-cloud-api) - **Overview:** This REST API allows you to orchestrate dbt Cloud jobs runs and help you administer a dbt Cloud account. For metadata retrieval, we recommend integrating with the Discovery API instead. - Webhooks - **Overview:** Outbound webhooks can send notifications about your dbt Cloud jobs to other systems. These webhooks allow you to get the latest information on your dbt jobs in real time. - - [Link to documentation](/docs/deploy/webhooks) + - [Link to documentation](https://docs.getdbt.com/docs/deploy/webhooks) - Semantic Layers/Metrics - - **Overview: Our Semantic Layer is made up of two parts: metrics definitions and the ability to interactively query the dbt metrics. 
For more details, here is a [basic overview](docs/use-dbt-semantic-layer/dbt-sl) and [our best practices](/guides/dbt-ecosystem/sl-partner-integration-guide).**
+  - **Overview: Our Semantic Layer is made up of two parts: metrics definitions and the ability to interactively query the dbt metrics. For more details, here is a [basic overview](https://docs.getdbt.com/docs/use-dbt-semantic-layer/dbt-sl) and [our best practices](https://docs.getdbt.com/guides/dbt-ecosystem/sl-partner-integration-guide).**
  - Metrics definitions can be pulled from the Discovery API (linked above) or the Semantic Layer Driver/GraphQL API. The key difference is the Discovery API is not able to pull the semantic graph which provides the list of available dimensions that one can query per metric. That is only available via the SL Driver/APIs. The tradeoff is the SL Driver/APIs do not have access to the lineage of the entire dbt project (i.e., how dbt metrics depend on dbt models)
-  - [We have three available integration points for the Semantic Layer API.](/docs/dbt-cloud-apis/sl-api-overview)
+  - [We have three available integration points for the Semantic Layer API.](https://docs.getdbt.com/docs/dbt-cloud-apis/sl-api-overview)

## dbt Cloud Hosting and Authentication

-To use the dbt Cloud APIs, you will need access to the customer’s access urls. Depending on their dbt Cloud setup, they will have a different access url. To find out more, here is the [documentation](/docs/cloud/about-cloud/regions-ip-addresses) to understand all the possible configurations. My recommendation is to allow the customer to provide their own url to simplify support.
+To use the dbt Cloud APIs, you will need access to the customer’s access urls. Depending on their dbt Cloud setup, they will have a different access url. To find out more, here is the [documentation](https://docs.getdbt.com/docs/cloud/about-cloud/regions-ip-addresses) to understand all the possible configurations.
My recommendation is to allow the customer to provide their own url to simplify support.

If the customer is on an Azure Single Tenant instance, they do not currently have access to the Discovery API or the Semantic Layer APIs.

-For authentication, we highly recommend that your integration uses account service tokens. You can read more about how to create a service token and what permission sets to provide it [here](/docs/dbt-cloud-apis/service-tokens). Please note depending on their plan type, they will have access to different permission sets. We **do not** recommend that users supply their user bearer tokens for authentication. This can cause issues if the user leaves the organization and provides you access to all the dbt Cloud accounts associated to the user rather than just the account (and related projects) that they want to integrate with.
+For authentication, we highly recommend that your integration uses account service tokens. You can read more about how to create a service token and what permission sets to provide it [here](https://docs.getdbt.com/docs/dbt-cloud-apis/service-tokens). Please note that, depending on their plan type, they will have access to different permission sets. We **do not** recommend that users supply their user bearer tokens for authentication. This can cause issues if the user leaves the organization and provides you access to all the dbt Cloud accounts associated with the user rather than just the account (and related projects) that they want to integrate with.

## Potential Use Cases

@@ -59,18 +57,18 @@ For authentication, we highly recommend that your integration uses account servi
  - **Integration Points:** Webhooks and/or Admin API
- dbt Lineage
  - **Desired Action:** You wish to interpolate the dbt lineage metadata into your tool.
  - **Example: In your tool, you wish to pull in the dbt DAG into your lineage diagram.
[This is what you could pull and how to do this.](/docs/dbt-cloud-apis/discovery-use-cases-and-examples#whats-the-full-data-lineage)** + - **Example: In your tool, you wish to pull in the dbt DAG into your lineage diagram. [This is what you could pull and how to do this.](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-use-cases-and-examples#whats-the-full-data-lineage)** - **Integration Points:** Discovery API - dbt Environment/Job metadata - **Desired Action:** You wish to interpolate dbt Cloud job information into your tool, including the status of the jobs, the status of the tables executed in the run, what tests passed, etc. - - **Example:** In your Business Intelligence tool, stakeholders select from tables that a dbt model created. You show the last time the model passed its tests/last run to show that the tables are current and can be trusted. [This is what you could pull and how to do this.](/docs/dbt-cloud-apis/discovery-use-cases-and-examples#whats-the-latest-state-of-each-model) + - **Example:** In your Business Intelligence tool, stakeholders select from tables that a dbt model created. You show the last time the model passed its tests/last run to show that the tables are current and can be trusted. [This is what you could pull and how to do this.](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-use-cases-and-examples#whats-the-latest-state-of-each-model) - **Integration Points:** Discovery API - dbt Model Documentation - **Desired Action:** You wish to interpolate dbt Project Information, including model descriptions, column descriptions, etc. - - **Example:** You want to extract out the dbt model description so that you can display and help the stakeholder understand what they are selecting from. This way, the creators can easily pass on the information without updating another system. 
[This is what you could pull and how to do this.](/docs/dbt-cloud-apis/discovery-use-cases-and-examples#what-does-this-dataset-and-its-columns-mean) + - **Example:** You want to extract out the dbt model description so that you can display and help the stakeholder understand what they are selecting from. This way, the creators can easily pass on the information without updating another system. [This is what you could pull and how to do this.](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-use-cases-and-examples#what-does-this-dataset-and-its-columns-mean) - **Integration Points:** Discovery API -**dbt Core only users will have no access to the above integration points.** For dbt metadata, oftentimes our partners will create a dbt core integration by using the [dbt artifacts](/product/semantic-layer/) files generated by each run and provided by the user. With our Discovery API, we are providing a dynamic way to get the latest up to date information, parsed out for you. +**dbt Core only users will have no access to the above integration points.** For dbt metadata, oftentimes our partners will create a dbt core integration by using the [dbt artifacts](https://www.getdbt.com/product/semantic-layer/) files generated by each run and provided by the user. With our Discovery API, we are providing a dynamic way to get the latest up to date information, parsed out for you. ## dbt Cloud Plans & Permissions @@ -85,9 +83,9 @@ For authentication, we highly recommend that your integration uses account servi - What is a dbt Cloud Project? - A dbt Cloud project is made up of two connections: one to the git repository and one to the data warehouse/platform. Most customers will have only one dbt Cloud Project in their account but there are enterprise clients who might have more depending on their use cases.The project also encapsulates two types of environments at minimal: a development environment and deployment environment. 
- - Oftentimes folks refer to the [dbt Project](/docs/build/projects) as the code hosted in their git repository.
+ - Oftentimes folks refer to the [dbt Project](https://docs.getdbt.com/docs/build/projects) as the code hosted in their git repository.
- What is a dbt Cloud Environment?
- - [For an overview, check out this documentation.](/docs/environments-in-dbt) At minimal an project will have one deployment type environment that they will be executing jobs on. The development environment powers the dbt Cloud IDE and Cloud CLI.
+ - [For an overview, check out this documentation.](https://docs.getdbt.com/docs/environments-in-dbt) At a minimum, a project will have one deployment-type environment that jobs will be executed on. The development environment powers the dbt Cloud IDE and Cloud CLI.
- Can we write back to the dbt project?
- At this moment, we do not have a Write API. A dbt project is hosted in a git repository, so if you have a git provider integration, you can manually open up a Pull Request on the project to maintain the version control process.
- Can you provide column-level information in the lineage?
diff --git a/website/blog/authors.yml b/website/blog/authors.yml
index cd2bd162935..82cc300bdc8 100644
--- a/website/blog/authors.yml
+++ b/website/blog/authors.yml
@@ -1,6 +1,6 @@
amy_chen:
  image_url: /img/blog/authors/achen.png
-  job_title: Staff Partner Engineer
+  job_title: Product Partnerships Manager
  links:
    - icon: fa-linkedin
      url: https://www.linkedin.com/in/yuanamychen/
From 7b0efb2a96e769b240915fd0d4bc7c43ce4495cd Mon Sep 17 00:00:00 2001
From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com>
Date: Tue, 19 Dec 2023 13:05:53 -0500
Subject: [PATCH 012/143] add discourse link + simplify language

this pr adds a discourse link to incremental strategies discussion for large
datasets and simplifies the 'when should i use an incremental model'
paragraph/section.
--- website/docs/docs/build/incremental-models.md | 14 +++++++++----- 1 file changed, 9 insertions(+), 5 deletions(-) diff --git a/website/docs/docs/build/incremental-models.md b/website/docs/docs/build/incremental-models.md index 2a247263159..ed0e6b51f02 100644 --- a/website/docs/docs/build/incremental-models.md +++ b/website/docs/docs/build/incremental-models.md @@ -154,17 +154,21 @@ For detailed usage instructions, check out the [dbt run](/reference/commands/run # Understanding incremental models ## When should I use an incremental model? -It's often desirable to build models as tables in your data warehouse since downstream queries are more performant. While the `table` materialization also creates your models as tables, it rebuilds the table on each dbt run. These runs can become problematic in that they use a lot of compute when either: -* source data tables have millions, or even billions, of rows. -* the transformations on the source data are computationally expensive (that is, take a long time to execute), for example, complex Regex functions, or UDFs are being used to transform data. -Like many things in programming, incremental models are a trade-off between complexity and performance. While they are not as straightforward as the `view` and `table` materializations, they can lead to significantly better performance of your dbt runs. +Building models as tables in your data warehouse is often preferred for better query performance. However, using `table` materialization can be computationally intensive, especially when: + +- Source data has millions or billions of rows. +- Data transformations on the source data are computationally expensive (take a long time to execute) and complex, like using Regex or UDFs. + +Incremental models offer a balance between complexity and improved performance compared to `view` and `table` materializations and offer better performance of your dbt runs. 
+ +In addition to these considerations for incremental models, it's important to understand their limits and challenges, particularly with large datasets. For more insights into efficient strategies, performance considerations, and the handling of late-arriving data in incremental models, refer to the [On the Limits of Incrementality](https://discourse.getdbt.com/t/on-the-limits-of-incrementality/303) discourse discussion. ## Understanding the is_incremental() macro The `is_incremental()` macro will return `True` if _all_ of the following conditions are met: * the destination table already exists in the database * dbt is _not_ running in full-refresh mode -* the running model is configured with `materialized='incremental'` +* The running model is configured with `materialized='incremental'` Note that the SQL in your model needs to be valid whether `is_incremental()` evaluates to `True` or `False`. From 03eb38de5abe022ef49a96ab7899bab68c882cc6 Mon Sep 17 00:00:00 2001 From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> Date: Tue, 19 Dec 2023 13:26:37 -0500 Subject: [PATCH 013/143] Update website/docs/docs/build/incremental-models.md Co-authored-by: Matt Shaver <60105315+matthewshaver@users.noreply.github.com> --- website/docs/docs/build/incremental-models.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/build/incremental-models.md b/website/docs/docs/build/incremental-models.md index ed0e6b51f02..cc45290ae15 100644 --- a/website/docs/docs/build/incremental-models.md +++ b/website/docs/docs/build/incremental-models.md @@ -162,7 +162,7 @@ Building models as tables in your data warehouse is often preferred for better q Incremental models offer a balance between complexity and improved performance compared to `view` and `table` materializations and offer better performance of your dbt runs. 
-In addition to these considerations for incremental models, it's important to understand their limits and challenges, particularly with large datasets. For more insights into efficient strategies, performance considerations, and the handling of late-arriving data in incremental models, refer to the [On the Limits of Incrementality](https://discourse.getdbt.com/t/on-the-limits-of-incrementality/303) discourse discussion.
+In addition to these considerations for incremental models, it's important to understand their limitations and challenges, particularly with large datasets. For more insights into efficient strategies, performance considerations, and the handling of late-arriving data in incremental models, refer to the [On the Limits of Incrementality](https://discourse.getdbt.com/t/on-the-limits-of-incrementality/303) discourse discussion.

## Understanding the is_incremental() macro
The `is_incremental()` macro will return `True` if _all_ of the following conditions are met:
From eb27064c4d942eb8266aebffcf03586fafbb546e Mon Sep 17 00:00:00 2001
From: Jordan Stein
Date: Tue, 19 Dec 2023 14:15:28 -0800
Subject: [PATCH 014/143] add mf bug fixes and ambiguous resolution

---
 .../release-notes/74-Dec-2023/dec-sl-updates.md | 11 ++++++++++-
 1 file changed, 10 insertions(+), 1 deletion(-)

diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
index 8f0bdd593c7..cc40dd88461 100644
--- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
+++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
@@ -12,10 +12,19 @@ The dbt Labs team continues to work on adding new features, fixing bugs, and inc
- The dbt Semantic Layer can support `BIGINT` with precision greater than 18. Previously it would return an error.
- We fixed a memory leak that would result in intermittent errors when querying our JDBC API.
- Added support for converting various Redshift and Postgres specific data types. Previously, the driver would throw an error when encountering columns with those types.
+- Apply time offset for nested derived & ratio metrics ([#882](https://github.com/dbt-labs/metricflow/issues/882))
+- Fix Incorrect SQL Column Name Rendering for WhereConstraintNode ([#908](https://github.com/dbt-labs/metricflow/issues/908))
+- `Unable To Satisfy Query Error` with Cumulative Metrics in Saved Queries ([#917](https://github.com/dbt-labs/metricflow/issues/917))
+- Fixes a bug in dimension-only queries where the filter column is removed before the filter has been applied. ([#923](https://github.com/dbt-labs/metricflow/issues/923))
+- Bug fix: Keep where constraint column until used for nested derived offset metric queries. ([#930](https://github.com/dbt-labs/metricflow/issues/930))

## Improvements
- dbt Labs deprecated [dbt Metrics and the legacy dbt Semantic Layer](/docs/dbt-versions/release-notes/Dec-2023/legacy-sl), both supported on dbt version 1.5 or lower. This change came into effect on December 15th, 2023.
- The [dbt converter tool](https://github.com/dbt-labs/dbt-converter) can now help automate some of the work in converting from LookML (Looker's modeling language) for those who are migrating. Previously this wasn’t available.

## New features
-- Test
+- Support for ambiguous group-by-item resolution. Previously, group-by-items were input by the user in a relatively specific form. For example, the group-by-item:
+```
+guest__listing__created_at__month
+```
+refers to the created_at time dimension at a month grain that is resolved by joining the measure source to the dimension sources by the guest and listing entities. Now we handle this complexity for the user, and allow you to simply request ``listing__created_at__month``. If there is only one possible resolution, we will resolve it for the user.
If there are multiple possible resolutions, we will ask for additional user input. From 71657295a252d6ab8c97e2b346dde8b519940cc0 Mon Sep 17 00:00:00 2001 From: Pat Kearns Date: Wed, 20 Dec 2023 14:21:48 +1100 Subject: [PATCH 015/143] Update snowflake-setup.md --- .../docs/docs/core/connect-data-platform/snowflake-setup.md | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/website/docs/docs/core/connect-data-platform/snowflake-setup.md b/website/docs/docs/core/connect-data-platform/snowflake-setup.md index 2b426ef667b..d9d4aa6f3cb 100644 --- a/website/docs/docs/core/connect-data-platform/snowflake-setup.md +++ b/website/docs/docs/core/connect-data-platform/snowflake-setup.md @@ -98,7 +98,8 @@ Along with adding the `authenticator` parameter, be sure to run `alter account s ### Key Pair Authentication -To use key pair authentication, omit a `password` and instead provide a `private_key_path` and, optionally, a `private_key_passphrase` in your target. **Note:** Versions of dbt before 0.16.0 required that private keys were encrypted and a `private_key_passphrase` was provided. This behavior was changed in dbt v0.16.0. +To use key pair authentication, omit a `password` and instead provide a `private_key_path` and, optionally, a `private_key_passphrase`. +**Note:** Versions of dbt before 0.16.0 required that private keys were encrypted and a `private_key_passphrase` was provided. Since dbt 0.16.0, unencrypted private keys are allowed. Only add the passphrase if necessary. Starting from [dbt v1.5.0](/docs/dbt-versions/core), you have the option to use a `private_key` string instead of a `private_key_path`. The `private_key` string should be in either Base64-encoded DER format, representing the key bytes, or a plain-text PEM format. Refer to [Snowflake documentation](https://docs.snowflake.com/developer-guide/python-connector/python-connector-example#using-key-pair-authentication-key-pair-rotation) for more info on how they generate the key. 
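The key pair patch above notes that `private_key` accepts either Base64-encoded DER or plain-text PEM. Since a PEM body is itself Base64-encoded DER, one way to produce the single-line `private_key` value is to strip the header/footer lines and join the rest. This is a sketch, not part of the patches: it assumes OpenSSL is installed, targets an unencrypted PKCS#8 key, and the file names are illustrative.

```shell
# Generate a throwaway unencrypted PKCS#8 private key for illustration.
openssl genrsa 2048 2>/dev/null | openssl pkcs8 -topk8 -inform PEM -nocrypt -out rsa_key.p8

# A PEM body is already Base64-encoded DER: drop the -----BEGIN/END----- lines
# and join what remains into the single-line string the `private_key` field expects.
grep -v '^-----' rsa_key.p8 | tr -d '\n' > private_key.b64
```

For an encrypted key, you would keep the `private_key_passphrase` setting alongside the resulting string, as described in the patch.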
From 3f3e4378a55c5a364f9ae10001769ea152a285ae Mon Sep 17 00:00:00 2001 From: Pat Kearns Date: Wed, 20 Dec 2023 14:29:41 +1100 Subject: [PATCH 016/143] Update connect-snowflake.md --- .../docs/cloud/connect-data-platform/connect-snowflake.md | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md b/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md index 5f1c4cae725..0de67e17d9d 100644 --- a/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md +++ b/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md @@ -42,10 +42,10 @@ alter user jsmith set rsa_public_key='MIIBIjANBgkqh...'; ``` 2. Finally, set the **Private Key** and **Private Key Passphrase** fields in the **Credentials** page to finish configuring dbt Cloud to authenticate with Snowflake using a key pair. - - **Note:** At this time ONLY Encrypted Private Keys are supported by dbt Cloud, and the keys must be of size 4096 or smaller. + **Note:** Since dbt 0.16.0, unencrypted private keys are allowed. Only add the passphrase if necessary. + Starting from dbt v1.5.0, you have the option to use a private_key string instead of a private_key_path. The private_key string should be in either Base64-encoded DER format, representing the key bytes, or a plain-text PEM format. Refer to Snowflake documentation for more info on how they generate the key. -3. To successfully fill in the Private Key field, you **must** include commented lines when you add the passphrase. Leaving the **Private Key Passphrase** field empty will return an error. If you're receiving a `Could not deserialize key data` or `JWT token` error, refer to [Troubleshooting](#troubleshooting) for more info. +4. To successfully fill in the Private Key field, you **must** include commented lines. If you're receiving a `Could not deserialize key data` or `JWT token` error, refer to [Troubleshooting](#troubleshooting) for more info. 
**Example:** From 71e2cd7fe375ab4593e732ecb026317200bc3410 Mon Sep 17 00:00:00 2001 From: Pat Kearns Date: Wed, 20 Dec 2023 14:31:29 +1100 Subject: [PATCH 017/143] Update connect-snowflake.md remove v from version --- .../docs/docs/cloud/connect-data-platform/connect-snowflake.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md b/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md index 0de67e17d9d..34b69f56c27 100644 --- a/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md +++ b/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md @@ -43,7 +43,7 @@ alter user jsmith set rsa_public_key='MIIBIjANBgkqh...'; 2. Finally, set the **Private Key** and **Private Key Passphrase** fields in the **Credentials** page to finish configuring dbt Cloud to authenticate with Snowflake using a key pair. **Note:** Since dbt 0.16.0, unencrypted private keys are allowed. Only add the passphrase if necessary. - Starting from dbt v1.5.0, you have the option to use a private_key string instead of a private_key_path. The private_key string should be in either Base64-encoded DER format, representing the key bytes, or a plain-text PEM format. Refer to Snowflake documentation for more info on how they generate the key. + Starting from dbt 1.5.0, you have the option to use a private_key string instead of a private_key_path. The private_key string should be in either Base64-encoded DER format, representing the key bytes, or a plain-text PEM format. Refer to Snowflake documentation for more info on how they generate the key. 4. To successfully fill in the Private Key field, you **must** include commented lines. If you're receiving a `Could not deserialize key data` or `JWT token` error, refer to [Troubleshooting](#troubleshooting) for more info. 
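As a companion to the dbt Cloud key pair patches above: the **Private Key** field takes the full PEM including its BEGIN/END lines, while the `alter user ... set rsa_public_key='...';` statement shown earlier takes only the public key body. A sketch of extracting that body, assuming OpenSSL is installed; file names are illustrative:

```shell
# Create a key pair for illustration (unencrypted PKCS#8; encrypt it if your policy requires).
openssl genrsa 2048 2>/dev/null | openssl pkcs8 -topk8 -inform PEM -nocrypt -out rsa_key.p8
openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub

# rsa_public_key takes the body only, without the -----BEGIN/END PUBLIC KEY----- lines.
grep -v '^-----' rsa_key.pub | tr -d '\n' > rsa_public_key.txt
```

The contents of `rsa_public_key.txt` would be pasted into the `alter user` statement, while the untouched `rsa_key.p8` (header and footer included) goes into the **Private Key** field.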
From 73195850c5e8ce5e76893d1bd6defc50ad163b09 Mon Sep 17 00:00:00 2001 From: Benoit Perigaud <8754100+b-per@users.noreply.github.com> Date: Wed, 20 Dec 2023 11:08:48 +0100 Subject: [PATCH 018/143] Update spark-setup.md Fix incorrect rendering of heading --- website/docs/docs/core/connect-data-platform/spark-setup.md | 1 + 1 file changed, 1 insertion(+) diff --git a/website/docs/docs/core/connect-data-platform/spark-setup.md b/website/docs/docs/core/connect-data-platform/spark-setup.md index 93595cea3f6..992dc182b75 100644 --- a/website/docs/docs/core/connect-data-platform/spark-setup.md +++ b/website/docs/docs/core/connect-data-platform/spark-setup.md @@ -204,6 +204,7 @@ connect_retries: 3 + ### Server side configuration Spark can be customized using [Application Properties](https://spark.apache.org/docs/latest/configuration.html). Using these properties the execution can be customized, for example, to allocate more memory to the driver process. Also, the Spark SQL runtime can be set through these properties. For example, this allows the user to [set a Spark catalogs](https://spark.apache.org/docs/latest/configuration.html#spark-sql). From 458a79e0e85e7b581c55b2f3101f2ae01dcce1bf Mon Sep 17 00:00:00 2001 From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> Date: Wed, 20 Dec 2023 07:26:58 -0500 Subject: [PATCH 019/143] Update warehouse-setups-cloud-callout.md --- website/snippets/warehouse-setups-cloud-callout.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/website/snippets/warehouse-setups-cloud-callout.md b/website/snippets/warehouse-setups-cloud-callout.md index 3bc1147a637..56edd3a96ea 100644 --- a/website/snippets/warehouse-setups-cloud-callout.md +++ b/website/snippets/warehouse-setups-cloud-callout.md @@ -1,3 +1,3 @@ -:::info `profiles.yml` file is for CLI users only -If you're using dbt Cloud, you don't need to create a `profiles.yml` file. This file is only for CLI users. 
To connect your data platform to dbt Cloud, refer to [About data platforms](/docs/cloud/connect-data-platform/about-connections). +:::info `profiles.yml` file is for dbt Core users only +If you're using dbt Cloud, you don't need to create a `profiles.yml` file. This file is only for dbt Core users. To connect your data platform to dbt Cloud, refer to [About data platforms](/docs/cloud/connect-data-platform/about-connections). ::: From 49f6f1e388a333be4834b459113dbfb216b60656 Mon Sep 17 00:00:00 2001 From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> Date: Wed, 20 Dec 2023 07:27:32 -0500 Subject: [PATCH 020/143] Update spark-setup.md --- website/docs/docs/core/connect-data-platform/spark-setup.md | 4 ---- 1 file changed, 4 deletions(-) diff --git a/website/docs/docs/core/connect-data-platform/spark-setup.md b/website/docs/docs/core/connect-data-platform/spark-setup.md index 992dc182b75..9d9e0c9d5fb 100644 --- a/website/docs/docs/core/connect-data-platform/spark-setup.md +++ b/website/docs/docs/core/connect-data-platform/spark-setup.md @@ -20,10 +20,6 @@ meta: -:::note -See [Databricks setup](#databricks-setup) for the Databricks version of this page. 
-::: - import SetUpPages from '/snippets/_setup-pages-intro.md'; From be3ddf8d32e754d2c6a2478d126d5646fd4f21e3 Mon Sep 17 00:00:00 2001 From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> Date: Wed, 20 Dec 2023 07:32:37 -0500 Subject: [PATCH 021/143] Update dbt-databricks-for-databricks.md --- website/snippets/dbt-databricks-for-databricks.md | 7 ++++--- 1 file changed, 4 insertions(+), 3 deletions(-) diff --git a/website/snippets/dbt-databricks-for-databricks.md b/website/snippets/dbt-databricks-for-databricks.md index f1c5ec84af1..acb0b111aaf 100644 --- a/website/snippets/dbt-databricks-for-databricks.md +++ b/website/snippets/dbt-databricks-for-databricks.md @@ -1,4 +1,5 @@ -:::info If you're using Databricks, use `dbt-databricks` -If you're using Databricks, the `dbt-databricks` adapter is recommended over `dbt-spark`. -If you're still using dbt-spark with Databricks consider [migrating from the dbt-spark adapter to the dbt-databricks adapter](/guides/migrate-from-spark-to-databricks). +:::tip If you're using Databricks, use `dbt-databricks` +If you're using Databricks, the `dbt-databricks` adapter is recommended over `dbt-spark`. If you're still using dbt-spark with Databricks consider [migrating from the dbt-spark adapter to the dbt-databricks adapter](/guides/migrate-from-spark-to-databricks). + +For the Databricks version of this page, refer to [Databricks setup](#databricks-setup). 
::: From 9d2094267cbaf1601ed218865b748ea1c4eca5a4 Mon Sep 17 00:00:00 2001 From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> Date: Wed, 20 Dec 2023 07:33:15 -0500 Subject: [PATCH 022/143] Update website/snippets/dbt-databricks-for-databricks.md --- website/snippets/dbt-databricks-for-databricks.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/snippets/dbt-databricks-for-databricks.md b/website/snippets/dbt-databricks-for-databricks.md index acb0b111aaf..1e18da33d42 100644 --- a/website/snippets/dbt-databricks-for-databricks.md +++ b/website/snippets/dbt-databricks-for-databricks.md @@ -1,4 +1,4 @@ -:::tip If you're using Databricks, use `dbt-databricks` +:::info If you're using Databricks, use `dbt-databricks` If you're using Databricks, the `dbt-databricks` adapter is recommended over `dbt-spark`. If you're still using dbt-spark with Databricks consider [migrating from the dbt-spark adapter to the dbt-databricks adapter](/guides/migrate-from-spark-to-databricks). For the Databricks version of this page, refer to [Databricks setup](#databricks-setup). 
From b595d2cef420b5381991d70de9eb3ba2f2f765df Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Wed, 20 Dec 2023 07:51:38 -0500 Subject: [PATCH 023/143] remove dup --- website/sidebars.js | 1 - 1 file changed, 1 deletion(-) diff --git a/website/sidebars.js b/website/sidebars.js index a82b2e06ec2..23a58360bbc 100644 --- a/website/sidebars.js +++ b/website/sidebars.js @@ -135,7 +135,6 @@ const sidebarSettings = { "docs/cloud/secure/redshift-privatelink", "docs/cloud/secure/postgres-privatelink", "docs/cloud/secure/vcs-privatelink", - "docs/cloud/secure/ip-restrictions", ], }, // PrivateLink "docs/cloud/billing", From f412756ea467c99cef528af63cfde64b1927c016 Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Wed, 20 Dec 2023 08:24:04 -0500 Subject: [PATCH 024/143] add files to sidebar --- website/sidebars.js | 2 ++ 1 file changed, 2 insertions(+) diff --git a/website/sidebars.js b/website/sidebars.js index 23a58360bbc..6bb630037c1 100644 --- a/website/sidebars.js +++ b/website/sidebars.js @@ -1027,6 +1027,8 @@ const sidebarSettings = { id: "best-practices/how-we-build-our-metrics/semantic-layer-1-intro", }, items: [ + "best-practices/how-we-build-our-metrics/semantic-layer-1-intro", + "best-practices/how-we-build-our-metrics/semantic-layer-2-setup", "best-practices/how-we-build-our-metrics/semantic-layer-3-build-semantic-models", "best-practices/how-we-build-our-metrics/semantic-layer-4-build-metrics", "best-practices/how-we-build-our-metrics/semantic-layer-5-refactor-a-mart", From 6125f3f686e986a76fd3922b33e6fcfad3ca68e9 Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Wed, 20 Dec 2023 08:45:13 -0500 Subject: [PATCH 025/143] add missing pages to sidebar --- .../semantic-layer-2-setup.md | 25 ++++++++++++++++--- .../docs/docs/build/metricflow-commands.md | 11 ++++---- 2 files changed, 27 insertions(+), 9 deletions(-) diff --git a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md 
b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md index 6e9153a3780..275395f6b18 100644 --- a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md +++ b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md @@ -13,9 +13,23 @@ git clone git@github.com:dbt-labs/jaffle-sl-template.git cd path/to/project ``` -Next, before you start writing code, you need to install MetricFlow as an extension of a dbt adapter from PyPI (dbt Core users only). The MetricFlow is compatible with Python versions 3.8 through 3.11. +Next, before you start writing code, you need to install MetricFlow: -We'll use pip to install MetricFlow and our dbt adapter: + + + + +- [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) — MetricFlow commands are embedded in the dbt Cloud CLI. This means you can immediately run them once you install the dbt Cloud CLI. Using dbt Cloud means you won't need to manage versioning — your dbt Cloud account will automatically manage the versioning. + +- [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) — You can create metrics using MetricFlow in the dbt Cloud IDE. However, support for running MetricFlow commands in the IDE will be available soon. + + + + + +- Download MetricFlow as an extension of a dbt adapter from PyPI (dbt Core users only). The MetricFlow is compatible with Python versions 3.8 through 3.11. + - **Note**, you'll need to manage versioning between dbt Core, your adapter, and MetricFlow. +- We'll use pip to install MetricFlow and our dbt adapter: ```shell # activate a virtual environment for your project, @@ -27,13 +41,16 @@ python -m pip install "dbt-metricflow[adapter name]" # e.g. python -m pip install "dbt-metricflow[snowflake]" ``` -Lastly, to get to the pre-Semantic Layer starting state, checkout the `start-here` branch. 
+ + + +- Now that you're ready to use MetricFlow, get to the pre-Semantic Layer starting state by checking out the `start-here` branch: ```shell git checkout start-here ``` -For more information, refer to the [MetricFlow commands](/docs/build/metricflow-commands) or a [quickstart](/guides) to get more familiar with setting up a dbt project. +For more information, refer to the [MetricFlow commands](/docs/build/metricflow-commands) or a [quickstart guides](/guides) to get more familiar with setting up a dbt project. ## Basic commands diff --git a/website/docs/docs/build/metricflow-commands.md b/website/docs/docs/build/metricflow-commands.md index e3bb93da964..a0964269e68 100644 --- a/website/docs/docs/build/metricflow-commands.md +++ b/website/docs/docs/build/metricflow-commands.md @@ -17,15 +17,16 @@ MetricFlow is compatible with Python versions 3.8, 3.9, 3.10, and 3.11. MetricFlow is a dbt package that allows you to define and query metrics in your dbt project. You can use MetricFlow to query metrics in your dbt project in the dbt Cloud CLI, dbt Cloud IDE, or dbt Core. -**Note** — MetricFlow commands aren't supported in dbt Cloud jobs yet. However, you can add MetricFlow validations with your git provider (such as GitHub Actions) by installing MetricFlow (`python -m pip install metricflow`). This allows you to run MetricFlow commands as part of your continuous integration checks on PRs. +Using MetricFlow with dbt Cloud means you won't need to manage versioning — your dbt Cloud account will automatically manage the versioning. + +**dbt Cloud jobs** — MetricFlow commands aren't supported in dbt Cloud jobs yet. However, you can add MetricFlow validations with your git provider (such as GitHub Actions) by installing MetricFlow (`python -m pip install metricflow`). This allows you to run MetricFlow commands as part of your continuous integration checks on PRs. 
-MetricFlow commands are embedded in the dbt Cloud CLI, which means you can immediately run them once you install the dbt Cloud CLI. - -A benefit to using the dbt Cloud is that you won't need to manage versioning — your dbt Cloud account will automatically manage the versioning. +- MetricFlow commands are embedded in the dbt Cloud CLI. This means you can immediately run them once you install the dbt Cloud CLI and don't need to install MetricFlow separately. +- You don't need to manage versioning — your dbt Cloud account will automatically manage the versioning for you. @@ -35,7 +36,7 @@ A benefit to using the dbt Cloud is that you won't need to manage versioning &md You can create metrics using MetricFlow in the dbt Cloud IDE. However, support for running MetricFlow commands in the IDE will be available soon. ::: -A benefit to using the dbt Cloud is that you won't need to manage versioning — your dbt Cloud account will automatically manage the versioning. + From 9dccece8d3a0009f8af6d64b6bdc29e9b8bb8764 Mon Sep 17 00:00:00 2001 From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> Date: Wed, 20 Dec 2023 10:16:30 -0500 Subject: [PATCH 026/143] Update connect-snowflake.md From 47b0043dee4eca006013bd080603cc6e0fef4ccd Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Wed, 20 Dec 2023 14:52:21 -0500 Subject: [PATCH 027/143] tweaks --- .../74-Dec-2023/dec-sl-updates.md | 52 +++++++++++-------- 1 file changed, 31 insertions(+), 21 deletions(-) diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md index cc40dd88461..605683ed4c4 100644 --- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md +++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md @@ -1,30 +1,40 @@ --- -title: "Updates and fixes: dbt Semantic Layer and MetricFlow updates for December 2023." 
+title: "dbt Semantic Layer and MetricFlow updates for December 2023" description: "December 2023: Enhanced Tableau integration, BIGINT support, LookML to MetricFlow conversion, and deprecation of legacy features." -sidebar_label: "Update ad fixes: dbt Semantic Layer and MetricFlow." +sidebar_label: "Update ad fixes: dbt Semantic Layer and MetricFlow" sidebar_position: 08 date: 2023-12-22 --- -The dbt Labs team continues to work on adding new features, fixing bugs, and increasing reliability for the dbt Semantic Layer and MetricFlow. Here are the updates and fixes for December 2023. - -## Bug fixes -- The dbt Semantic Layer integration with Tableau now supports queries that resolve to a "NOT IN" clause (for example: using "exclude" in the filtering user interface). Previously it wasn’t supported. -- The dbt Semantic Layer can support `BIGINT` with precision greater than 18. Previously it would return an error. -- We fixed a memory leak that would amount in intermittent errors when querying our JDBC API. -- Added support for converting various Redshift and Postgres specific data types. Previously, the driver would throw an error when encountering columns with those types. -- Apply time offset for nested dervied & ratio metrics ([#882](https://github.com/dbt-labs/metricflow/issues/882)) -- Fix Incorrect SQL Column Name Rendering for WhereConstraintNode ([#908](https://github.com/dbt-labs/metricflow/issues/908)) -- `Unable To Satisfy Query Error` with Cumulative Metrics in Saved Queries ([#917](https://github.com/dbt-labs/metricflow/issues/917)) -- Fixes a bug in dimension-only queries where the filter column is removed before the filter has been applied. ([#923](https://github.com/dbt-labs/metricflow/issues/923)) -- Bug fix: Keep where constraint column until used for nested derived offset metric queries. 
([#930](https://github.com/dbt-labs/metricflow/issues/930)) +The dbt Labs team continues to work on adding new features, fixing bugs, and increasing reliability for the dbt Semantic Layer and MetricFlow. + +Refer to the following updates and fixes for December 2023: + +## gBug fixes + +The following are fixes for the dbt Semantic Layer and MetricFlow: + +**dbt Semantic Layer** + +- Tableau integration — The dbt Semantic Layer integration with Tableau now supports queries that resolve to a "NOT IN" clause. This applies to using "exclude" in the filtering user interface. Previously it wasn’t supported. +- `BIGINT` support — The dbt Semantic Layer can now support `BIGINT` values with precision greater than 18. Previously it would return an error. +- Memory leak — We fixed a memory leak in the JDBC API that would previously lead to intermittent errors when querying it. +- Data conversion support — Added support for converting various Redshift and Postgres-specific data types. Previously, the driver would throw an error when encountering columns with those types. + +**MetricFlow** + +- Time offset for nested metrics — Implemented time offset for nested derived and ratio metrics. ([MetricFlow Issue #882](https://github.com/dbt-labs/metricflow/issues/882)) +- SQL column name rendering: — Fixed incorrect SQL column name rendering in `WhereConstraintNode`. ([MetricFlow Issue #908](https://github.com/dbt-labs/metricflow/issues/908)) +- Cumulative metrics query error — Fixed the `Unable To Satisfy Query` error with cumulative metrics in Saved Queries. ([MetricFlow Issue #917](https://github.com/dbt-labs/metricflow/issues/917)) +- Dimension-only query — Fixes a bug in dimension-only queries where the filter column is removed before the filter has been applied. ([MetricFlow Issue #923](https://github.com/dbt-labs/metricflow/issues/923)) +- Where constraint column — Ensured retention of the where constraint column until used for nested derived offset metric queries. 
([MetricFlow Issue #930](https://github.com/dbt-labs/metricflow/issues/930)) ## Improvements -- dbt Labs deprecated [dbt Metrics and the legacy dbt Semantic Layer](/docs/dbt-versions/release-notes/Dec-2023/legacy-sl), both supported on dbt version 1.5 or lower. This change came into effect on December 15th, 2023. -- The [dbt converter tool](https://github.com/dbt-labs/dbt-converter) can now help automate some of the work in converting from LookML (Looker's modeling language) for those who are migrating. Previously this wasn’t available. + +- Deprecation — We deprecated [dbt Metrics and the legacy dbt Semantic Layer](/docs/dbt-versions/release-notes/Dec-2023/legacy-sl), both supported on dbt version 1.5 or lower. This change came into effect on December 15th, 2023. +- Improved dbt converter tool — The [dbt converter tool](https://github.com/dbt-labs/dbt-converter) can now help automate some of the work in converting from LookML (Looker's modeling language) for those who are migrating. Previously this wasn’t available. ## New features -- Support for ambiguous group-by-item resolution. Previously, group-by-items were input by the user in a relatively specific form. For example, the group-by-item: -``` -guest__listing__created_at__month -``` -refers to the created_at time dimension at a month grain that is resolved by joining the measure source to the dimension sources by the guest and listing entities. Now we handle this complexity for the user, and allow you to simply request ``listing__created_at__month``. If there is only one possible resolution, we will resolve it for the user. If there are multiple possible resolutions, we will ask for additional user input. + +- Simplified group-by-item requests — Improved support for ambiguous group-by-item resolution. Previously, you need to specify them in detail, like `guest__listing__created_at__month`. This indicates a monthly `created_at` time dimension, linked by `guest` and `listing` entities. 
+ + Now you can use a shorter form, like ` listing__created_at__month`. If there's only one way to interpret this, dbt will resolve it automatically. If multiple interpretations are possible, dbt will ask for more details from the user. From 92ac11846dbadb4642bf5e510a8f14339b55fea7 Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Wed, 20 Dec 2023 14:53:34 -0500 Subject: [PATCH 028/143] consistent language --- .../release-notes/74-Dec-2023/dec-sl-updates.md | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md index 605683ed4c4..6895dce1f1a 100644 --- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md +++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md @@ -9,7 +9,7 @@ The dbt Labs team continues to work on adding new features, fixing bugs, and inc Refer to the following updates and fixes for December 2023: -## gBug fixes +## Bug fixes The following are fixes for the dbt Semantic Layer and MetricFlow: @@ -17,7 +17,7 @@ The following are fixes for the dbt Semantic Layer and MetricFlow: - Tableau integration — The dbt Semantic Layer integration with Tableau now supports queries that resolve to a "NOT IN" clause. This applies to using "exclude" in the filtering user interface. Previously it wasn’t supported. - `BIGINT` support — The dbt Semantic Layer can now support `BIGINT` values with precision greater than 18. Previously it would return an error. -- Memory leak — We fixed a memory leak in the JDBC API that would previously lead to intermittent errors when querying it. +- Memory leak — Fixed a memory leak in the JDBC API that would previously lead to intermittent errors when querying it. - Data conversion support — Added support for converting various Redshift and Postgres-specific data types. 
Previously, the driver would throw an error when encountering columns with those types. **MetricFlow** @@ -25,7 +25,7 @@ The following are fixes for the dbt Semantic Layer and MetricFlow: - Time offset for nested metrics — Implemented time offset for nested derived and ratio metrics. ([MetricFlow Issue #882](https://github.com/dbt-labs/metricflow/issues/882)) - SQL column name rendering: — Fixed incorrect SQL column name rendering in `WhereConstraintNode`. ([MetricFlow Issue #908](https://github.com/dbt-labs/metricflow/issues/908)) - Cumulative metrics query error — Fixed the `Unable To Satisfy Query` error with cumulative metrics in Saved Queries. ([MetricFlow Issue #917](https://github.com/dbt-labs/metricflow/issues/917)) -- Dimension-only query — Fixes a bug in dimension-only queries where the filter column is removed before the filter has been applied. ([MetricFlow Issue #923](https://github.com/dbt-labs/metricflow/issues/923)) +- Dimension-only query — Fixed a bug in dimension-only queries where the filter column is removed before the filter has been applied. ([MetricFlow Issue #923](https://github.com/dbt-labs/metricflow/issues/923)) - Where constraint column — Ensured retention of the where constraint column until used for nested derived offset metric queries. 
([MetricFlow Issue #930](https://github.com/dbt-labs/metricflow/issues/930)) ## Improvements From 89aeae466e9b3cad71ce85cbc1ec4672663c4be8 Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Wed, 20 Dec 2023 14:54:31 -0500 Subject: [PATCH 029/143] typo --- .../dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md index 6895dce1f1a..8fce5b837c7 100644 --- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md +++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md @@ -1,7 +1,7 @@ --- title: "dbt Semantic Layer and MetricFlow updates for December 2023" description: "December 2023: Enhanced Tableau integration, BIGINT support, LookML to MetricFlow conversion, and deprecation of legacy features." -sidebar_label: "Update ad fixes: dbt Semantic Layer and MetricFlow" +sidebar_label: "Update and fixes: dbt Semantic Layer and MetricFlow" sidebar_position: 08 date: 2023-12-22 --- From 27cd55515fa34e88666ce3e1290fd89cd7708818 Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Wed, 20 Dec 2023 14:56:13 -0500 Subject: [PATCH 030/143] tweak --- .../dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md index 8fce5b837c7..96a1e20fc6b 100644 --- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md +++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md @@ -11,7 +11,7 @@ Refer to the following updates and fixes for December 2023: ## Bug fixes -The following are fixes for the dbt Semantic Layer and MetricFlow: +The following are updates for the dbt Semantic Layer 
and MetricFlow: **dbt Semantic Layer** From 0d4ec7716f917b12f54102b40e08a68dbda468cb Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:45:35 -0500 Subject: [PATCH 031/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index f51181bf588..df42765825d 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -13,7 +13,7 @@ is_featured: false --- ## Overview -Over the course of my 3 years running the Partner Engineering team at dbt Labs, the most common question I have been asked is “How do we integrate with dbt?”. Because those conversations often start out at the same place, I decided to create this guide so I’m no longer the blocker to fundamental information. This also allows us to skip the intro and get to the fun conversations like what a joint solution for our customers would look like so much faster. +Over the course of my three years running the Partner Engineering team at dbt Labs, the most common question I've been asked is, How do we integrate with dbt? Because those conversations often start out at the same place, I decided to create this guide so I’m no longer the blocker to fundamental information. This also allows us to skip the intro and get to the fun conversations so much faster, like what a joint solution for our customers would look like. Now this guide does not include how to integrate with dbt Core. 
If you’re interested in creating an dbt Adapter, **[please check out this documentation instead.](https://docs.getdbt.com/guides/dbt-ecosystem/adapter-development/1-what-are-adapters)** From 44bf2cf36d86912636a47af580551b5f2165c711 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:45:53 -0500 Subject: [PATCH 032/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index df42765825d..d589bf76dd4 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -17,7 +17,7 @@ Over the course of my three years running the Partner Engineering team at dbt La Now this guide does not include how to integrate with dbt Core. If you’re interested in creating an dbt Adapter, **[please check out this documentation instead.](https://docs.getdbt.com/guides/dbt-ecosystem/adapter-development/1-what-are-adapters)** -Instead we are going to focus on integrating with dbt Cloud. Integrating with dbt Cloud is a key requirement to become a dbt Labs technology partner, opening the door to a variety of collaborative commercial opportunities. +Instead, we're going to focus on integrating with dbt Cloud. Integrating with dbt Cloud is a key requirement to become a dbt Labs technology partner, opening the door to a variety of collaborative commercial opportunities. Here I will cover how to get started, potential use cases you want to solve for, and points of integrations to do so. 
From c28ecb48016971a6e28e61c476ae12bfae85a550 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:46:10 -0500 Subject: [PATCH 033/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index d589bf76dd4..5353ca996fd 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -19,7 +19,7 @@ Now this guide does not include how to integrate with dbt Core. If you’re inte Instead, we're going to focus on integrating with dbt Cloud. Integrating with dbt Cloud is a key requirement to become a dbt Labs technology partner, opening the door to a variety of collaborative commercial opportunities. -Here I will cover how to get started, potential use cases you want to solve for, and points of integrations to do so. +Here I'll cover how to get started, potential use cases you want to solve for, and points of integrations to do so. ## New to dbt Cloud? 
From 402aee56cf622a0ef9d8d12c53afd3a1db2e249e Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:46:31 -0500 Subject: [PATCH 034/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 5353ca996fd..a48f18ef7cc 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -23,7 +23,7 @@ Here I'll cover how to get started, potential use cases you want to solve for, a ## New to dbt Cloud? -If you're new to dbt and dbt Cloud, we recommend you and your software developers try our [Getting Started Quickstarts](https://docs.getdbt.com/quickstarts) after reading [What is dbt?](https://docs.getdbt.com/docs/introduction). The documentation will help you familiarize yourself with how our users interact with dbt. By going through this, you will also create a sample dbt project to test your integration. +If you're new to dbt and dbt Cloud, we recommend you and your software developers try our [Getting Started Quickstarts](/guides) after reading [What is dbt?](/docs/introduction). The documentation will help you familiarize yourself with how our users interact with dbt. By going through this, you will also create a sample dbt project to test your integration. If you require a partner dbt Cloud account to test on, we can upgrade an existing account or a trial account. **This account may only be used for development, training, and demonstration purposes.** Please speak to your partner manager if you're interested and provide the account id (provided in the URL). 
Our partner account has all of the enterprise level functionality and can be provided with a signed partnerships agreement. From 434e7c0ce23345b6d73146fd3ba7875df6751e4f Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:46:48 -0500 Subject: [PATCH 035/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index a48f18ef7cc..0db135cf9bf 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -27,7 +27,7 @@ If you're new to dbt and dbt Cloud, we recommend you and your software developer If you require a partner dbt Cloud account to test on, we can upgrade an existing account or a trial account. **This account may only be used for development, training, and demonstration purposes.** Please speak to your partner manager if you're interested and provide the account id (provided in the URL). Our partner account has all of the enterprise level functionality and can be provided with a signed partnerships agreement. -## Integration Points +## Integration points - [Discovery API (formerly referred to as Metadata API)](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-api) - **Overview**: This GraphQL API allows you to query the metadata that dbt Cloud generates every time you run a dbt Project. We have two schemas available (environment and job level). By default, we always recommend that you integrate with the environment level schema because it contains the latest state and historical run results of all the jobs run on the dbt Cloud project. 
The job level will only provide you the metadata of one job, giving you only a small snapshot of part of the project. From 74e6fbbd25ad0943ad354c0a406833e7738d5fb7 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:47:13 -0500 Subject: [PATCH 036/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 0db135cf9bf..85424a1219a 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -29,7 +29,7 @@ If you require a partner dbt Cloud account to test on, we can upgrade an existin ## Integration points -- [Discovery API (formerly referred to as Metadata API)](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-api) +- [Discovery API (formerly referred to as Metadata API)](/docs/dbt-cloud-apis/discovery-api) - **Overview**: This GraphQL API allows you to query the metadata that dbt Cloud generates every time you run a dbt Project. We have two schemas available (environment and job level). By default, we always recommend that you integrate with the environment level schema because it contains the latest state and historical run results of all the jobs run on the dbt Cloud project. The job level will only provide you the metadata of one job, giving you only a small snapshot of part of the project. - [Administrative API (also referred to as the Admin API)](https://docs.getdbt.com/docs/dbt-cloud-apis/admin-cloud-api) - **Overview:** This REST API allows you to orchestrate dbt Cloud jobs runs and help you administer a dbt Cloud account. For metadata retrieval, we recommend integrating with the Discovery API instead. 
From 6d215a8d793ba675cb53242de01eda58e7c4559c Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:47:28 -0500 Subject: [PATCH 037/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 85424a1219a..ec70e770431 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -25,7 +25,7 @@ Here I'll cover how to get started, potential use cases you want to solve for, a If you're new to dbt and dbt Cloud, we recommend you and your software developers try our [Getting Started Quickstarts](/guides) after reading [What is dbt?](/docs/introduction). The documentation will help you familiarize yourself with how our users interact with dbt. By going through this, you will also create a sample dbt project to test your integration. -If you require a partner dbt Cloud account to test on, we can upgrade an existing account or a trial account. **This account may only be used for development, training, and demonstration purposes.** Please speak to your partner manager if you're interested and provide the account id (provided in the URL). Our partner account has all of the enterprise level functionality and can be provided with a signed partnerships agreement. +If you require a partner dbt Cloud account to test on, we can upgrade an existing account or a trial account. This account may only be used for development, training, and demonstration purposes. Please contact your partner manager if you're interested and provide the account ID (provided in the URL). 
Our partner account includes all of the enterprise level functionality and can be provided with a signed partnerships agreement. ## Integration points From 74a53e22e95c1bc38c493e6d1cb01b65c2d3aa77 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:49:55 -0500 Subject: [PATCH 038/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index ec70e770431..af93fbdae34 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -30,7 +30,7 @@ If you require a partner dbt Cloud account to test on, we can upgrade an existin ## Integration points - [Discovery API (formerly referred to as Metadata API)](/docs/dbt-cloud-apis/discovery-api) - - **Overview**: This GraphQL API allows you to query the metadata that dbt Cloud generates every time you run a dbt Project. We have two schemas available (environment and job level). By default, we always recommend that you integrate with the environment level schema because it contains the latest state and historical run results of all the jobs run on the dbt Cloud project. The job level will only provide you the metadata of one job, giving you only a small snapshot of part of the project. + - **Overview** — This GraphQL API allows you to query the metadata that dbt Cloud generates every time you run a dbt project. We have two schemas available (environment and job level). By default, we always recommend that you integrate with the environment level schema because it contains the latest state and historical run results of all the jobs run on the dbt Cloud project. 
The job level will only provide you the metadata of one job, giving you only a small snapshot of part of the project. - [Administrative API (also referred to as the Admin API)](https://docs.getdbt.com/docs/dbt-cloud-apis/admin-cloud-api) - **Overview:** This REST API allows you to orchestrate dbt Cloud jobs runs and help you administer a dbt Cloud account. For metadata retrieval, we recommend integrating with the Discovery API instead. - Webhooks From ec77ce086f441682332a8924f54799fcea6160b3 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:50:14 -0500 Subject: [PATCH 039/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index af93fbdae34..491b88acc12 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -15,7 +15,7 @@ is_featured: false Over the course of my three years running the Partner Engineering team at dbt Labs, the most common question I've been asked is, How do we integrate with dbt? Because those conversations often start out at the same place, I decided to create this guide so I’m no longer the blocker to fundamental information. This also allows us to skip the intro and get to the fun conversations so much faster, like what a joint solution for our customers would look like. -Now this guide does not include how to integrate with dbt Core. If you’re interested in creating an dbt Adapter, **[please check out this documentation instead.](https://docs.getdbt.com/guides/dbt-ecosystem/adapter-development/1-what-are-adapters)** +This guide doesn't include how to integrate with dbt Core. 
If you’re interested in creating a dbt adapter, please check out the [adapter development guide](/guides/dbt-ecosystem/adapter-development/1-what-are-adapters) instead. Instead, we're going to focus on integrating with dbt Cloud. Integrating with dbt Cloud is a key requirement to become a dbt Labs technology partner, opening the door to a variety of collaborative commercial opportunities. From d652d75ee0ed3e3c815748e6da6f04750061dca5 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:50:32 -0500 Subject: [PATCH 040/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 491b88acc12..f4e06eb6fa6 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -33,7 +33,7 @@ If you require a partner dbt Cloud account to test on, we can upgrade an existin - **Overview** — This GraphQL API allows you to query the metadata that dbt Cloud generates every time you run a dbt project. We have two schemas available (environment and job level). By default, we always recommend that you integrate with the environment level schema because it contains the latest state and historical run results of all the jobs run on the dbt Cloud project. The job level will only provide you the metadata of one job, giving you only a small snapshot of part of the project. - [Administrative API (also referred to as the Admin API)](https://docs.getdbt.com/docs/dbt-cloud-apis/admin-cloud-api) - **Overview:** This REST API allows you to orchestrate dbt Cloud jobs runs and help you administer a dbt Cloud account. 
For metadata retrieval, we recommend integrating with the Discovery API instead. -- Webhooks +- [Webhooks](/docs/deploy/webhooks) - **Overview:** Outbound webhooks can send notifications about your dbt Cloud jobs to other systems. These webhooks allow you to get the latest information on your dbt jobs in real time. - [Link to documentation](https://docs.getdbt.com/docs/deploy/webhooks) - Semantic Layers/Metrics From bd6426d970aba27c9101e2305921e2aada1aa43d Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:50:47 -0500 Subject: [PATCH 041/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index f4e06eb6fa6..7ad14063a29 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -34,7 +34,7 @@ If you require a partner dbt Cloud account to test on, we can upgrade an existin - [Administrative API (also referred to as the Admin API)](https://docs.getdbt.com/docs/dbt-cloud-apis/admin-cloud-api) - **Overview:** This REST API allows you to orchestrate dbt Cloud jobs runs and help you administer a dbt Cloud account. For metadata retrieval, we recommend integrating with the Discovery API instead. - [Webhooks](/docs/deploy/webhooks) - - **Overview:** Outbound webhooks can send notifications about your dbt Cloud jobs to other systems. These webhooks allow you to get the latest information on your dbt jobs in real time. + - **Overview** — Outbound webhooks can send notifications about your dbt Cloud jobs to other systems. These webhooks allow you to get the latest information about your dbt jobs in real time. 
- [Link to documentation](https://docs.getdbt.com/docs/deploy/webhooks) - Semantic Layers/Metrics - **Overview: Our Semantic Layer is made up of two parts: metrics definitions and the ability to interactively query the dbt metrics. For more details, here is a [basic overview](https://docs.getdbt.com/docs/use-dbt-semantic-layer/dbt-sl) and [our best practices](https://docs.getdbt.com/guides/dbt-ecosystem/sl-partner-integration-guide).** From d29101c0776373d7149e12691e441fa08afc5067 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:51:02 -0500 Subject: [PATCH 042/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 7ad14063a29..03af0c4e3d0 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -32,7 +32,7 @@ If you require a partner dbt Cloud account to test on, we can upgrade an existin - [Discovery API (formerly referred to as Metadata API)](/docs/dbt-cloud-apis/discovery-api) - **Overview** — This GraphQL API allows you to query the metadata that dbt Cloud generates every time you run a dbt project. We have two schemas available (environment and job level). By default, we always recommend that you integrate with the environment level schema because it contains the latest state and historical run results of all the jobs run on the dbt Cloud project. The job level will only provide you the metadata of one job, giving you only a small snapshot of part of the project. 
- [Administrative API (also referred to as the Admin API)](https://docs.getdbt.com/docs/dbt-cloud-apis/admin-cloud-api)
-  - **Overview:** This REST API allows you to orchestrate dbt Cloud jobs runs and help you administer a dbt Cloud account. For metadata retrieval, we recommend integrating with the Discovery API instead.
+  - **Overview** — This REST API allows you to orchestrate dbt Cloud job runs and helps you administer a dbt Cloud account. For metadata retrieval, we recommend integrating with the Discovery API instead.
- [Webhooks](/docs/deploy/webhooks)
  - **Overview** — Outbound webhooks can send notifications about your dbt Cloud jobs to other systems. These webhooks allow you to get the latest information about your dbt jobs in real time.
  - [Link to documentation](https://docs.getdbt.com/docs/deploy/webhooks)

From dce49403d18df90d11561cd296315960b745864f Mon Sep 17 00:00:00 2001
From: Amy Chen <46451573+amychen1776@users.noreply.github.com>
Date: Wed, 20 Dec 2023 15:51:18 -0500
Subject: [PATCH 043/143] Update website/blog/2023-12-20-partner-integration-guide.md

Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com>
---
 website/blog/2023-12-20-partner-integration-guide.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md
index 7ad14063a29..8891f89cd82 100644
--- a/website/blog/2023-12-20-partner-integration-guide.md
+++ b/website/blog/2023-12-20-partner-integration-guide.md
@@ -36,7 +36,7 @@ If you require a partner dbt Cloud account to test on, we can upgrade an existin
 - [Webhooks](/docs/deploy/webhooks)
   - **Overview** — Outbound webhooks can send notifications about your dbt Cloud jobs to other systems. These webhooks allow you to get the latest information about your dbt jobs in real time.
- [Link to documentation](https://docs.getdbt.com/docs/deploy/webhooks) -- Semantic Layers/Metrics +- [Semantic Layers/Metrics](/docs/dbt-cloud-apis/sl-api-overview) - **Overview: Our Semantic Layer is made up of two parts: metrics definitions and the ability to interactively query the dbt metrics. For more details, here is a [basic overview](https://docs.getdbt.com/docs/use-dbt-semantic-layer/dbt-sl) and [our best practices](https://docs.getdbt.com/guides/dbt-ecosystem/sl-partner-integration-guide).** - Metrics definitions can be pulled from the Discovery API (linked above) or the Semantic Layer Driver/GraphQL API. The key difference is the Discovery API is not able to pull the semantic graph which provides the list of available dimensions that one can query per metric. That is only available via the SL Driver/APIs. The tradeoff is the SL Driver/APIs does not have access to the lineage of the entire dbt project (i.e how the dbt metrics dependencies on dbt models) - [We have three available integration points for the Semantic Layer API.](https://docs.getdbt.com/docs/dbt-cloud-apis/sl-api-overview) From e7a5cfcbf4497acd2ff82c03bbd9d0792835d571 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:51:36 -0500 Subject: [PATCH 044/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 8891f89cd82..bd95135bf2e 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -38,7 +38,7 @@ If you require a partner dbt Cloud account to test on, we can upgrade an existin - [Link to 
documentation](https://docs.getdbt.com/docs/deploy/webhooks)
- [Semantic Layers/Metrics](/docs/dbt-cloud-apis/sl-api-overview)
  - **Overview: Our Semantic Layer is made up of two parts: metrics definitions and the ability to interactively query the dbt metrics. For more details, here is a [basic overview](https://docs.getdbt.com/docs/use-dbt-semantic-layer/dbt-sl) and [our best practices](https://docs.getdbt.com/guides/dbt-ecosystem/sl-partner-integration-guide).**
-  - Metrics definitions can be pulled from the Discovery API (linked above) or the Semantic Layer Driver/GraphQL API. The key difference is the Discovery API is not able to pull the semantic graph which provides the list of available dimensions that one can query per metric. That is only available via the SL Driver/APIs. The tradeoff is the SL Driver/APIs does not have access to the lineage of the entire dbt project (i.e how the dbt metrics dependencies on dbt models)
+  - Metrics definitions can be pulled from the Discovery API (linked above) or the Semantic Layer Driver/GraphQL API. The key difference is that the Discovery API isn't able to pull the semantic graph, which provides the list of available dimensions that one can query per metric. That is only available with the SL Driver/APIs. The trade-off is that the SL Driver/APIs doesn't have access to the lineage of the entire dbt project (that is, how the dbt metrics depend on dbt models).
- [We have three available integration points for the Semantic Layer API.](https://docs.getdbt.com/docs/dbt-cloud-apis/sl-api-overview) ## dbt Cloud Hosting and Authentication From 8c2b058c0815d11ba6db37883f0ed900757bd316 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:51:59 -0500 Subject: [PATCH 045/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index bd95135bf2e..2e03723719f 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -41,7 +41,7 @@ If you require a partner dbt Cloud account to test on, we can upgrade an existin - Metrics definitions can be pulled from the Discovery API (linked above) or the Semantic Layer Driver/GraphQL API. The key difference is that the Discovery API isn't able to pull the semantic graph, which provides the list of available dimensions that one can query per metric. That is only available with the SL Driver/APIs. The trade-off is that the SL Driver/APIs doesn't have access to the lineage of the entire dbt project (that is, how the dbt metrics dependencies on dbt models). - [We have three available integration points for the Semantic Layer API.](https://docs.getdbt.com/docs/dbt-cloud-apis/sl-api-overview) -## dbt Cloud Hosting and Authentication +## dbt Cloud hosting and authentication To use the dbt Cloud APIs, you will need access to the customer’s access urls. Depending on their dbt Cloud setup, they will have a different access url. 
To find out more, here is the [documentation](https://docs.getdbt.com/docs/cloud/about-cloud/regions-ip-addresses) to understand all the possible configurations. My recommendation is to allow the customer to provide their own url to simplify support. From 294a6aefc70b741de2cae76477b473975c7b40f4 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:52:16 -0500 Subject: [PATCH 046/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 2e03723719f..9940ade8c69 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -37,7 +37,7 @@ If you require a partner dbt Cloud account to test on, we can upgrade an existin - **Overview** — Outbound webhooks can send notifications about your dbt Cloud jobs to other systems. These webhooks allow you to get the latest information about your dbt jobs in real time. - [Link to documentation](https://docs.getdbt.com/docs/deploy/webhooks) - [Semantic Layers/Metrics](/docs/dbt-cloud-apis/sl-api-overview) - - **Overview: Our Semantic Layer is made up of two parts: metrics definitions and the ability to interactively query the dbt metrics. For more details, here is a [basic overview](https://docs.getdbt.com/docs/use-dbt-semantic-layer/dbt-sl) and [our best practices](https://docs.getdbt.com/guides/dbt-ecosystem/sl-partner-integration-guide).** + - **Overview** — Our Semantic Layer is made up of two parts: metrics definitions and the ability to interactively query the dbt metrics. 
For more details, here is a [basic overview](/docs/use-dbt-semantic-layer/dbt-sl) and [our best practices](/guides/dbt-ecosystem/sl-partner-integration-guide). - Metrics definitions can be pulled from the Discovery API (linked above) or the Semantic Layer Driver/GraphQL API. The key difference is that the Discovery API isn't able to pull the semantic graph, which provides the list of available dimensions that one can query per metric. That is only available with the SL Driver/APIs. The trade-off is that the SL Driver/APIs doesn't have access to the lineage of the entire dbt project (that is, how the dbt metrics dependencies on dbt models). - [We have three available integration points for the Semantic Layer API.](https://docs.getdbt.com/docs/dbt-cloud-apis/sl-api-overview) From 27b085a7a5b428e860db900d9c7f2046ada82ac5 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:52:32 -0500 Subject: [PATCH 047/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 9940ade8c69..38c3e9b6d6c 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -39,7 +39,7 @@ If you require a partner dbt Cloud account to test on, we can upgrade an existin - [Semantic Layers/Metrics](/docs/dbt-cloud-apis/sl-api-overview) - **Overview** — Our Semantic Layer is made up of two parts: metrics definitions and the ability to interactively query the dbt metrics. For more details, here is a [basic overview](/docs/use-dbt-semantic-layer/dbt-sl) and [our best practices](/guides/dbt-ecosystem/sl-partner-integration-guide). 
- Metrics definitions can be pulled from the Discovery API (linked above) or the Semantic Layer Driver/GraphQL API. The key difference is that the Discovery API isn't able to pull the semantic graph, which provides the list of available dimensions that one can query per metric. That is only available with the SL Driver/APIs. The trade-off is that the SL Driver/APIs doesn't have access to the lineage of the entire dbt project (that is, how the dbt metrics dependencies on dbt models). - - [We have three available integration points for the Semantic Layer API.](https://docs.getdbt.com/docs/dbt-cloud-apis/sl-api-overview) + - Three integration points are available for the Semantic Layer API. ## dbt Cloud hosting and authentication From efe6c44a886fe5efa3fcf6ea04d54fe920d529f2 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:52:52 -0500 Subject: [PATCH 048/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 38c3e9b6d6c..727c1187542 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -31,7 +31,7 @@ If you require a partner dbt Cloud account to test on, we can upgrade an existin - [Discovery API (formerly referred to as Metadata API)](/docs/dbt-cloud-apis/discovery-api) - **Overview** — This GraphQL API allows you to query the metadata that dbt Cloud generates every time you run a dbt project. We have two schemas available (environment and job level). 
By default, we always recommend that you integrate with the environment level schema because it contains the latest state and historical run results of all the jobs run on the dbt Cloud project. The job level will only provide you the metadata of one job, giving you only a small snapshot of part of the project. -- [Administrative API (also referred to as the Admin API)](https://docs.getdbt.com/docs/dbt-cloud-apis/admin-cloud-api) +- [Administrative (Admin) API](/docs/dbt-cloud-apis/admin-cloud-api) - **Overview** — This REST API allows you to orchestrate dbt Cloud jobs runs and help you administer a dbt Cloud account. For metadata retrieval, we recommend integrating with the Discovery API instead. - [Webhooks](/docs/deploy/webhooks) - **Overview** — Outbound webhooks can send notifications about your dbt Cloud jobs to other systems. These webhooks allow you to get the latest information about your dbt jobs in real time. From 1b6cc262871831682f14464d5a66863305d466fa Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:55:55 -0500 Subject: [PATCH 049/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 727c1187542..1abe7b396d4 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -43,7 +43,7 @@ If you require a partner dbt Cloud account to test on, we can upgrade an existin ## dbt Cloud hosting and authentication -To use the dbt Cloud APIs, you will need access to the customer’s access urls. Depending on their dbt Cloud setup, they will have a different access url. 
To find out more, here is the [documentation](https://docs.getdbt.com/docs/cloud/about-cloud/regions-ip-addresses) to understand all the possible configurations. My recommendation is to allow the customer to provide their own url to simplify support. +To use the dbt Cloud APIs, you'll need access to the customer’s access urls. Depending on their dbt Cloud setup, they'll have a different access URL. To find out more, refer to [Regions & IP addresses](/docs/cloud/about-cloud/regions-ip-addresses) to understand all the possible configurations. My recommendation is to allow the customer to provide their own URL to simplify support. If the customer is on an Azure Single Tenant instance, they do not currently have access to the Discovery API or the Semantic Layer APIs. From 325c554b4f14eb1aaf7baff221d6e82cb2538e85 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:57:49 -0500 Subject: [PATCH 050/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 1abe7b396d4..9207dad4d55 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -45,7 +45,7 @@ If you require a partner dbt Cloud account to test on, we can upgrade an existin To use the dbt Cloud APIs, you'll need access to the customer’s access urls. Depending on their dbt Cloud setup, they'll have a different access URL. To find out more, refer to [Regions & IP addresses](/docs/cloud/about-cloud/regions-ip-addresses) to understand all the possible configurations. My recommendation is to allow the customer to provide their own URL to simplify support. 
-If the customer is on an Azure Single Tenant instance, they do not currently have access to the Discovery API or the Semantic Layer APIs. +If the customer is on an Azure single tenant instance, they don't currently have access to the Discovery API or the Semantic Layer APIs. For authentication, we highly recommend that your integration uses account service tokens. You can read more about how to create a service token and what permission sets to provide it [here](https://docs.getdbt.com/docs/dbt-cloud-apis/service-tokens). Please note depending on their plan type, they will have access to different permission sets. We **do not** recommend that users supply their user bearer tokens for authentication. This can cause issues if the user leaves the organization and provides you access to all the dbt Cloud accounts associated to the user rather than just the account (and related projects) that they want to integrate with. From 151e4da6d1707694e009fe69186174be927950d2 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:59:04 -0500 Subject: [PATCH 051/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 9207dad4d55..a01430e38b8 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -47,7 +47,7 @@ To use the dbt Cloud APIs, you'll need access to the customer’s access urls. D If the customer is on an Azure single tenant instance, they don't currently have access to the Discovery API or the Semantic Layer APIs. -For authentication, we highly recommend that your integration uses account service tokens. 
You can read more about how to create a service token and what permission sets to provide it [here](https://docs.getdbt.com/docs/dbt-cloud-apis/service-tokens). Please note depending on their plan type, they will have access to different permission sets. We **do not** recommend that users supply their user bearer tokens for authentication. This can cause issues if the user leaves the organization and provides you access to all the dbt Cloud accounts associated to the user rather than just the account (and related projects) that they want to integrate with. +For authentication, we highly recommend that your integration uses account service tokens. You can read more about [how to create a service token and what permission sets to provide it](/docs/dbt-cloud-apis/service-tokens). Please note that depending on their plan type, they'll have access to different permission sets. We _do not_ recommend that users supply their user bearer tokens for authentication. This can cause issues if the user leaves the organization and provides you access to all the dbt Cloud accounts associated to the user rather than just the account (and related projects) that they want to integrate with. 
## Potential Use Cases

From cec34b0fdf6a94e46fd78bf2e09a9fbc9304b6 Mon Sep 17 00:00:00 2001
From: Amy Chen <46451573+amychen1776@users.noreply.github.com>
Date: Wed, 20 Dec 2023 15:59:18 -0500
Subject: [PATCH 052/143] Update website/blog/2023-12-20-partner-integration-guide.md

Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com>
---
 website/blog/2023-12-20-partner-integration-guide.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md
index a01430e38b8..1fac6ab730f 100644
--- a/website/blog/2023-12-20-partner-integration-guide.md
+++ b/website/blog/2023-12-20-partner-integration-guide.md
@@ -52,7 +52,7 @@ For authentication, we highly recommend that your integration uses account servi
 ## Potential Use Cases

 - Event-based orchestration
-  - **Desired Action:** You wish to receive information that a scheduled dbt Cloud Job has been completed or kick off a dbt Cloud job. You can align your product schedule to the dbt Cloud run schedule.
+  - **Desired action** — You want to receive information that a scheduled dbt Cloud job has been completed, or you want to kick off a dbt Cloud job. You can align your product schedule to the dbt Cloud run schedule.
   - **Examples:** Kicking off a dbt Job after the ETL job of extracting and loading the data is completed. Or receiving a webhook after the job has been completed to kick off your reverse ETL job.
- **Integration Points:** Webhooks and/or Admin API - dbt Lineage From 71db8f23bade59a08184a0b6a9d43b6b2604b267 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:59:29 -0500 Subject: [PATCH 053/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 1fac6ab730f..d39876f5de0 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -54,7 +54,7 @@ For authentication, we highly recommend that your integration uses account servi - Event-based orchestration - **Desired action** — You want to receive information that a scheduled dbt Cloud job has been completed or has kicked off a dbt Cloud job. You can align your product schedule to the dbt Cloud run schedule. - **Examples:** Kicking off a dbt Job after the ETL job of extracting and loading the data is completed. Or receiving a webhook after the job has been completed to kick off your reverse ETL job. - - **Integration Points:** Webhooks and/or Admin API + - **Integration points** — Webhooks and/or Admin API - dbt Lineage - **Desired Action:** You wish to interpolate the dbt lineage metadata into your tool. - **Example: In your tool, you wish to pull in the dbt DAG into your lineage diagram. 
[This is what you could pull and how to do this.](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-use-cases-and-examples#whats-the-full-data-lineage)** From 1f17b84f1e98a47c4215343bbadfa324b488de80 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:59:41 -0500 Subject: [PATCH 054/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index d39876f5de0..e74ed030c19 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -55,7 +55,7 @@ For authentication, we highly recommend that your integration uses account servi - **Desired action** — You want to receive information that a scheduled dbt Cloud job has been completed or has kicked off a dbt Cloud job. You can align your product schedule to the dbt Cloud run schedule. - **Examples:** Kicking off a dbt Job after the ETL job of extracting and loading the data is completed. Or receiving a webhook after the job has been completed to kick off your reverse ETL job. - **Integration points** — Webhooks and/or Admin API -- dbt Lineage +- dbt lineage - **Desired Action:** You wish to interpolate the dbt lineage metadata into your tool. - **Example: In your tool, you wish to pull in the dbt DAG into your lineage diagram. 
[This is what you could pull and how to do this.](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-use-cases-and-examples#whats-the-full-data-lineage)** - **Integration Points:** Discovery API From f0638c430a7ea9a2150c5a765621397749ac05f9 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:59:53 -0500 Subject: [PATCH 055/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index e74ed030c19..b499373ca4f 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -56,7 +56,7 @@ For authentication, we highly recommend that your integration uses account servi - **Examples:** Kicking off a dbt Job after the ETL job of extracting and loading the data is completed. Or receiving a webhook after the job has been completed to kick off your reverse ETL job. - **Integration points** — Webhooks and/or Admin API - dbt lineage - - **Desired Action:** You wish to interpolate the dbt lineage metadata into your tool. + - **Desired action** — You want to interpolate the dbt lineage metadata into your tool. - **Example: In your tool, you wish to pull in the dbt DAG into your lineage diagram. 
[This is what you could pull and how to do this.](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-use-cases-and-examples#whats-the-full-data-lineage)** - **Integration Points:** Discovery API - dbt Environment/Job metadata From 50f4b3c800eb5d7c2c71783c53073c8299abccc7 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:00:09 -0500 Subject: [PATCH 056/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index b499373ca4f..bd9649866ae 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -53,7 +53,7 @@ For authentication, we highly recommend that your integration uses account servi - Event-based orchestration - **Desired action** — You want to receive information that a scheduled dbt Cloud job has been completed or has kicked off a dbt Cloud job. You can align your product schedule to the dbt Cloud run schedule. - - **Examples:** Kicking off a dbt Job after the ETL job of extracting and loading the data is completed. Or receiving a webhook after the job has been completed to kick off your reverse ETL job. + - **Examples** — Kicking off a dbt job after the ETL job of extracting and loading the data is completed. Or receiving a webhook after the job has been completed to kick off your reverse ETL job. - **Integration points** — Webhooks and/or Admin API - dbt lineage - **Desired action** — You want to interpolate the dbt lineage metadata into your tool. 
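The event-based orchestration flow covered in the patches above (an ETL load finishes, then a dbt Cloud job is kicked off through the Admin API) can be sketched in a few lines of Python. This is a hedged sketch, not a definitive client: the host, account ID, job ID, and token are placeholders, and the v2 `jobs/{id}/run/` path reflects the Admin API docs at the time of writing, so confirm it against the current reference. The function only assembles the request, which keeps it easy to inspect before wiring in an HTTP client.

```python
import json

def build_trigger_job_request(host: str, account_id: int, job_id: int,
                              service_token: str, cause: str) -> dict:
    """Assemble the HTTP request that asks the dbt Cloud Admin API to start a job run.

    Returning a plain dict (instead of sending the request here) makes the
    request shape easy to inspect and unit test.
    """
    return {
        "method": "POST",
        # Endpoint path per Admin API v2 at the time of writing; verify first.
        "url": f"{host}/api/v2/accounts/{account_id}/jobs/{job_id}/run/",
        "headers": {
            # Account service token, as recommended over user bearer tokens.
            "Authorization": f"Token {service_token}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({"cause": cause}),
    }

if __name__ == "__main__":
    # Hypothetical IDs; use the customer's own access URL in practice.
    req = build_trigger_job_request(
        "https://cloud.getdbt.com", 1234, 5678, "svc-token", "ETL load finished"
    )
    print(req["url"])  # https://cloud.getdbt.com/api/v2/accounts/1234/jobs/5678/run/
```

Any HTTP client can then send it, for example `requests.request(req["method"], req["url"], headers=req["headers"], data=req["body"])`.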
From f235d7278e78b39c9c2459a420f6f53381967c4e Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:00:22 -0500 Subject: [PATCH 057/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index bd9649866ae..50562816369 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -49,7 +49,7 @@ If the customer is on an Azure single tenant instance, they don't currently have For authentication, we highly recommend that your integration uses account service tokens. You can read more about [how to create a service token and what permission sets to provide it](/docs/dbt-cloud-apis/service-tokens). Please note that depending on their plan type, they'll have access to different permission sets. We _do not_ recommend that users supply their user bearer tokens for authentication. This can cause issues if the user leaves the organization and provides you access to all the dbt Cloud accounts associated to the user rather than just the account (and related projects) that they want to integrate with. -## Potential Use Cases +## Potential use cases - Event-based orchestration - **Desired action** — You want to receive information that a scheduled dbt Cloud job has been completed or has kicked off a dbt Cloud job. You can align your product schedule to the dbt Cloud run schedule. 
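Going the other way, an integration that receives a dbt Cloud webhook when a job completes should verify the payload before kicking off downstream work such as a reverse ETL job. The sketch below assumes the signature is an HMAC-SHA256 hex digest of the raw request body, computed with the webhook's secret, which is the scheme the webhooks documentation describes; confirm the exact header the digest arrives in against that documentation.

```python
import hashlib
import hmac

def is_valid_dbt_cloud_webhook(raw_body: bytes, received_signature: str,
                               secret: str) -> bool:
    """Recompute the HMAC-SHA256 hex digest of the raw request body and
    compare it to the received signature in constant time."""
    expected = hmac.new(secret.encode("utf-8"), raw_body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, received_signature)
```

In a web framework handler this runs against the raw (unparsed) body; only when it returns `True` should the event be acted on.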
From d9a9b6c2db809a8c168c30288caed3b8947ca9a4 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:00:52 -0500 Subject: [PATCH 058/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 1 - 1 file changed, 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 50562816369..2be52eee2dc 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -35,7 +35,6 @@ If you require a partner dbt Cloud account to test on, we can upgrade an existin - **Overview** — This REST API allows you to orchestrate dbt Cloud jobs runs and help you administer a dbt Cloud account. For metadata retrieval, we recommend integrating with the Discovery API instead. - [Webhooks](/docs/deploy/webhooks) - **Overview** — Outbound webhooks can send notifications about your dbt Cloud jobs to other systems. These webhooks allow you to get the latest information about your dbt jobs in real time. - - [Link to documentation](https://docs.getdbt.com/docs/deploy/webhooks) - [Semantic Layers/Metrics](/docs/dbt-cloud-apis/sl-api-overview) - **Overview** — Our Semantic Layer is made up of two parts: metrics definitions and the ability to interactively query the dbt metrics. For more details, here is a [basic overview](/docs/use-dbt-semantic-layer/dbt-sl) and [our best practices](/guides/dbt-ecosystem/sl-partner-integration-guide). - Metrics definitions can be pulled from the Discovery API (linked above) or the Semantic Layer Driver/GraphQL API. The key difference is that the Discovery API isn't able to pull the semantic graph, which provides the list of available dimensions that one can query per metric. That is only available with the SL Driver/APIs. 
The trade-off is that the SL Driver/APIs doesn't have access to the lineage of the entire dbt project (that is, how the dbt metrics depend on dbt models).

From e64fede56647042d7dd0e1d3b57a102ad56da35c Mon Sep 17 00:00:00 2001
From: Amy Chen <46451573+amychen1776@users.noreply.github.com>
Date: Wed, 20 Dec 2023 16:01:22 -0500
Subject: [PATCH 059/143] Update
 website/blog/2023-12-20-partner-integration-guide.md

Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com>
---
 website/blog/2023-12-20-partner-integration-guide.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md
index 2be52eee2dc..5440a5d9f23 100644
--- a/website/blog/2023-12-20-partner-integration-guide.md
+++ b/website/blog/2023-12-20-partner-integration-guide.md
@@ -56,7 +56,7 @@ For authentication, we highly recommend that your integration uses account servi
    - **Integration points** — Webhooks and/or Admin API
 - dbt lineage
   - **Desired action** — You want to interpolate the dbt lineage metadata into your tool.
-  - **Example: In your tool, you wish to pull in the dbt DAG into your lineage diagram. [This is what you could pull and how to do this.](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-use-cases-and-examples#whats-the-full-data-lineage)**
+  - **Example** — In your tool, you want to pull in the dbt DAG into your lineage diagram. For details on what you could pull and how to do this, refer to [Use cases and examples for the Discovery API](/docs/dbt-cloud-apis/discovery-use-cases-and-examples).
   - **Integration Points:** Discovery API
 - dbt Environment/Job metadata
   - **Desired Action:** You wish to interpolate dbt Cloud job information into your tool, including the status of the jobs, the status of the tables executed in the run, what tests passed, etc.
   - **Example:** In your Business Intelligence tool, stakeholders select from tables that a dbt model created. You show the last time the model passed its tests/last run to show that the tables are current and can be trusted.
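The lineage and metadata use cases edited in the patches above all funnel through the same integration pattern: a service-token-authenticated GraphQL call to the Discovery API. As a rough sketch of that pattern, the endpoint URL, header shape, and query fields below are assumptions drawn from the public Discovery API docs rather than anything stated in this patch series:

```python
import json
from urllib import request

DISCOVERY_URL = "https://metadata.cloud.getdbt.com/graphql"  # assumed endpoint

# Hypothetical query for the latest state of models in an environment;
# field names are assumptions to check against the Discovery API schema.
QUERY = """
query Models($environmentId: BigInt!) {
  environment(id: $environmentId) {
    applied {
      models(first: 10) {
        edges { node { name executionInfo { lastRunStatus } } }
      }
    }
  }
}
"""

def build_request(service_token: str, environment_id: int) -> request.Request:
    """Assemble the authenticated POST without sending it."""
    payload = json.dumps(
        {"query": QUERY, "variables": {"environmentId": environment_id}}
    ).encode()
    return request.Request(
        DISCOVERY_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {service_token}",  # account service token
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("dbtc_xxx", 1234)
print(req.get_header("Authorization"))  # Bearer dbtc_xxx
```

Sending the request with `urllib.request.urlopen(req)` (or any HTTP client) returns the JSON the tool would interpolate into its lineage or freshness display.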
From c6bae5ec52d0e38a0611e9bde3fa921845f2bc2c Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:01:33 -0500 Subject: [PATCH 060/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 5440a5d9f23..2ac518fda6e 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -58,7 +58,7 @@ For authentication, we highly recommend that your integration uses account servi - **Desired action** — You want to interpolate the dbt lineage metadata into your tool. - **Example** — In your tool, you want to pull in the dbt DAG into your lineage diagram. For details on what you could pull and how to do this, refer to [Use cases and examples for the Discovery API](/docs/dbt-cloud-apis/discovery-use-cases-and-examples). - **Integration Points:** Discovery API -- dbt Environment/Job metadata +- dbt environment/job metadata - **Desired Action:** You wish to interpolate dbt Cloud job information into your tool, including the status of the jobs, the status of the tables executed in the run, what tests passed, etc. - **Example:** In your Business Intelligence tool, stakeholders select from tables that a dbt model created. You show the last time the model passed its tests/last run to show that the tables are current and can be trusted. 
[This is what you could pull and how to do this.](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-use-cases-and-examples#whats-the-latest-state-of-each-model) - **Integration Points:** Discovery API From ff218889307cb820046359b0b27b70f06303545e Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:01:56 -0500 Subject: [PATCH 061/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 2ac518fda6e..d89d32381bf 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -57,7 +57,7 @@ For authentication, we highly recommend that your integration uses account servi - dbt lineage - **Desired action** — You want to interpolate the dbt lineage metadata into your tool. - **Example** — In your tool, you want to pull in the dbt DAG into your lineage diagram. For details on what you could pull and how to do this, refer to [Use cases and examples for the Discovery API](/docs/dbt-cloud-apis/discovery-use-cases-and-examples). - - **Integration Points:** Discovery API + - **Integration points** — Discovery API - dbt environment/job metadata - **Desired Action:** You wish to interpolate dbt Cloud job information into your tool, including the status of the jobs, the status of the tables executed in the run, what tests passed, etc. - **Example:** In your Business Intelligence tool, stakeholders select from tables that a dbt model created. You show the last time the model passed its tests/last run to show that the tables are current and can be trusted. 
[This is what you could pull and how to do this.](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-use-cases-and-examples#whats-the-latest-state-of-each-model) From 6af40a3884fce985487fdca5642b2c346178aaef Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:07:02 -0500 Subject: [PATCH 062/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index d89d32381bf..b938dca050c 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -93,7 +93,7 @@ For authentication, we highly recommend that your integration uses account servi - Contact your Partner Manager with your account id (in your URL) - Why should I not use the Admin API to pull out the dbt artifacts for metadata? - We recommend not integrating with the Admin API to extract the dbt artifacts documentation. This is because the Discovery API provides more extensive information, a user-friendly structure and more reliable integration point. -- How do I get access to the dbt Brand assets? +- How do I get access to the dbt brand assets? - Check out this [page](https://www.getdbt.com/brand-guidelines/). Please make sure you’re not using our old logo(hint: there should only be one hole in the logo). Please also note that the name dbt and the dbt logo are trademarked by dbt Labs, and that use is governed by our brand guidelines - which are fairly specific for commercial uses. If you have any questions about proper use of our marks, please ask for your partner manager. - How do I engage with the partnerships team? - Email partnerships@dbtlabs.com. 
\ No newline at end of file From 3c91a599841f2e7c1bf8efdd3ba9ba5e05a408c2 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:07:10 -0500 Subject: [PATCH 063/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index b938dca050c..27541fa232b 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -94,6 +94,6 @@ For authentication, we highly recommend that your integration uses account servi - Why should I not use the Admin API to pull out the dbt artifacts for metadata? - We recommend not integrating with the Admin API to extract the dbt artifacts documentation. This is because the Discovery API provides more extensive information, a user-friendly structure and more reliable integration point. - How do I get access to the dbt brand assets? - - Check out this [page](https://www.getdbt.com/brand-guidelines/). Please make sure you’re not using our old logo(hint: there should only be one hole in the logo). Please also note that the name dbt and the dbt logo are trademarked by dbt Labs, and that use is governed by our brand guidelines - which are fairly specific for commercial uses. If you have any questions about proper use of our marks, please ask for your partner manager. + - Check out our [Brand guidelines](https://www.getdbt.com/brand-guidelines/) page. Please make sure you’re not using our old logo (hint: there should only be one hole in the logo). Please also note that the name dbt and the dbt logo are trademarked by dbt Labs, and that use is governed by our brand guidelines, which are fairly specific for commercial uses. 
If you have any questions about proper use of our marks, please ask your partner manager. - How do I engage with the partnerships team? - Email partnerships@dbtlabs.com. \ No newline at end of file From b8d54492535a21217827ae763b38ed6a341a52ee Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:07:17 -0500 Subject: [PATCH 064/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 27541fa232b..e28b9ccf9d1 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -92,7 +92,7 @@ For authentication, we highly recommend that your integration uses account servi - How do I get a Partner Account? - Contact your Partner Manager with your account id (in your URL) - Why should I not use the Admin API to pull out the dbt artifacts for metadata? - - We recommend not integrating with the Admin API to extract the dbt artifacts documentation. This is because the Discovery API provides more extensive information, a user-friendly structure and more reliable integration point. + - We recommend not integrating with the Admin API to extract the dbt artifacts documentation. This is because the Discovery API provides more extensive information, a user-friendly structure, and a more reliable integration point. - How do I get access to the dbt brand assets? - Check out our [Brand guidelines](https://www.getdbt.com/brand-guidelines/) page. Please make sure you’re not using our old logo (hint: there should only be one hole in the logo). 
Please also note that the name dbt and the dbt logo are trademarked by dbt Labs, and that use is governed by our brand guidelines, which are fairly specific for commercial uses. If you have any questions about proper use of our marks, please ask your partner manager. - How do I engage with the partnerships team? From 298cd464fb60f6a93bc5525dd47a9e73c67125be Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:07:22 -0500 Subject: [PATCH 065/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index e28b9ccf9d1..fd269137235 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -90,7 +90,7 @@ For authentication, we highly recommend that your integration uses account servi - Can you provide column-level information in the lineage? - Column-level lineage is currently in beta release with more information to come. - How do I get a Partner Account? - - Contact your Partner Manager with your account id (in your URL) + - Contact your Partner Manager with your account ID (in your URL). - Why should I not use the Admin API to pull out the dbt artifacts for metadata? - We recommend not integrating with the Admin API to extract the dbt artifacts documentation. This is because the Discovery API provides more extensive information, a user-friendly structure, and a more reliable integration point. - How do I get access to the dbt brand assets? 
From 1635cecbb0c2dc1cc4011896cee03aa3d2cd0a6b Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:07:27 -0500 Subject: [PATCH 066/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index fd269137235..7c4cdef78c5 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -91,7 +91,7 @@ For authentication, we highly recommend that your integration uses account servi - Column-level lineage is currently in beta release with more information to come. - How do I get a Partner Account? - Contact your Partner Manager with your account ID (in your URL). -- Why should I not use the Admin API to pull out the dbt artifacts for metadata? +- Why shouldn't I use the Admin API to pull out the dbt artifacts for metadata? - We recommend not integrating with the Admin API to extract the dbt artifacts documentation. This is because the Discovery API provides more extensive information, a user-friendly structure, and a more reliable integration point. - How do I get access to the dbt brand assets? - Check out our [Brand guidelines](https://www.getdbt.com/brand-guidelines/) page. Please make sure you’re not using our old logo (hint: there should only be one hole in the logo). Please also note that the name dbt and the dbt logo are trademarked by dbt Labs, and that use is governed by our brand guidelines, which are fairly specific for commercial uses. If you have any questions about proper use of our marks, please ask your partner manager. 
From 2f2666897b966fcd909d6ccabe6ae1f99260c391 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:07:43 -0500 Subject: [PATCH 067/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 7c4cdef78c5..9869ffd5bda 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -60,7 +60,7 @@ For authentication, we highly recommend that your integration uses account servi - **Integration points** — Discovery API - dbt environment/job metadata - **Desired Action:** You wish to interpolate dbt Cloud job information into your tool, including the status of the jobs, the status of the tables executed in the run, what tests passed, etc. - - **Example:** In your Business Intelligence tool, stakeholders select from tables that a dbt model created. You show the last time the model passed its tests/last run to show that the tables are current and can be trusted. [This is what you could pull and how to do this.](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-use-cases-and-examples#whats-the-latest-state-of-each-model) + - **Example** — In your Business Intelligence tool, stakeholders select from tables that a dbt model created. You show the last time the model passed its tests/last run to show that the tables are current and can be trusted. For details on what you could pull and how to do this, refer to [What's the latest state of each model](/docs/dbt-cloud-apis/discovery-use-cases-and-examples#whats-the-latest-state-of-each-model). 
- **Integration Points:** Discovery API - dbt Model Documentation - **Desired Action:** You wish to interpolate dbt Project Information, including model descriptions, column descriptions, etc. From eefe9e0052890abf23e561212020cfdd4df457f2 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:07:50 -0500 Subject: [PATCH 068/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 9869ffd5bda..ec6a6e7c365 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -61,7 +61,7 @@ For authentication, we highly recommend that your integration uses account servi - dbt environment/job metadata - **Desired Action:** You wish to interpolate dbt Cloud job information into your tool, including the status of the jobs, the status of the tables executed in the run, what tests passed, etc. - **Example** — In your Business Intelligence tool, stakeholders select from tables that a dbt model created. You show the last time the model passed its tests/last run to show that the tables are current and can be trusted. For details on what you could pull and how to do this, refer to [What's the latest state of each model](/docs/dbt-cloud-apis/discovery-use-cases-and-examples#whats-the-latest-state-of-each-model). - - **Integration Points:** Discovery API + - **Integration points** — Discovery API - dbt Model Documentation - **Desired Action:** You wish to interpolate dbt Project Information, including model descriptions, column descriptions, etc. 
- **Example:** You want to extract out the dbt model description so that you can display and help the stakeholder understand what they are selecting from. This way, the creators can easily pass on the information without updating another system. [This is what you could pull and how to do this.](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-use-cases-and-examples#what-does-this-dataset-and-its-columns-mean) From c521160cee9e779f8aec9028db7c24a652fb955d Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:07:57 -0500 Subject: [PATCH 069/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index ec6a6e7c365..ff39616b5fb 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -59,7 +59,7 @@ For authentication, we highly recommend that your integration uses account servi - **Example** — In your tool, you want to pull in the dbt DAG into your lineage diagram. For details on what you could pull and how to do this, refer to [Use cases and examples for the Discovery API](/docs/dbt-cloud-apis/discovery-use-cases-and-examples). - **Integration points** — Discovery API - dbt environment/job metadata - - **Desired Action:** You wish to interpolate dbt Cloud job information into your tool, including the status of the jobs, the status of the tables executed in the run, what tests passed, etc. + - **Desired action** — You want to interpolate the dbt Cloud job information into your tool, including the status of the jobs, the status of the tables executed in the run, what tests passed, etc. 
- **Example** — In your Business Intelligence tool, stakeholders select from tables that a dbt model created. You show the last time the model passed its tests/last run to show that the tables are current and can be trusted. For details on what you could pull and how to do this, refer to [What's the latest state of each model](/docs/dbt-cloud-apis/discovery-use-cases-and-examples#whats-the-latest-state-of-each-model). - **Integration points** — Discovery API - dbt Model Documentation From ed73c02048c5b075c99338257573551be369be0d Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:08:03 -0500 Subject: [PATCH 070/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index ff39616b5fb..9fe53141dc5 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -63,7 +63,7 @@ For authentication, we highly recommend that your integration uses account servi - **Example** — In your Business Intelligence tool, stakeholders select from tables that a dbt model created. You show the last time the model passed its tests/last run to show that the tables are current and can be trusted. For details on what you could pull and how to do this, refer to [What's the latest state of each model](/docs/dbt-cloud-apis/discovery-use-cases-and-examples#whats-the-latest-state-of-each-model). - **Integration points** — Discovery API - dbt Model Documentation - - **Desired Action:** You wish to interpolate dbt Project Information, including model descriptions, column descriptions, etc. 
+  - **Desired action** — You want to interpolate the dbt project information, including model descriptions, column descriptions, etc.
   - **Example:** You want to extract out the dbt model description so that you can display and help the stakeholder understand what they are selecting from. This way, the creators can easily pass on the information without updating another system. [This is what you could pull and how to do this.](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-use-cases-and-examples#what-does-this-dataset-and-its-columns-mean)
   - **Integration Points:** Discovery API

From f9d6b1e611cb6f33b4a73af9092c60b71ea71afb Mon Sep 17 00:00:00 2001
From: Amy Chen <46451573+amychen1776@users.noreply.github.com>
Date: Wed, 20 Dec 2023 16:08:14 -0500
Subject: [PATCH 071/143] Update
 website/blog/2023-12-20-partner-integration-guide.md

Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com>
---
 website/blog/2023-12-20-partner-integration-guide.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md
index 9fe53141dc5..2b482eca245 100644
--- a/website/blog/2023-12-20-partner-integration-guide.md
+++ b/website/blog/2023-12-20-partner-integration-guide.md
@@ -67,7 +67,7 @@ For authentication, we highly recommend that your integration uses account servi
   - **Example:** You want to extract out the dbt model description so that you can display and help the stakeholder understand what they are selecting from. This way, the creators can easily pass on the information without updating another system.
[This is what you could pull and how to do this.](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-use-cases-and-examples#what-does-this-dataset-and-its-columns-mean) - **Integration Points:** Discovery API -**dbt Core only users will have no access to the above integration points.** For dbt metadata, oftentimes our partners will create a dbt core integration by using the [dbt artifacts](https://www.getdbt.com/product/semantic-layer/) files generated by each run and provided by the user. With our Discovery API, we are providing a dynamic way to get the latest up to date information, parsed out for you. +dbt Core only users will have no access to the above integration points. For dbt metadata, oftentimes our partners will create a dbt Core integration by using the [dbt artifact](https://www.getdbt.com/product/semantic-layer/) files generated by each run and provided by the user. With the Discovery API, we are providing a dynamic way to get the latest information parsed out for you. ## dbt Cloud Plans & Permissions From fbc844d651b710c4376795a13340ce3f5ddafdda Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:08:20 -0500 Subject: [PATCH 072/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 2b482eca245..ab03edd69f7 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -73,7 +73,7 @@ dbt Core only users will have no access to the above integration points. For dbt [The dbt Cloud plan type](https://www.getdbt.com/pricing) will change what the user has access to. 
There are four different types of plans: -- **Developer**: This is free and available to one user with a limited amount of successful models built. This plan cannot access the APIs, Webhooks, or Semantic Layer. Limited to 1 project. +- **Developer** — This is free and available to one user with a limited amount of successful models built. This plan can't access the APIs, Webhooks, or Semantic Layer and is limited to just one project. - **Team:** This plan has access to the APIs, Webhooks, and Semantic Layer. You may have up to 8 users on the account and one dbt Cloud Project. This is limited to 15,000 successful models built. - **Enterprise** (Multi-tenant/Multi-cell): This plan has access to the APIs, Webhooks, and Semantic Layer. They may have more than one dbt Cloud Project based on how many dbt projects/domains they have using dbt. Majority of our enterprise customers are on multi-tenant dbt Cloud instances. - **Enterprise** (Single-tenant): This plan may have access to the APIs, Webhooks, and Semantic Layer. If you are working with a specific customer, let us know, and we can confirm if their instance has access. 
From f2ccece9f687926159d0a4797b9fe905095ead58 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:08:35 -0500 Subject: [PATCH 073/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index ab03edd69f7..dd5ae6bad35 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -69,7 +69,7 @@ For authentication, we highly recommend that your integration uses account servi dbt Core only users will have no access to the above integration points. For dbt metadata, oftentimes our partners will create a dbt Core integration by using the [dbt artifact](https://www.getdbt.com/product/semantic-layer/) files generated by each run and provided by the user. With the Discovery API, we are providing a dynamic way to get the latest information parsed out for you. -## dbt Cloud Plans & Permissions +## dbt Cloud plans & permissions [The dbt Cloud plan type](https://www.getdbt.com/pricing) will change what the user has access to. 
There are four different types of plans:

From dbb3924f965f4f5343d9a70e1ce55f76e59b6f1b Mon Sep 17 00:00:00 2001
From: Amy Chen <46451573+amychen1776@users.noreply.github.com>
Date: Wed, 20 Dec 2023 16:08:42 -0500
Subject: [PATCH 074/143] Update
 website/blog/2023-12-20-partner-integration-guide.md

Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com>
---
 website/blog/2023-12-20-partner-integration-guide.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md
index dd5ae6bad35..666ecd53f8d 100644
--- a/website/blog/2023-12-20-partner-integration-guide.md
+++ b/website/blog/2023-12-20-partner-integration-guide.md
@@ -75,7 +75,7 @@ dbt Core only users will have no access to the above integration points. For dbt
 
 - **Developer** — This is free and available to one user with a limited amount of successful models built. This plan can't access the APIs, Webhooks, or Semantic Layer and is limited to just one project.
 - **Team:** This plan has access to the APIs, Webhooks, and Semantic Layer. You may have up to 8 users on the account and one dbt Cloud Project. This is limited to 15,000 successful models built.
-- **Enterprise** (Multi-tenant/Multi-cell): This plan has access to the APIs, Webhooks, and Semantic Layer. They may have more than one dbt Cloud Project based on how many dbt projects/domains they have using dbt. Majority of our enterprise customers are on multi-tenant dbt Cloud instances.
+- **Enterprise** (multi-tenant/multi-cell) — This plan provides access to the APIs, webhooks, and Semantic Layer. You can have more than one dbt Cloud project based on how many dbt projects/domains you have using dbt. The majority of our enterprise customers are on multi-tenant dbt Cloud instances.
 - **Enterprise** (Single-tenant): This plan may have access to the APIs, Webhooks, and Semantic Layer.
If you are working with a specific customer, let us know, and we can confirm if their instance has access. ## Frequently Asked Questions From 4358a4d752b06c8ec8bdebb9545fece1934101fc Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:08:49 -0500 Subject: [PATCH 075/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 666ecd53f8d..fd1c4a9a6f8 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -80,7 +80,7 @@ dbt Core only users will have no access to the above integration points. For dbt ## Frequently Asked Questions -- What is a dbt Cloud Project? +- What is a dbt Cloud project? - A dbt Cloud project is made up of two connections: one to the git repository and one to the data warehouse/platform. Most customers will have only one dbt Cloud Project in their account but there are enterprise clients who might have more depending on their use cases.The project also encapsulates two types of environments at minimal: a development environment and deployment environment. - Oftentimes folks refer to the [dbt Project](https://docs.getdbt.com/docs/build/projects) as the code hosted in their git repository. - What is a dbt Cloud Environment? 
From 5748cbe3d11f7c0129761c35b98bbeafa4f053d6 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:09:18 -0500 Subject: [PATCH 076/143] Update website/blog/2023-12-20-partner-integration-guide.md --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index fd1c4a9a6f8..561d030385f 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -78,7 +78,7 @@ dbt Core only users will have no access to the above integration points. For dbt - **Enterprise** (multi-tenant/multi-cell) — This plan provides access to the APIs, webhooks, and Semantic Layer. You can have more than one dbt Cloud project based on how many dbt projects/domains they have using dbt. The majority of our enterprise customers are on multi-tenant dbt Cloud instances. - **Enterprise** (Single-tenant): This plan may have access to the APIs, Webhooks, and Semantic Layer. If you are working with a specific customer, let us know, and we can confirm if their instance has access. -## Frequently Asked Questions +## FAQs - What is a dbt Cloud project? - A dbt Cloud project is made up of two connections: one to the git repository and one to the data warehouse/platform. Most customers will have only one dbt Cloud Project in their account but there are enterprise clients who might have more depending on their use cases.The project also encapsulates two types of environments at minimal: a development environment and deployment environment. 
From c64ddc589fb98bd97cb8f756a8c81f6c2dabf93a Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:09:27 -0500 Subject: [PATCH 077/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 561d030385f..09b488a1b77 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -81,7 +81,7 @@ dbt Core only users will have no access to the above integration points. For dbt ## FAQs - What is a dbt Cloud project? - - A dbt Cloud project is made up of two connections: one to the git repository and one to the data warehouse/platform. Most customers will have only one dbt Cloud Project in their account but there are enterprise clients who might have more depending on their use cases.The project also encapsulates two types of environments at minimal: a development environment and deployment environment. + - A dbt Cloud project is made up of two connections: one to the Git repository and one to the data warehouse/platform. Most customers will have only one dbt Cloud project in their account but there are enterprise clients who might have more depending on their use cases. The project also encapsulates two types of environments at minimal: a development environment and deployment environment. - Oftentimes folks refer to the [dbt Project](https://docs.getdbt.com/docs/build/projects) as the code hosted in their git repository. - What is a dbt Cloud Environment? 
- [For an overview, check out this documentation.](https://docs.getdbt.com/docs/environments-in-dbt) At minimal an project will have one deployment type environment that they will be executing jobs on. The development environment powers the dbt Cloud IDE and Cloud CLI. From 3cf7408d6850b5d5ef76fd982dd89863d22f456e Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:09:33 -0500 Subject: [PATCH 078/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 09b488a1b77..116b966ab26 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -83,7 +83,7 @@ dbt Core only users will have no access to the above integration points. For dbt - What is a dbt Cloud project? - A dbt Cloud project is made up of two connections: one to the Git repository and one to the data warehouse/platform. Most customers will have only one dbt Cloud project in their account but there are enterprise clients who might have more depending on their use cases. The project also encapsulates two types of environments at minimal: a development environment and deployment environment. - Oftentimes folks refer to the [dbt Project](https://docs.getdbt.com/docs/build/projects) as the code hosted in their git repository. -- What is a dbt Cloud Environment? +- What is a dbt Cloud environment? - [For an overview, check out this documentation.](https://docs.getdbt.com/docs/environments-in-dbt) At minimal an project will have one deployment type environment that they will be executing jobs on. The development environment powers the dbt Cloud IDE and Cloud CLI. 
- Can we write back to the dbt project? - At this moment, we do not have a Write API. A dbt project is hosted in a git repository, so if you have a git provider integration, you can manually open up a Pull Request on the project to maintain the version control process. From 1b766c9071a8bb40a7e0d316fe8a867c4b14c308 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:09:41 -0500 Subject: [PATCH 079/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 116b966ab26..2dd7b6765e1 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -86,7 +86,7 @@ dbt Core only users will have no access to the above integration points. For dbt - What is a dbt Cloud environment? - [For an overview, check out this documentation.](https://docs.getdbt.com/docs/environments-in-dbt) At minimal an project will have one deployment type environment that they will be executing jobs on. The development environment powers the dbt Cloud IDE and Cloud CLI. - Can we write back to the dbt project? - - At this moment, we do not have a Write API. A dbt project is hosted in a git repository, so if you have a git provider integration, you can manually open up a Pull Request on the project to maintain the version control process. + - At this moment, we don't have a Write API. A dbt project is hosted in a Git repository, so if you have a Git provider integration, you can manually open a pull request (PR) on the project to maintain the version control process. - Can you provide column-level information in the lineage? 
- Column-level lineage is currently in beta release with more information to come. - How do I get a Partner Account? From ea0ae3caab2db33a7bcb2a0a4bbcb6f980b7f28d Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:09:48 -0500 Subject: [PATCH 080/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 2dd7b6765e1..1e9e41c4993 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -84,7 +84,7 @@ dbt Core only users will have no access to the above integration points. For dbt - A dbt Cloud project is made up of two connections: one to the Git repository and one to the data warehouse/platform. Most customers will have only one dbt Cloud project in their account but there are enterprise clients who might have more depending on their use cases. The project also encapsulates two types of environments at minimal: a development environment and deployment environment. - Oftentimes folks refer to the [dbt Project](https://docs.getdbt.com/docs/build/projects) as the code hosted in their git repository. - What is a dbt Cloud environment? - - [For an overview, check out this documentation.](https://docs.getdbt.com/docs/environments-in-dbt) At minimal an project will have one deployment type environment that they will be executing jobs on. The development environment powers the dbt Cloud IDE and Cloud CLI. + - For an overview, check out [About environments](https://docs.getdbt.com/docs/environments-in-dbt). At a minimum, a project will have one deployment type environment that they will be executing jobs on. 
The development environment powers the dbt Cloud IDE and Cloud CLI. - Can we write back to the dbt project? - At this moment, we don't have a Write API. A dbt project is hosted in a Git repository, so if you have a Git provider integration, you can manually open a pull request (PR) on the project to maintain the version control process. - Can you provide column-level information in the lineage? From 872c676f423ebb9338c8f4839059878172e5e1ec Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:09:56 -0500 Subject: [PATCH 081/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 1e9e41c4993..a145bd71744 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -62,7 +62,7 @@ For authentication, we highly recommend that your integration uses account servi - **Desired action** — You want to interpolate the dbt Cloud job information into your tool, including the status of the jobs, the status of the tables executed in the run, what tests passed, etc. - **Example** — In your Business Intelligence tool, stakeholders select from tables that a dbt model created. You show the last time the model passed its tests/last run to show that the tables are current and can be trusted. For details on what you could pull and how to do this, refer to [What's the latest state of each model](/docs/dbt-cloud-apis/discovery-use-cases-and-examples#whats-the-latest-state-of-each-model). 
- **Integration points** — Discovery API -- dbt Model Documentation +- dbt model documentation - **Desired action** — You want to interpolate the dbt project Information, including model descriptions, column descriptions, etc. - **Example:** You want to extract out the dbt model description so that you can display and help the stakeholder understand what they are selecting from. This way, the creators can easily pass on the information without updating another system. [This is what you could pull and how to do this.](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-use-cases-and-examples#what-does-this-dataset-and-its-columns-mean) - **Integration Points:** Discovery API From f177a6dc7ff8624dde7e68fdf8960203651b9e58 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:10:01 -0500 Subject: [PATCH 082/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index a145bd71744..859edcd0a3f 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -65,7 +65,7 @@ For authentication, we highly recommend that your integration uses account servi - dbt model documentation - **Desired action** — You want to interpolate the dbt project Information, including model descriptions, column descriptions, etc. - **Example:** You want to extract out the dbt model description so that you can display and help the stakeholder understand what they are selecting from. This way, the creators can easily pass on the information without updating another system. 
[This is what you could pull and how to do this.](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-use-cases-and-examples#what-does-this-dataset-and-its-columns-mean) - - **Integration Points:** Discovery API + - **Integration points** — Discovery API dbt Core only users will have no access to the above integration points. For dbt metadata, oftentimes our partners will create a dbt Core integration by using the [dbt artifact](https://www.getdbt.com/product/semantic-layer/) files generated by each run and provided by the user. With the Discovery API, we are providing a dynamic way to get the latest information parsed out for you. From 57ac462015dc064b320ba4b96466b85a3d82441a Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:10:11 -0500 Subject: [PATCH 083/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 859edcd0a3f..8823e90b8b7 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -64,7 +64,7 @@ For authentication, we highly recommend that your integration uses account servi - **Integration points** — Discovery API - dbt model documentation - **Desired action** — You want to interpolate the dbt project Information, including model descriptions, column descriptions, etc. - - **Example:** You want to extract out the dbt model description so that you can display and help the stakeholder understand what they are selecting from. This way, the creators can easily pass on the information without updating another system. 
[This is what you could pull and how to do this.](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-use-cases-and-examples#what-does-this-dataset-and-its-columns-mean) + - **Example** — You want to extract the dbt model description so you can display and help the stakeholder understand what they are selecting from. This way, the creators can easily pass on the information without updating another system. For details on what you could pull and how to do this, refer to [What does this dataset and its columns mean](/docs/dbt-cloud-apis/discovery-use-cases-and-examples#what-does-this-dataset-and-its-columns-mean). - **Integration points** — Discovery API dbt Core only users will have no access to the above integration points. For dbt metadata, oftentimes our partners will create a dbt Core integration by using the [dbt artifact](https://www.getdbt.com/product/semantic-layer/) files generated by each run and provided by the user. With the Discovery API, we are providing a dynamic way to get the latest information parsed out for you. From 715394bf4a08b039d62baff45f2f592aaf72bb10 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:10:16 -0500 Subject: [PATCH 084/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 8823e90b8b7..8b198743dbc 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -74,7 +74,7 @@ dbt Core only users will have no access to the above integration points. For dbt [The dbt Cloud plan type](https://www.getdbt.com/pricing) will change what the user has access to. 
There are four different types of plans: - **Developer** — This is free and available to one user with a limited amount of successful models built. This plan can't access the APIs, Webhooks, or Semantic Layer and is limited to just one project. -- **Team:** This plan has access to the APIs, Webhooks, and Semantic Layer. You may have up to 8 users on the account and one dbt Cloud Project. This is limited to 15,000 successful models built. +- **Team** — This plan provides access to the APIs, webhooks, and Semantic Layer. You can have up to eight users on the account and one dbt Cloud Project. This is limited to 15,000 successful models built. - **Enterprise** (multi-tenant/multi-cell) — This plan provides access to the APIs, webhooks, and Semantic Layer. You can have more than one dbt Cloud project based on how many dbt projects/domains they have using dbt. The majority of our enterprise customers are on multi-tenant dbt Cloud instances. - **Enterprise** (Single-tenant): This plan may have access to the APIs, Webhooks, and Semantic Layer. If you are working with a specific customer, let us know, and we can confirm if their instance has access. From 68f3a68afbfc7d9887da3a07c856387a31141c4d Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:10:21 -0500 Subject: [PATCH 085/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 8b198743dbc..1f9d587b0f1 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -76,7 +76,7 @@ dbt Core only users will have no access to the above integration points. 
For dbt - **Developer** — This is free and available to one user with a limited amount of successful models built. This plan can't access the APIs, Webhooks, or Semantic Layer and is limited to just one project. - **Team** — This plan provides access to the APIs, webhooks, and Semantic Layer. You can have up to eight users on the account and one dbt Cloud Project. This is limited to 15,000 successful models built. - **Enterprise** (multi-tenant/multi-cell) — This plan provides access to the APIs, webhooks, and Semantic Layer. You can have more than one dbt Cloud project based on how many dbt projects/domains they have using dbt. The majority of our enterprise customers are on multi-tenant dbt Cloud instances. -- **Enterprise** (Single-tenant): This plan may have access to the APIs, Webhooks, and Semantic Layer. If you are working with a specific customer, let us know, and we can confirm if their instance has access. +- **Enterprise** (single tenant): This plan might have access to the APIs, webhooks, and Semantic Layer. If you're working with a specific customer, let us know and we can confirm if their instance has access. ## FAQs From 509f5a8a4aac7b7a924ad1736de2b4dc34d890f7 Mon Sep 17 00:00:00 2001 From: Amy Chen <46451573+amychen1776@users.noreply.github.com> Date: Wed, 20 Dec 2023 16:10:26 -0500 Subject: [PATCH 086/143] Update website/blog/2023-12-20-partner-integration-guide.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 1f9d587b0f1..22fbbbafbb7 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -82,7 +82,7 @@ dbt Core only users will have no access to the above integration points. For dbt - What is a dbt Cloud project? 
- A dbt Cloud project is made up of two connections: one to the Git repository and one to the data warehouse/platform. Most customers will have only one dbt Cloud project in their account but there are enterprise clients who might have more depending on their use cases. The project also encapsulates two types of environments at minimal: a development environment and deployment environment. - - Oftentimes folks refer to the [dbt Project](https://docs.getdbt.com/docs/build/projects) as the code hosted in their git repository. + - Folks commonly refer to the [dbt project](https://docs.getdbt.com/docs/build/projects) as the code hosted in their Git repository. - What is a dbt Cloud environment? - For an overview, check out [About environments](https://docs.getdbt.com/docs/environments-in-dbt). At a minimum, a project will have one deployment type environment that they will be executing jobs on. The development environment powers the dbt Cloud IDE and Cloud CLI. - Can we write back to the dbt project? From 6732df021c4335faa35ba6cb106b4e6123ba7077 Mon Sep 17 00:00:00 2001 From: Ly Nguyen Date: Wed, 20 Dec 2023 13:22:55 -0800 Subject: [PATCH 087/143] Update repo caching --- website/snippets/_cloud-environments-info.md | 6 ++++++ 1 file changed, 6 insertions(+) diff --git a/website/snippets/_cloud-environments-info.md b/website/snippets/_cloud-environments-info.md index 6e096b83750..6400b29ea9f 100644 --- a/website/snippets/_cloud-environments-info.md +++ b/website/snippets/_cloud-environments-info.md @@ -42,6 +42,12 @@ For improved reliability and performance on your job runs, you can enable dbt Cl dbt Cloud caches your project's Git repo after each successful run and retains it for 8 days if there are no repo updates. It caches all packages regardless of installation method and does not fetch code outside of the job runs. +Below lists the situations when dbt Cloud uses the cached copy: + +- Git authentication fails. +- There are syntax errors in the `packages.yml` file. 
To catch these errors sooner, set up and use [continuous integration (CI)](/docs/deploy/continuous-integration). +- A package is incompatible with the dbt version being used. To catch this incompatibility sooner, set up and use [continuous integration (CI)](/docs/deploy/continuous-integration). + To enable Git repository caching, select **Account settings** from the gear menu and enable the **Repository caching** option. From d459a08ed34f042bc4d4962eeb2f075be7408d0a Mon Sep 17 00:00:00 2001 From: Amy Chen Date: Wed, 20 Dec 2023 17:27:40 -0500 Subject: [PATCH 088/143] fix links --- .../2023-12-20-partner-integration-guide.md | 24 +++++++++---------- 1 file changed, 12 insertions(+), 12 deletions(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 22fbbbafbb7..1c1ea8f893c 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -15,7 +15,7 @@ is_featured: false Over the course of my three years running the Partner Engineering team at dbt Labs, the most common question I've been asked is, How do we integrate with dbt? Because those conversations often start out at the same place, I decided to create this guide so I’m no longer the blocker to fundamental information. This also allows us to skip the intro and get to the fun conversations so much faster, like what a joint solution for our customers would look like. -This guide doesn't include how to integrate with dbt Core. If you’re interested in creating a dbt adapter, please check out the [adapter development guide](/guides/dbt-ecosystem/adapter-development/1-what-are-adapters) instead. +This guide doesn't include how to integrate with dbt Core. If you’re interested in creating a dbt adapter, please check out the [adapter development guide](https://docs.getdbt.com/guides/dbt-ecosystem/adapter-development/1-what-are-adapters) instead. 
Instead, we're going to focus on integrating with dbt Cloud. Integrating with dbt Cloud is a key requirement to become a dbt Labs technology partner, opening the door to a variety of collaborative commercial opportunities. @@ -23,30 +23,30 @@ Here I'll cover how to get started, potential use cases you want to solve for, a ## New to dbt Cloud? -If you're new to dbt and dbt Cloud, we recommend you and your software developers try our [Getting Started Quickstarts](/guides) after reading [What is dbt?](/docs/introduction). The documentation will help you familiarize yourself with how our users interact with dbt. By going through this, you will also create a sample dbt project to test your integration. +If you're new to dbt and dbt Cloud, we recommend you and your software developers try our [Getting Started Quickstarts](https://docs.getdbt.com/guides) after reading [What is dbt?](https://docs.getdbt.com/docs/introduction). The documentation will help you familiarize yourself with how our users interact with dbt. By going through this, you will also create a sample dbt project to test your integration. If you require a partner dbt Cloud account to test on, we can upgrade an existing account or a trial account. This account may only be used for development, training, and demonstration purposes. Please contact your partner manager if you're interested and provide the account ID (provided in the URL). Our partner account includes all of the enterprise level functionality and can be provided with a signed partnerships agreement. ## Integration points -- [Discovery API (formerly referred to as Metadata API)](/docs/dbt-cloud-apis/discovery-api) +- [Discovery API (formerly referred to as Metadata API)](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-api) - **Overview** — This GraphQL API allows you to query the metadata that dbt Cloud generates every time you run a dbt project. We have two schemas available (environment and job level). 
By default, we always recommend that you integrate with the environment level schema because it contains the latest state and historical run results of all the jobs run on the dbt Cloud project. The job level will only provide you the metadata of one job, giving you only a small snapshot of part of the project. -- [Administrative (Admin) API](/docs/dbt-cloud-apis/admin-cloud-api) +- [Administrative (Admin) API](https://docs.getdbt.com/docs/dbt-cloud-apis/admin-cloud-api) - **Overview** — This REST API allows you to orchestrate dbt Cloud jobs runs and help you administer a dbt Cloud account. For metadata retrieval, we recommend integrating with the Discovery API instead. -- [Webhooks](/docs/deploy/webhooks) +- [Webhooks](https://docs.getdbt.com/docs/deploy/webhooks) - **Overview** — Outbound webhooks can send notifications about your dbt Cloud jobs to other systems. These webhooks allow you to get the latest information about your dbt jobs in real time. -- [Semantic Layers/Metrics](/docs/dbt-cloud-apis/sl-api-overview) - - **Overview** — Our Semantic Layer is made up of two parts: metrics definitions and the ability to interactively query the dbt metrics. For more details, here is a [basic overview](/docs/use-dbt-semantic-layer/dbt-sl) and [our best practices](/guides/dbt-ecosystem/sl-partner-integration-guide). +- [Semantic Layers/Metrics](https://docs.getdbt.com/docs/dbt-cloud-apis/sl-api-overview) + - **Overview** — Our Semantic Layer is made up of two parts: metrics definitions and the ability to interactively query the dbt metrics. For more details, here is a [basic overview](https://docs.getdbt.com/docs/use-dbt-semantic-layer/dbt-sl) and [our best practices](https://docs.getdbt.com/guides/dbt-ecosystem/sl-partner-integration-guide). - Metrics definitions can be pulled from the Discovery API (linked above) or the Semantic Layer Driver/GraphQL API. 
The key difference is that the Discovery API isn't able to pull the semantic graph, which provides the list of available dimensions that one can query per metric. That is only available with the SL Driver/APIs. The trade-off is that the SL Driver/APIs doesn't have access to the lineage of the entire dbt project (that is, how the dbt metrics dependencies on dbt models). - Three integration points are available for the Semantic Layer API. ## dbt Cloud hosting and authentication -To use the dbt Cloud APIs, you'll need access to the customer’s access urls. Depending on their dbt Cloud setup, they'll have a different access URL. To find out more, refer to [Regions & IP addresses](/docs/cloud/about-cloud/regions-ip-addresses) to understand all the possible configurations. My recommendation is to allow the customer to provide their own URL to simplify support. +To use the dbt Cloud APIs, you'll need access to the customer’s access urls. Depending on their dbt Cloud setup, they'll have a different access URL. To find out more, refer to [Regions & IP addresses](https://docs.getdbt.com/docs/cloud/about-cloud/regions-ip-addresses) to understand all the possible configurations. My recommendation is to allow the customer to provide their own URL to simplify support. If the customer is on an Azure single tenant instance, they don't currently have access to the Discovery API or the Semantic Layer APIs. -For authentication, we highly recommend that your integration uses account service tokens. You can read more about [how to create a service token and what permission sets to provide it](/docs/dbt-cloud-apis/service-tokens). Please note that depending on their plan type, they'll have access to different permission sets. We _do not_ recommend that users supply their user bearer tokens for authentication. 
This can cause issues if the user leaves the organization and provides you access to all the dbt Cloud accounts associated to the user rather than just the account (and related projects) that they want to integrate with. +For authentication, we highly recommend that your integration uses account service tokens. You can read more about [how to create a service token and what permission sets to provide it](https://docs.getdbt.com/docs/dbt-cloud-apis/service-tokens). Please note that depending on their plan type, they'll have access to different permission sets. We _do not_ recommend that users supply their user bearer tokens for authentication. This can cause issues if the user leaves the organization and provides you access to all the dbt Cloud accounts associated with the user rather than just the account (and related projects) that they want to integrate with. ## Potential use cases @@ -56,15 +56,15 @@ For authentication, we highly recommend that your integration uses account servi - **Integration points** — Webhooks and/or Admin API - dbt lineage - **Desired action** — You want to interpolate the dbt lineage metadata into your tool. - - **Example** — In your tool, you want to pull in the dbt DAG into your lineage diagram. For details on what you could pull and how to do this, refer to [Use cases and examples for the Discovery API](/docs/dbt-cloud-apis/discovery-use-cases-and-examples). + - **Example** — In your tool, you want to pull the dbt DAG into your lineage diagram. For details on what you could pull and how to do this, refer to [Use cases and examples for the Discovery API](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-use-cases-and-examples). - **Integration points** — Discovery API - dbt environment/job metadata - **Desired action** — You want to interpolate the dbt Cloud job information into your tool, including the status of the jobs, the status of the tables executed in the run, what tests passed, etc.
- - **Example** — In your Business Intelligence tool, stakeholders select from tables that a dbt model created. You show the last time the model passed its tests/last run to show that the tables are current and can be trusted. For details on what you could pull and how to do this, refer to [What's the latest state of each model](/docs/dbt-cloud-apis/discovery-use-cases-and-examples#whats-the-latest-state-of-each-model). + - **Example** — In your Business Intelligence tool, stakeholders select from tables that a dbt model created. You show the last time the model passed its tests/last run to show that the tables are current and can be trusted. For details on what you could pull and how to do this, refer to [What's the latest state of each model](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-use-cases-and-examples#whats-the-latest-state-of-each-model). - **Integration points** — Discovery API - dbt model documentation - **Desired action** — You want to interpolate the dbt project Information, including model descriptions, column descriptions, etc. - - **Example** — You want to extract the dbt model description so you can display and help the stakeholder understand what they are selecting from. This way, the creators can easily pass on the information without updating another system. For details on what you could pull and how to do this, refer to [What does this dataset and its columns mean](/docs/dbt-cloud-apis/discovery-use-cases-and-examples#what-does-this-dataset-and-its-columns-mean). + - **Example** — You want to extract the dbt model description so you can display and help the stakeholder understand what they are selecting from. This way, the creators can easily pass on the information without updating another system. For details on what you could pull and how to do this, refer to [What does this dataset and its columns mean](https://docs.getdbt.com/docs/dbt-cloud-apis/discovery-use-cases-and-examples#what-does-this-dataset-and-its-columns-mean). 
- **Integration points** — Discovery API dbt Core only users will have no access to the above integration points. For dbt metadata, oftentimes our partners will create a dbt Core integration by using the [dbt artifact](https://www.getdbt.com/product/semantic-layer/) files generated by each run and provided by the user. With the Discovery API, we are providing a dynamic way to get the latest information parsed out for you. From 4fa9d370ce42701a72dcd8b457e87cdc5e2fdbaa Mon Sep 17 00:00:00 2001 From: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> Date: Wed, 20 Dec 2023 15:14:13 -0800 Subject: [PATCH 089/143] Update website/blog/2023-12-20-partner-integration-guide.md --- website/blog/2023-12-20-partner-integration-guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/2023-12-20-partner-integration-guide.md b/website/blog/2023-12-20-partner-integration-guide.md index 1c1ea8f893c..b546f258f6c 100644 --- a/website/blog/2023-12-20-partner-integration-guide.md +++ b/website/blog/2023-12-20-partner-integration-guide.md @@ -23,7 +23,7 @@ Here I'll cover how to get started, potential use cases you want to solve for, a ## New to dbt Cloud? -If you're new to dbt and dbt Cloud, we recommend you and your software developers try our [Getting Started Quickstarts](https://docs.getdbt.com/guides) after reading [What is dbt?](https://docs.getdbt.com/docs/introduction). The documentation will help you familiarize yourself with how our users interact with dbt. By going through this, you will also create a sample dbt project to test your integration. +If you're new to dbt and dbt Cloud, we recommend you and your software developers try our [Getting Started Quickstarts](https://docs.getdbt.com/guides) after reading [What is dbt](https://docs.getdbt.com/docs/introduction). The documentation will help you familiarize yourself with how our users interact with dbt. 
By going through this, you will also create a sample dbt project to test your integration. If you require a partner dbt Cloud account to test on, we can upgrade an existing account or a trial account. This account may only be used for development, training, and demonstration purposes. Please contact your partner manager if you're interested and provide the account ID (provided in the URL). Our partner account includes all of the enterprise level functionality and can be provided with a signed partnerships agreement. From 2ef07fa1fbd0e8259a0eb7c9a4c923ccf9fad102 Mon Sep 17 00:00:00 2001 From: Ly Nguyen Date: Wed, 20 Dec 2023 15:22:29 -0800 Subject: [PATCH 090/143] Feedback --- website/snippets/_cloud-environments-info.md | 1 + 1 file changed, 1 insertion(+) diff --git a/website/snippets/_cloud-environments-info.md b/website/snippets/_cloud-environments-info.md index 6400b29ea9f..50f321cfd96 100644 --- a/website/snippets/_cloud-environments-info.md +++ b/website/snippets/_cloud-environments-info.md @@ -44,6 +44,7 @@ dbt Cloud caches your project's Git repo after each successful run and retains i Below lists the situations when dbt Cloud uses the cached copy: +- Outages from third-party services (for example, your Git provider) - Git authentication fails. - There are syntax errors in the `packages.yml` file. To catch these errors sooner, set up and use [continuous integration (CI)](/docs/deploy/continuous-integration). - A package is incompatible with the dbt version being used. To catch this incompatibility sooner, set up and use [continuous integration (CI)](/docs/deploy/continuous-integration). 
From 67e27374444e3e4151a3d81e3b1411aab4b04e5d Mon Sep 17 00:00:00 2001 From: Ly Nguyen Date: Wed, 20 Dec 2023 15:23:20 -0800 Subject: [PATCH 091/143] Missing a period --- website/snippets/_cloud-environments-info.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/snippets/_cloud-environments-info.md b/website/snippets/_cloud-environments-info.md index 50f321cfd96..01f4d8eb35e 100644 --- a/website/snippets/_cloud-environments-info.md +++ b/website/snippets/_cloud-environments-info.md @@ -44,7 +44,7 @@ dbt Cloud caches your project's Git repo after each successful run and retains i Below lists the situations when dbt Cloud uses the cached copy: -- Outages from third-party services (for example, your Git provider) +- Outages from third-party services (for example, your Git provider). - Git authentication fails. - There are syntax errors in the `packages.yml` file. To catch these errors sooner, set up and use [continuous integration (CI)](/docs/deploy/continuous-integration). - A package is incompatible with the dbt version being used. To catch this incompatibility sooner, set up and use [continuous integration (CI)](/docs/deploy/continuous-integration). 
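The cached-copy situations finalized in the patch above include a package that's incompatible with the dbt version being used, which is exactly the kind of check a CI step can catch before dbt Cloud ever falls back to the cache. The sketch below is illustrative only: the package names and version ranges are hypothetical, and real dbt packages declare compatibility with full semver range syntax via `require-dbt-version` rather than the simple closed ranges used here.

```python
# Illustrative sketch: flag packages whose supported dbt-version range
# excludes the running dbt version. The package data is hypothetical;
# dbt's real resolver handles full semver range syntax.

def parse(version):
    """Turn 'X.Y.Z' into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split("."))

def is_compatible(dbt_version, lower_inclusive, upper_exclusive):
    """True if lower_inclusive <= dbt_version < upper_exclusive."""
    v = parse(dbt_version)
    return parse(lower_inclusive) <= v < parse(upper_exclusive)

# Hypothetical installed packages and the dbt-version range each supports.
packages = {
    "dbt_utils": ("1.0.0", "2.0.0"),
    "old_package": ("0.20.0", "1.0.0"),
}

def incompatible_packages(dbt_version):
    """Names of packages that do not support the given dbt version."""
    return sorted(
        name for name, (lo, hi) in packages.items()
        if not is_compatible(dbt_version, lo, hi)
    )

print(incompatible_packages("1.6.7"))  # ['old_package']
```

Running a check like this in CI surfaces the incompatibility on the pull request, instead of at scheduled-run time when dbt Cloud would have to fall back to the cached repo copy.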
From 7f5890dc973e090483b92f9d95646d90541e4309 Mon Sep 17 00:00:00 2001 From: Jordan Stein Date: Wed, 20 Dec 2023 16:40:01 -0800 Subject: [PATCH 092/143] update group by items call out --- .../dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md index 96a1e20fc6b..1b01a93fefd 100644 --- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md +++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md @@ -35,6 +35,7 @@ The following are updates for the dbt Semantic Layer and MetricFlow: ## New features -- Simplified group-by-item requests — Improved support for ambiguous group-by-item resolution. Previously, you need to specify them in detail, like `guest__listing__created_at__month`. This indicates a monthly `created_at` time dimension, linked by `guest` and `listing` entities. +- Simplified group-by-item requests. We updated the way the MetricFlow query resolver finds queryable dimensions for metrics. The main improvements are: + - If the grain of a time dimension in a query is not specified, then the grain of the requested time dimension is resolved to the finest grain that is available for all of the queried metrics. For example, say you have two metrics: revenue, which has a weekly grain, and orders, which has a daily grain. If you query these metrics like this: `dbt sl query --metrics revenue,orders --group-by metric_time`, MetricFlow will automatically query these metrics at a weekly grain. - Now you can use a shorter form, like ` listing__created_at__month`. If there's only one way to interpret this, dbt will resolve it automatically. If multiple interpretations are possible, dbt will ask for more details from the user.
+- In a metric filter, if an ambiguous time dimension does not specify the grain, and all semantic models that are used to compute the metric define the time dimension with the same grain, MetricFlow assumes that the time dimension has that grain. For example, say I have two metrics, revenue and users, which are both daily. I can query these metrics without specifying the time dimension grain in the filter: `mf query --metrics users,revenue --group-by metric_time --where "{{ TimeDimension('metric_time') }} = '2017-07-30' "` From 1dad0f0b023eef9f75af30fbf97d052acef808de Mon Sep 17 00:00:00 2001 From: Damian Owsianny Date: Thu, 21 Dec 2023 10:37:14 +0100 Subject: [PATCH 093/143] Add on_table_exists 'replace' option to Starburst/Trino --- website/docs/reference/resource-configs/trino-configs.md | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/website/docs/reference/resource-configs/trino-configs.md b/website/docs/reference/resource-configs/trino-configs.md index 21df13feac4..9ee62959f76 100644 --- a/website/docs/reference/resource-configs/trino-configs.md +++ b/website/docs/reference/resource-configs/trino-configs.md @@ -97,8 +97,9 @@ The `dbt-trino` adapter supports these modes in `table` materialization, which y - `rename` — Creates an intermediate table, renames the target table to the backup one, and renames the intermediate table to the target one. - `drop` — Drops and re-creates a table. This overcomes the table rename limitation in AWS Glue. +- `replace` — Replaces a table using the CREATE OR REPLACE clause. Support for table replacement varies across connectors. Refer to the connector documentation for details. -The recommended `table` materialization uses `on_table_exists = 'rename'` and is also the default. You can change this default configuration by editing _one_ of these files: +If CREATE OR REPLACE is supported by the underlying connector, `replace` is the recommended option.
Otherwise, the recommended `table` materialization uses `on_table_exists = 'rename'` and is also the default. You can change this default configuration by editing _one_ of these files: - the SQL file for your model - the `dbt_project.yml` configuration file From 78f1667ce011f9b08fff78b6cb05f1b1c61454ee Mon Sep 17 00:00:00 2001 From: Damian Owsianny Date: Thu, 21 Dec 2023 10:38:08 +0100 Subject: [PATCH 094/143] Add new author of Starburst/Trino --- website/docs/docs/core/connect-data-platform/trino-setup.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/core/connect-data-platform/trino-setup.md b/website/docs/docs/core/connect-data-platform/trino-setup.md index a7dc658358f..bb36bb11a01 100644 --- a/website/docs/docs/core/connect-data-platform/trino-setup.md +++ b/website/docs/docs/core/connect-data-platform/trino-setup.md @@ -4,7 +4,7 @@ description: "Read this guide to learn about the Starburst/Trino warehouse setup id: "trino-setup" meta: maintained_by: Starburst Data, Inc. 
- authors: Marius Grama, Przemek Denkiewicz, Michiel de Smet + authors: Marius Grama, Przemek Denkiewicz, Michiel de Smet, Damian Owsianny github_repo: 'starburstdata/dbt-trino' pypi_package: 'dbt-trino' min_core_version: 'v0.20.0' From 79202d5bf28047c8593161d553bd16a8e39ac719 Mon Sep 17 00:00:00 2001 From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com> Date: Thu, 21 Dec 2023 11:55:09 -0500 Subject: [PATCH 095/143] Adding multi cell migration page --- website/docs/docs/cloud/migration.md | 41 ++++++++++++++++++++++++++++ 1 file changed, 41 insertions(+) create mode 100644 website/docs/docs/cloud/migration.md diff --git a/website/docs/docs/cloud/migration.md b/website/docs/docs/cloud/migration.md new file mode 100644 index 00000000000..7fa91000389 --- /dev/null +++ b/website/docs/docs/cloud/migration.md @@ -0,0 +1,41 @@ +--- +title: "Multi-cell migration checklist" +id: migration +description: "Prepare for account migration to AWS cell based architecture." +pagination_next: null +pagination_prev: null +--- + +dbt Labs is in the process of migrating our U.S. based multi-tenant accounts to [AWS cell-based architecture](https://docs.aws.amazon.com/wellarchitected/latest/reducing-scope-of-impact-with-cell-based-architecture/what-is-a-cell-based-architecture.html), a critical component of the [AWS well-architected framework](https://aws.amazon.com/architecture/well-architected/?wa-lens-whitepapers.sort-by=item.additionalFields.sortDate&wa-lens-whitepapers.sort-order=desc&wa-guidance-whitepapers.sort-by=item.additionalFields.sortDate&wa-guidance-whitepapers.sort-order=desc). The benefits of the cell-based architecture will improve the performance, reliability, and security of your dbt Cloud environment, but there is some preparation required to ensure a successful migration. + +This document outlines the steps that you must take to prevent service disruptions before your environment is migrated over to the cell-based architecture. 
This will impact areas such as login, IP restrictions, and API access. + +### What’s changing? Pre-migration checklist. + +Prior to your migration date, your account admin will need to make some changes to your dbt Cloud account. + +If your account has been scheduled for migration, upon login, you will see a banner indicating your migration date. If you do not see a banner, you do not need to take any action. + +1. **IP Addresses** — dbt Cloud has new IPs that will be used to access your warehouse after the migration. Make sure to allow inbound traffic from these IPs in your firewall, and include it in any database grants. All six of the IPs below should be added to allowlists. + * Old IPs: `52.45.144.63`, `54.81.134.249`, `52.22.161.231` + * New IPs: `52.3.77.232`, `3.214.191.130`, `34.233.79.135` +2. **APIs and integrations** — Each dbt Cloud account will be allocated a static Access URL like: `aa000.us1.dbt.com`. You should begin migrating your API access and partner integrations to use the new static subdomain as soon as possible. You can find your Access URL on: + * Any page where you generate or manage API tokens. + * The **Account Settings** > **Account page**. + + :::important Multiple account access + Each account for which you have access will have a different, dedicated [Access URL](https://next.docs.getdbt.com/docs/cloud/about-cloud/access-regions-ip-addresses#accessing-your-account)! + ::: + +3. **IDE sessions** — Any uncommitted changes in the IDE may be lost during the migration process. We _strongly_ encourage you to commit all changes in the IDE before your scheduled migration time. +4. **User invitations** — Any pending user invitations will be invalidated during the migration. You can re-send the invitations once the migration is complete. +5. **Git Integrations** — Integrations with Github, Gitlab, and Azure DevOps will need to be manually updated. We are not migrating any accounts using these integrations at this time. 
If you are using one of these integrations and your account is scheduled for migration, please contact support and we will delay your migration. +6. **SSO Integrations** — Integrations with SSO IdPs will need to be manually updated. We are not migrating any accounts using SSO at this time; if you are using one of these integrations and your account is scheduled for migration, please contact support, and we will delay your migration. + +### Post-migration + +After migration, if you completed all of the checklist items above, your dbt Cloud resources and jobs will continue to work as they did before. + +You have the option to log into dbt Cloud at a different URL: + * If you were previously logging in at `cloud.getdbt.com`, you should instead plan to login at `us1.dbt.com`. The original URL will still work, but you’ll have to click through to be redirected upon login. + * You may also log in directly with your account’s unique [Access URL](https://next.docs.getdbt.com/docs/cloud/about-cloud/access-regions-ip-addresses#accessing-your-account). \ No newline at end of file From ecdbc83acf9c69381ddc1ca1af7fed781a69c620 Mon Sep 17 00:00:00 2001 From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com> Date: Thu, 21 Dec 2023 13:21:55 -0500 Subject: [PATCH 096/143] Update website/docs/docs/cloud/migration.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/docs/docs/cloud/migration.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/cloud/migration.md b/website/docs/docs/cloud/migration.md index 7fa91000389..1cb2e33c364 100644 --- a/website/docs/docs/cloud/migration.md +++ b/website/docs/docs/cloud/migration.md @@ -10,7 +10,7 @@ dbt Labs is in the process of migrating our U.S. based multi-tenant accounts to This document outlines the steps that you must take to prevent service disruptions before your environment is migrated over to the cell-based architecture. 
This will impact areas such as login, IP restrictions, and API access. -### What’s changing? Pre-migration checklist. +## Premigration checklist Prior to your migration date, your account admin will need to make some changes to your dbt Cloud account. From 4b9fac73daa280f68843e8d315932189b010228c Mon Sep 17 00:00:00 2001 From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com> Date: Thu, 21 Dec 2023 13:22:10 -0500 Subject: [PATCH 097/143] Update website/docs/docs/cloud/migration.md Co-authored-by: Connor McArthur --- website/docs/docs/cloud/migration.md | 6 +++++- 1 file changed, 5 insertions(+), 1 deletion(-) diff --git a/website/docs/docs/cloud/migration.md b/website/docs/docs/cloud/migration.md index 1cb2e33c364..6d207f5b613 100644 --- a/website/docs/docs/cloud/migration.md +++ b/website/docs/docs/cloud/migration.md @@ -6,7 +6,11 @@ pagination_next: null pagination_prev: null --- -dbt Labs is in the process of migrating our U.S. based multi-tenant accounts to [AWS cell-based architecture](https://docs.aws.amazon.com/wellarchitected/latest/reducing-scope-of-impact-with-cell-based-architecture/what-is-a-cell-based-architecture.html), a critical component of the [AWS well-architected framework](https://aws.amazon.com/architecture/well-architected/?wa-lens-whitepapers.sort-by=item.additionalFields.sortDate&wa-lens-whitepapers.sort-order=desc&wa-guidance-whitepapers.sort-by=item.additionalFields.sortDate&wa-guidance-whitepapers.sort-order=desc). The benefits of the cell-based architecture will improve the performance, reliability, and security of your dbt Cloud environment, but there is some preparation required to ensure a successful migration. +dbt Labs is in the process of migrating dbt Cloud to a new **cell-based architecture**. This architecture will be the foundation of dbt Cloud for years to come, and will bring improved **scalability**, **reliability**, and **security** to all customers and users of dbt Cloud. 
+ +There is some preparation required to ensure a successful migration. + +Migrations are being scheduled on a per-account basis. **If you have not received any communication (either via a banner, or via an email) about a migration date, you do not need to take any action at this time.** Our team will share a specific migration date with you, with appropriate advance notice, before we complete any migration steps in the dbt Cloud backend. This document outlines the steps that you must take to prevent service disruptions before your environment is migrated over to the cell-based architecture. This will impact areas such as login, IP restrictions, and API access. From f6fe368112e85737e4ee9cfcf44af1fe98283ab6 Mon Sep 17 00:00:00 2001 From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com> Date: Thu, 21 Dec 2023 13:22:29 -0500 Subject: [PATCH 098/143] Update website/docs/docs/cloud/migration.md --- website/docs/docs/cloud/migration.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/cloud/migration.md b/website/docs/docs/cloud/migration.md index 6d207f5b613..f989536abaf 100644 --- a/website/docs/docs/cloud/migration.md +++ b/website/docs/docs/cloud/migration.md @@ -14,7 +14,7 @@ Migrations are being scheduled on a per-account basis. **If you have not receive This document outlines the steps that you must take to prevent service disruptions before your environment is migrated over to the cell-based architecture. This will impact areas such as login, IP restrictions, and API access. -## Premigration checklist +## Pre-migration checklist Prior to your migration date, your account admin will need to make some changes to your dbt Cloud account. 
From 8774c56f87c5908b25e6273f6c8cf3357fb4d298 Mon Sep 17 00:00:00 2001 From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com> Date: Thu, 21 Dec 2023 13:22:47 -0500 Subject: [PATCH 099/143] Update website/docs/docs/cloud/migration.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/docs/docs/cloud/migration.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/cloud/migration.md b/website/docs/docs/cloud/migration.md index f989536abaf..4f33a67565e 100644 --- a/website/docs/docs/cloud/migration.md +++ b/website/docs/docs/cloud/migration.md @@ -16,7 +16,7 @@ This document outlines the steps that you must take to prevent service disruptio ## Pre-migration checklist -Prior to your migration date, your account admin will need to make some changes to your dbt Cloud account. +Prior to your migration date, your dbt Cloud account admin will need to make some changes to your account. If your account has been scheduled for migration, upon login, you will see a banner indicating your migration date. If you do not see a banner, you do not need to take any action. From cd8fcea8dce4644988c78877a7384edf247a863f Mon Sep 17 00:00:00 2001 From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com> Date: Thu, 21 Dec 2023 13:24:14 -0500 Subject: [PATCH 100/143] Apply suggestions from code review Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/docs/docs/cloud/migration.md | 12 ++++++------ 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/website/docs/docs/cloud/migration.md b/website/docs/docs/cloud/migration.md index 4f33a67565e..ead08ff1a82 100644 --- a/website/docs/docs/cloud/migration.md +++ b/website/docs/docs/cloud/migration.md @@ -23,18 +23,18 @@ If your account has been scheduled for migration, upon login, you will see a ban 1. **IP Addresses** — dbt Cloud has new IPs that will be used to access your warehouse after the migration. 
Make sure to allow inbound traffic from these IPs in your firewall, and include it in any database grants. All six of the IPs below should be added to allowlists. * Old IPs: `52.45.144.63`, `54.81.134.249`, `52.22.161.231` * New IPs: `52.3.77.232`, `3.214.191.130`, `34.233.79.135` -2. **APIs and integrations** — Each dbt Cloud account will be allocated a static Access URL like: `aa000.us1.dbt.com`. You should begin migrating your API access and partner integrations to use the new static subdomain as soon as possible. You can find your Access URL on: +2. **APIs and integrations** — Each dbt Cloud account will be allocated a static access URL like: `aa000.us1.dbt.com`. You should begin migrating your API access and partner integrations to use the new static subdomain as soon as possible. You can find your access URL on: * Any page where you generate or manage API tokens. * The **Account Settings** > **Account page**. :::important Multiple account access - Each account for which you have access will have a different, dedicated [Access URL](https://next.docs.getdbt.com/docs/cloud/about-cloud/access-regions-ip-addresses#accessing-your-account)! + Be careful, each account that you have access to will have a different, dedicated [access URL](https://next.docs.getdbt.com/docs/cloud/about-cloud/access-regions-ip-addresses#accessing-your-account). ::: -3. **IDE sessions** — Any uncommitted changes in the IDE may be lost during the migration process. We _strongly_ encourage you to commit all changes in the IDE before your scheduled migration time. -4. **User invitations** — Any pending user invitations will be invalidated during the migration. You can re-send the invitations once the migration is complete. -5. **Git Integrations** — Integrations with Github, Gitlab, and Azure DevOps will need to be manually updated. We are not migrating any accounts using these integrations at this time. 
If you are using one of these integrations and your account is scheduled for migration, please contact support and we will delay your migration. -6. **SSO Integrations** — Integrations with SSO IdPs will need to be manually updated. We are not migrating any accounts using SSO at this time; if you are using one of these integrations and your account is scheduled for migration, please contact support, and we will delay your migration. +3. **IDE sessions** — Any uncommitted changes in the IDE might be lost during the migration process. dbt Labs _strongly_ encourages you to commit all changes in the IDE before your scheduled migration time. +4. **User invitations** — Any pending user invitations will be invalidated during the migration. You can resend the invitations once the migration is complete. +5. **Git integrations** — Integrations with GitHub, GitLab, and Azure DevOps will need to be manually updated. dbt Labs will not be migrating any accounts using these integrations at this time. If you're using one of these integrations and your account is scheduled for migration, please contact support and we will delay your migration. +6. **SSO integrations** — Integrations with SSO identity providers (IdPs) will need to be manually updated. dbt Labs will not be migrating any accounts using SSO at this time. If you're using one of these integrations and your account is scheduled for migration, please contact support and we will delay your migration. 
### Post-migration From 67e3fcbc6a6402f5f3aa3c660f33c8e911f03264 Mon Sep 17 00:00:00 2001 From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com> Date: Thu, 21 Dec 2023 13:24:54 -0500 Subject: [PATCH 101/143] Update website/docs/docs/cloud/migration.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/docs/docs/cloud/migration.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/cloud/migration.md b/website/docs/docs/cloud/migration.md index ead08ff1a82..6a957d57127 100644 --- a/website/docs/docs/cloud/migration.md +++ b/website/docs/docs/cloud/migration.md @@ -36,7 +36,7 @@ If your account has been scheduled for migration, upon login, you will see a ban 5. **Git integrations** — Integrations with GitHub, GitLab, and Azure DevOps will need to be manually updated. dbt Labs will not be migrating any accounts using these integrations at this time. If you're using one of these integrations and your account is scheduled for migration, please contact support and we will delay your migration. 6. **SSO integrations** — Integrations with SSO identity providers (IdPs) will need to be manually updated. dbt Labs will not be migrating any accounts using SSO at this time. If you're using one of these integrations and your account is scheduled for migration, please contact support and we will delay your migration. -### Post-migration +## Post-migration After migration, if you completed all of the checklist items above, your dbt Cloud resources and jobs will continue to work as they did before. 
From b697ecdc9cd195c0f7e927aeb5e31b9a99bd456f Mon Sep 17 00:00:00 2001 From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com> Date: Thu, 21 Dec 2023 13:26:35 -0500 Subject: [PATCH 102/143] Update website/docs/docs/cloud/migration.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/docs/docs/cloud/migration.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/cloud/migration.md b/website/docs/docs/cloud/migration.md index 6a957d57127..8c8a3375b61 100644 --- a/website/docs/docs/cloud/migration.md +++ b/website/docs/docs/cloud/migration.md @@ -42,4 +42,4 @@ After migration, if you completed all of the checklist items above, your dbt Clo You have the option to log into dbt Cloud at a different URL: * If you were previously logging in at `cloud.getdbt.com`, you should instead plan to login at `us1.dbt.com`. The original URL will still work, but you’ll have to click through to be redirected upon login. - * You may also log in directly with your account’s unique [Access URL](https://next.docs.getdbt.com/docs/cloud/about-cloud/access-regions-ip-addresses#accessing-your-account). \ No newline at end of file + * You may also log in directly with your account’s unique [access URL](https://next.docs.getdbt.com/docs/cloud/about-cloud/access-regions-ip-addresses#accessing-your-account). 
\ No newline at end of file From 72c55578ba3d813e6eff38669e6104d3c7c5d17b Mon Sep 17 00:00:00 2001 From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com> Date: Thu, 21 Dec 2023 13:27:04 -0500 Subject: [PATCH 103/143] Update website/docs/docs/cloud/migration.md Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com> --- website/docs/docs/cloud/migration.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/cloud/migration.md b/website/docs/docs/cloud/migration.md index 8c8a3375b61..2081a0a3096 100644 --- a/website/docs/docs/cloud/migration.md +++ b/website/docs/docs/cloud/migration.md @@ -38,7 +38,7 @@ If your account has been scheduled for migration, upon login, you will see a ban ## Post-migration -After migration, if you completed all of the checklist items above, your dbt Cloud resources and jobs will continue to work as they did before. +After migration, if you completed all the [Pre-migration checklist](#pre-migration-checklist) items, your dbt Cloud resources and jobs will continue to work as they did before. You have the option to log into dbt Cloud at a different URL: * If you were previously logging in at `cloud.getdbt.com`, you should instead plan to login at `us1.dbt.com`. The original URL will still work, but you’ll have to click through to be redirected upon login. 
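The access URL change described in the migration patches above mainly affects API clients, which need to move from the shared `cloud.getdbt.com` hostname to the account's static subdomain. Below is a hedged sketch of building (not sending) an Admin API request against such a URL. The subdomain `aa000`, the account ID, and the token value are placeholders taken from or modeled on the checklist, and the `Authorization: Token <token>` header follows the service-token scheme the dbt Cloud API documents; confirm both against your account before relying on them.

```python
# Sketch only: builds (but does not send) a dbt Cloud Admin API request
# against an account's static access URL. Subdomain, account ID, and
# token are placeholders.
from urllib.request import Request

ACCESS_URL = "aa000.us1.dbt.com"   # per-account static access URL
ACCOUNT_ID = 1234                  # placeholder account ID
SERVICE_TOKEN = "dbtc_xxxxxxxx"    # placeholder service token

def list_jobs_request(access_url, account_id, token):
    """Construct the request object for the v2 'list jobs' endpoint."""
    url = f"https://{access_url}/api/v2/accounts/{account_id}/jobs/"
    return Request(url, headers={"Authorization": f"Token {token}"})

req = list_jobs_request(ACCESS_URL, ACCOUNT_ID, SERVICE_TOKEN)
print(req.full_url)  # https://aa000.us1.dbt.com/api/v2/accounts/1234/jobs/
```

Centralizing the URL construction in one helper like this makes the post-migration cutover a one-line change per account.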
From 01c1712e57c966d700ba75ca1054ec9aafb4fe3e Mon Sep 17 00:00:00 2001
From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
Date: Thu, 21 Dec 2023 13:27:23 -0500
Subject: [PATCH 104/143] Apply suggestions from code review

Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com>
---
 website/docs/docs/cloud/migration.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/website/docs/docs/cloud/migration.md b/website/docs/docs/cloud/migration.md
index 2081a0a3096..bd7cbffe913 100644
--- a/website/docs/docs/cloud/migration.md
+++ b/website/docs/docs/cloud/migration.md
@@ -18,9 +18,9 @@ This document outlines the steps that you must take to prevent service disruptio
 
 Prior to your migration date, your dbt Cloud account admin will need to make some changes to your account.
 
-If your account has been scheduled for migration, upon login, you will see a banner indicating your migration date. If you do not see a banner, you do not need to take any action.
+If your account is scheduled for migration, you will see a banner indicating your migration date when you log in. If you don't see a banner, you don't need to take any action.
 
-1. **IP Addresses** — dbt Cloud has new IPs that will be used to access your warehouse after the migration. Make sure to allow inbound traffic from these IPs in your firewall, and include it in any database grants. All six of the IPs below should be added to allowlists.
+1. **IP addresses** — dbt Cloud will be using new IPs to access your warehouse after the migration. Make sure to allow inbound traffic from these IPs in your firewall and include it in any database grants. All six of the IPs below should be added to allowlists.
    * Old IPs: `52.45.144.63`, `54.81.134.249`, `52.22.161.231`
    * New IPs: `52.3.77.232`, `3.214.191.130`, `34.233.79.135`
 2. **APIs and integrations** — Each dbt Cloud account will be allocated a static access URL like: `aa000.us1.dbt.com`. You should begin migrating your API access and partner integrations to use the new static subdomain as soon as possible. You can find your access URL on:

From c123eb1542fa96e3fc503c7cae66f0e04ec92f80 Mon Sep 17 00:00:00 2001
From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
Date: Thu, 21 Dec 2023 13:33:22 -0500
Subject: [PATCH 105/143] Update website/docs/docs/cloud/migration.md

---
 website/docs/docs/cloud/migration.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/docs/cloud/migration.md b/website/docs/docs/cloud/migration.md
index bd7cbffe913..58b4ed8c530 100644
--- a/website/docs/docs/cloud/migration.md
+++ b/website/docs/docs/cloud/migration.md
@@ -10,7 +10,7 @@ dbt Labs is in the process of migrating dbt Cloud to a new **cell-based architec
 
 There is some preparation required to ensure a successful migration.
 
-Migrations are being scheduled on a per-account basis. **If you have not received any communication (either via a banner, or via an email) about a migration date, you do not need to take any action at this time.** Our team will share a specific migration date with you, with appropriate advance notice, before we complete any migration steps in the dbt Cloud backend.
+Migrations are being scheduled on a per-account basis. _If you have not received any communication (either via a banner or email notification) about a migration date, you do not need to take any action at this time._ dbt Labs will share migration date information with you, with appropriate advance notice, before we complete any migration steps in the dbt Cloud backend.
 
 This document outlines the steps that you must take to prevent service disruptions before your environment is migrated over to the cell-based architecture. This will impact areas such as login, IP restrictions, and API access.
 
From 465238eacee54ae3dc6174d20548c6bd70bcb6c7 Mon Sep 17 00:00:00 2001
From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
Date: Thu, 21 Dec 2023 13:36:06 -0500
Subject: [PATCH 106/143] Update website/docs/docs/cloud/migration.md

---
 website/docs/docs/cloud/migration.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/docs/cloud/migration.md b/website/docs/docs/cloud/migration.md
index 58b4ed8c530..0ec9e4c4d26 100644
--- a/website/docs/docs/cloud/migration.md
+++ b/website/docs/docs/cloud/migration.md
@@ -40,6 +40,6 @@ If your account is scheduled for migration, you will see a banner indicating you
 
 After migration, if you completed all the [Pre-migration checklist](#pre-migration-checklist) items, your dbt Cloud resources and jobs will continue to work as they did before.
 
-You have the option to log into dbt Cloud at a different URL:
+You have the option to log in to dbt Cloud at a different URL:
   * If you were previously logging in at `cloud.getdbt.com`, you should instead plan to login at `us1.dbt.com`. The original URL will still work, but you’ll have to click through to be redirected upon login.
 * You may also log in directly with your account’s unique [access URL](https://next.docs.getdbt.com/docs/cloud/about-cloud/access-regions-ip-addresses#accessing-your-account).
\ No newline at end of file

From 4c4bb5b7ef39a404ea1849d73e3f45dcb927b804 Mon Sep 17 00:00:00 2001
From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
Date: Thu, 21 Dec 2023 14:03:41 -0500
Subject: [PATCH 107/143] Update website/docs/docs/cloud/migration.md

Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com>
---
 website/docs/docs/cloud/migration.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/docs/cloud/migration.md b/website/docs/docs/cloud/migration.md
index 0ec9e4c4d26..12b4026ff9f 100644
--- a/website/docs/docs/cloud/migration.md
+++ b/website/docs/docs/cloud/migration.md
@@ -10,7 +10,7 @@ dbt Labs is in the process of migrating dbt Cloud to a new **cell-based architec
 
 There is some preparation required to ensure a successful migration.
 
-Migrations are being scheduled on a per-account basis. _If you have not received any communication (either via a banner or email notification) about a migration date, you do not need to take any action at this time._ dbt Labs will share migration date information with you, with appropriate advance notice, before we complete any migration steps in the dbt Cloud backend.
+Migrations are being scheduled on a per-account basis. _If you haven't received any communication (either with a banner or by email) about a migration date, you don't need to take any action at this time._ dbt Labs will share migration date information with you, with appropriate advance notice, before we complete any migration steps in the dbt Cloud backend.
 
 This document outlines the steps that you must take to prevent service disruptions before your environment is migrated over to the cell-based architecture. This will impact areas such as login, IP restrictions, and API access.
 
From 9fd5e551a5b2f3cbc6e242f8d2d0298fe92945a9 Mon Sep 17 00:00:00 2001
From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
Date: Thu, 21 Dec 2023 14:06:18 -0500
Subject: [PATCH 108/143] Update website/docs/docs/cloud/migration.md

---
 website/docs/docs/cloud/migration.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/docs/cloud/migration.md b/website/docs/docs/cloud/migration.md
index 12b4026ff9f..69804aa9cd0 100644
--- a/website/docs/docs/cloud/migration.md
+++ b/website/docs/docs/cloud/migration.md
@@ -1,7 +1,7 @@
 ---
 title: "Multi-cell migration checklist"
 id: migration
-description: "Prepare for account migration to AWS cell based architecture."
+description: "Prepare for account migration to AWS cell-based architecture."
 pagination_next: null
 pagination_prev: null
 ---

From 4caa9558fc068cbab8ecd9f8a77daa3c7c6ea11a Mon Sep 17 00:00:00 2001
From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
Date: Thu, 21 Dec 2023 14:06:27 -0500
Subject: [PATCH 109/143] Update website/docs/docs/cloud/migration.md

Co-authored-by: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com>
---
 website/docs/docs/cloud/migration.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/docs/cloud/migration.md b/website/docs/docs/cloud/migration.md
index 69804aa9cd0..0c43a287bbe 100644
--- a/website/docs/docs/cloud/migration.md
+++ b/website/docs/docs/cloud/migration.md
@@ -6,7 +6,7 @@ pagination_next: null
 pagination_prev: null
 ---
 
-dbt Labs is in the process of migrating dbt Cloud to a new **cell-based architecture**. This architecture will be the foundation of dbt Cloud for years to come, and will bring improved **scalability**, **reliability**, and **security** to all customers and users of dbt Cloud.
+dbt Labs is in the process of migrating dbt Cloud to a new _cell-based architecture_. This architecture will be the foundation of dbt Cloud for years to come, and will bring improved scalability, reliability, and security to all customers and users of dbt Cloud.
 
 There is some preparation required to ensure a successful migration.
 

From b839ea5b9c749300db47d14460a912e43860ea99 Mon Sep 17 00:00:00 2001
From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com>
Date: Thu, 21 Dec 2023 15:49:54 -0500
Subject: [PATCH 110/143] Update website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md

---
 .../dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
index 1b01a93fefd..dbdf66c8e64 100644
--- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
+++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
@@ -38,4 +38,6 @@ The following are updates for the dbt Semantic Layer and MetricFlow:
 - Simplified group-by-item requests. We updated the way the MetricFlow query resolver finds queryable dimensions for metrics. The main improvements ares:
   - If the grain of a time dimension in a query is not specified, then the grain of the requested time dimension is resolved to be the finest grain that is available for the queried metrics. For example, say you have two metrics; revenue which has a weekly grain and orders which has a daily grain. If you query these metrics like this: `dbt sl query --metrics revenue,orders --group-by metric_time` metricflow will automatically query these metrics at a weekly grain.
 
-- In a metric filter, if an ambiguous time dimension does not specify the grain, and all semantic models that are used to compute the metric define the time dimension with the same grain, MetricFlow should assume the specific time dimension is that grain. For example, say I have two metrics; revenue and users which are both daily. I can query these metrics without sepcifying the time dimension grain in the filte i.e `mf query --metrics users,revenue --group-by metric_time --where "{{ TimeDimension('metric_time') }} = '2017-07-30' "`
+- Assumes time dimension grain: When using a metric filter, if an ambiguous time dimension doesn't specify the grain, and all used semantic models define this time dimension with the same grain, MetricFlow now automatically assumes the time dimension to be of that grain.
+  - For example, if you have two daily metrics: `revenue` and `users` — you can now query these metrics without specifying the time dimension grain in the filter: `mf query --metrics users,revenue --group-by metric_time --where "{{ TimeDimension('metric_time') }} = '2017-07-30' "`
+

From 224ce876c19ffb86a0826c004fda0a2c2b79e2cf Mon Sep 17 00:00:00 2001
From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com>
Date: Thu, 21 Dec 2023 15:53:40 -0500
Subject: [PATCH 111/143] Update website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md

---
 .../dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
index dbdf66c8e64..be8e5e8e61f 100644
--- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
+++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
@@ -7,7 +7,7 @@ date: 2023-12-22
 ---
 
 The dbt Labs team continues to work on adding new features, fixing bugs, and increasing reliability for the dbt Semantic Layer and MetricFlow.
-Refer to the following updates and fixes for December 2023:
+Refer to the following updates and fixes for December 2023.
 
 ## Bug fixes
 

From a3c0ec7f1de71bcdb69c2433c610f9c502afebbf Mon Sep 17 00:00:00 2001
From: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com>
Date: Thu, 21 Dec 2023 12:59:23 -0800
Subject: [PATCH 112/143] Update website/snippets/_cloud-environments-info.md

Co-authored-by: mirnawong1 <89008547+mirnawong1@users.noreply.github.com>
---
 website/snippets/_cloud-environments-info.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/snippets/_cloud-environments-info.md b/website/snippets/_cloud-environments-info.md
index 01f4d8eb35e..c9886a2eb94 100644
--- a/website/snippets/_cloud-environments-info.md
+++ b/website/snippets/_cloud-environments-info.md
@@ -42,7 +42,7 @@ For improved reliability and performance on your job runs, you can enable dbt Cl
 
 dbt Cloud caches your project's Git repo after each successful run and retains it for 8 days if there are no repo updates. It caches all packages regardless of installation method and does not fetch code outside of the job runs.
 
-Below lists the situations when dbt Cloud uses the cached copy:
+dbt Cloud will use the cached copy of your project's Git repo under these circumstances:
 
 - Outages from third-party services (for example, your Git provider).
 - Git authentication fails.
From 29303c3040ca9d87c9dcd3e42b32552feeb83bbf Mon Sep 17 00:00:00 2001
From: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com>
Date: Thu, 21 Dec 2023 13:00:20 -0800
Subject: [PATCH 113/143] Update website/snippets/_cloud-environments-info.md

Co-authored-by: mirnawong1 <89008547+mirnawong1@users.noreply.github.com>
---
 website/snippets/_cloud-environments-info.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/snippets/_cloud-environments-info.md b/website/snippets/_cloud-environments-info.md
index c9886a2eb94..aedec779a58 100644
--- a/website/snippets/_cloud-environments-info.md
+++ b/website/snippets/_cloud-environments-info.md
@@ -44,7 +44,7 @@ dbt Cloud caches your project's Git repo after each successful run and retains i
 
 dbt Cloud will use the cached copy of your project's Git repo under these circumstances:
 
-- Outages from third-party services (for example, your Git provider).
+- Outages from third-party services (for example, the [dbt package hub](https://hub.getdbt.com/)).
 - Git authentication fails.
 - There are syntax errors in the `packages.yml` file. To catch these errors sooner, set up and use [continuous integration (CI)](/docs/deploy/continuous-integration).
 - A package is incompatible with the dbt version being used. To catch this incompatibility sooner, set up and use [continuous integration (CI)](/docs/deploy/continuous-integration).
From d3bcee988173b9f7b15a850ffb01dab190ff9893 Mon Sep 17 00:00:00 2001
From: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com>
Date: Thu, 21 Dec 2023 13:04:19 -0800
Subject: [PATCH 114/143] Update website/snippets/_cloud-environments-info.md

Co-authored-by: mirnawong1 <89008547+mirnawong1@users.noreply.github.com>
---
 website/snippets/_cloud-environments-info.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/snippets/_cloud-environments-info.md b/website/snippets/_cloud-environments-info.md
index aedec779a58..101fa5d409e 100644
--- a/website/snippets/_cloud-environments-info.md
+++ b/website/snippets/_cloud-environments-info.md
@@ -46,7 +46,7 @@ dbt Cloud will use the cached copy of your project's Git repo under these circum
 
 - Outages from third-party services (for example, the [dbt package hub](https://hub.getdbt.com/)).
 - Git authentication fails.
-- There are syntax errors in the `packages.yml` file. To catch these errors sooner, set up and use [continuous integration (CI)](/docs/deploy/continuous-integration).
+- There are syntax errors in the `packages.yml` file. You can set up and use [continuous integration (CI)](/docs/deploy/continuous-integration) to find these errors earlier.
 - A package is incompatible with the dbt version being used. To catch this incompatibility sooner, set up and use [continuous integration (CI)](/docs/deploy/continuous-integration).
 
 To enable Git repository caching, select **Account settings** from the gear menu and enable the **Repository caching** option.
From 61630d8d673d9245e634fb88e377a29f60b76d39 Mon Sep 17 00:00:00 2001
From: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com>
Date: Thu, 21 Dec 2023 13:07:46 -0800
Subject: [PATCH 115/143] Update website/snippets/_cloud-environments-info.md

Co-authored-by: mirnawong1 <89008547+mirnawong1@users.noreply.github.com>
---
 website/snippets/_cloud-environments-info.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/snippets/_cloud-environments-info.md b/website/snippets/_cloud-environments-info.md
index 101fa5d409e..4dc5cfee003 100644
--- a/website/snippets/_cloud-environments-info.md
+++ b/website/snippets/_cloud-environments-info.md
@@ -47,7 +47,7 @@ dbt Cloud will use the cached copy of your project's Git repo under these circum
 - Outages from third-party services (for example, the [dbt package hub](https://hub.getdbt.com/)).
 - Git authentication fails.
 - There are syntax errors in the `packages.yml` file. You can set up and use [continuous integration (CI)](/docs/deploy/continuous-integration) to find these errors earlier.
-- A package is incompatible with the dbt version being used. To catch this incompatibility sooner, set up and use [continuous integration (CI)](/docs/deploy/continuous-integration).
+- If a package doesn't work with the current dbt version. You can set up and use [continuous integration (CI)](/docs/deploy/continuous-integration) to identify this issue sooner.
 
 To enable Git repository caching, select **Account settings** from the gear menu and enable the **Repository caching** option.
From eba18680f6a13fa9ab960af8ec67a9c402098c17 Mon Sep 17 00:00:00 2001
From: Ly Nguyen <107218380+nghi-ly@users.noreply.github.com>
Date: Thu, 21 Dec 2023 13:08:06 -0800
Subject: [PATCH 116/143] Update website/snippets/_cloud-environments-info.md

---
 website/snippets/_cloud-environments-info.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/snippets/_cloud-environments-info.md b/website/snippets/_cloud-environments-info.md
index 4dc5cfee003..6b6eb1c2761 100644
--- a/website/snippets/_cloud-environments-info.md
+++ b/website/snippets/_cloud-environments-info.md
@@ -46,7 +46,7 @@ dbt Cloud will use the cached copy of your project's Git repo under these circum
 
 - Outages from third-party services (for example, the [dbt package hub](https://hub.getdbt.com/)).
 - Git authentication fails.
-- There are syntax errors in the `packages.yml` file. You can set up and use [continuous integration (CI)](/docs/deploy/continuous-integration) to find these errors earlier.
+- There are syntax errors in the `packages.yml` file. You can set up and use [continuous integration (CI)](/docs/deploy/continuous-integration) to find these errors sooner.
 - If a package doesn't work with the current dbt version. You can set up and use [continuous integration (CI)](/docs/deploy/continuous-integration) to identify this issue sooner.
 
 To enable Git repository caching, select **Account settings** from the gear menu and enable the **Repository caching** option.
From fe96ae5f9c3999deabf2c5e91476fe33fd0ad182 Mon Sep 17 00:00:00 2001
From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com>
Date: Thu, 21 Dec 2023 16:21:12 -0500
Subject: [PATCH 117/143] Update website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md

Co-authored-by: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
---
 .../how-we-build-our-metrics/semantic-layer-2-setup.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md
index 275395f6b18..0eb3f26ae2c 100644
--- a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md
+++ b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md
@@ -28,7 +28,7 @@ Next, before you start writing code, you need to install MetricFlow:
 
 - Download MetricFlow as an extension of a dbt adapter from PyPI (dbt Core users only). The MetricFlow is compatible with Python versions 3.8 through 3.11.
 
-  - **Note**, you'll need to manage versioning between dbt Core, your adapter, and MetricFlow.
+  - **Note**: You'll need to manage versioning between dbt Core, your adapter, and MetricFlow.
   - We'll use pip to install MetricFlow and our dbt adapter:
 
 ```shell

From 294499e3df10f7440c5c14355c1eadff2b8b8bc2 Mon Sep 17 00:00:00 2001
From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com>
Date: Thu, 21 Dec 2023 16:21:29 -0500
Subject: [PATCH 118/143] Update website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md

Co-authored-by: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
---
 .../how-we-build-our-metrics/semantic-layer-2-setup.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md
index 0eb3f26ae2c..7c74b69d859 100644
--- a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md
+++ b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md
@@ -19,7 +19,7 @@
 
 
 
-- [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) — MetricFlow commands are embedded in the dbt Cloud CLI. This means you can immediately run them once you install the dbt Cloud CLI. Using dbt Cloud means you won't need to manage versioning — your dbt Cloud account will automatically manage the versioning.
+- [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) — MetricFlow commands are embedded in the dbt Cloud CLI. You can immediately run them once you install the dbt Cloud CLI. Using dbt Cloud means you won't need to manage versioning — your dbt Cloud account will automatically manage the versioning.
 
 - [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) — You can create metrics using MetricFlow in the dbt Cloud IDE. However, support for running MetricFlow commands in the IDE will be available soon.
From e225dad6e8fac7fab7d2d35f87debe4bfda34951 Mon Sep 17 00:00:00 2001
From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com>
Date: Thu, 21 Dec 2023 16:22:15 -0500
Subject: [PATCH 119/143] Update website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md

Co-authored-by: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
---
 .../how-we-build-our-metrics/semantic-layer-2-setup.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md
index 7c74b69d859..5fea4fe8695 100644
--- a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md
+++ b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md
@@ -50,7 +50,7 @@
 git checkout start-here
 ```
 
-For more information, refer to the [MetricFlow commands](/docs/build/metricflow-commands) or a [quickstart guides](/guides) to get more familiar with setting up a dbt project.
+For more information, refer to the [MetricFlow commands](/docs/build/metricflow-commands) or the [quickstart guides](/guides) to get more familiar with setting up a dbt project.
 
 ## Basic commands
 

From 96b59393e8c8256eccd7a1faf3ab3e928a38f1b6 Mon Sep 17 00:00:00 2001
From: mirnawong1
Date: Thu, 21 Dec 2023 16:39:32 -0500
Subject: [PATCH 120/143] fix pagination

---
 .../how-we-build-our-metrics/semantic-layer-1-intro.md         | 2 ++
 .../how-we-build-our-metrics/semantic-layer-2-setup.md         | 1 +
 .../semantic-layer-3-build-semantic-models.md                  | 1 +
 .../how-we-build-our-metrics/semantic-layer-4-build-metrics.md | 1 +
 .../semantic-layer-5-refactor-a-mart.md                        | 1 +
 .../semantic-layer-6-advanced-metrics.md                       | 1 +
 .../how-we-build-our-metrics/semantic-layer-7-conclusion.md    | 1 +
 7 files changed, 8 insertions(+)

diff --git a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-1-intro.md b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-1-intro.md
index ee3d4262882..59bdc41a705 100644
--- a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-1-intro.md
+++ b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-1-intro.md
@@ -2,6 +2,8 @@
 title: "Intro to MetricFlow"
 description: Getting started with the dbt and MetricFlow
 hoverSnippet: Learn how to get started with the dbt and MetricFlow
+pagination_next: "docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup"
+pagination_prev: null
 ---
 
 Flying cars, hoverboards, and true self-service analytics: this is the future we were promised. The first two might still be a few years out, but real self-service analytics is here today. With dbt Cloud's Semantic Layer, you can resolve the tension between accuracy and flexibility that has hampered analytics tools for years, empowering everybody in your organization to explore a shared reality of metrics. Best of all for analytics engineers, building with these new tools will significantly [DRY](https://docs.getdbt.com/terms/dry) up and simplify your codebase. As you'll see, the deep interaction between your dbt models and the Semantic Layer make your dbt project the ideal place to craft your metrics.
diff --git a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md
index 5fea4fe8695..20643391a82 100644
--- a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md
+++ b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md
@@ -2,6 +2,7 @@
 title: "Set up MetricFlow"
 description: Getting started with the dbt and MetricFlow
 hoverSnippet: Learn how to get started with the dbt and MetricFlow
+pagination_next: "docs/best-practices/how-we-build-our-metrics/semantic-layer-3-build-semantic-models"
 ---
 
 ## Getting started
diff --git a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-3-build-semantic-models.md b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-3-build-semantic-models.md
index a2dc55e37ae..3c33c08874c 100644
--- a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-3-build-semantic-models.md
+++ b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-3-build-semantic-models.md
@@ -2,6 +2,7 @@
 title: "Building semantic models"
 description: Getting started with the dbt and MetricFlow
 hoverSnippet: Learn how to get started with the dbt and MetricFlow
+pagination_next: "docs/best-practices/how-we-build-our-metrics/semantic-layer-4-build-metrics"
 ---
 
 ## How to build a semantic model
diff --git a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-4-build-metrics.md b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-4-build-metrics.md
index da83adbdc69..9f7849299b9 100644
--- a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-4-build-metrics.md
+++ b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-4-build-metrics.md
@@ -2,6 +2,7 @@
 title: "Building metrics"
 description: Getting started with the dbt and MetricFlow
 hoverSnippet: Learn how to get started with the dbt and MetricFlow
+pagination_next: "docs/best-practices/how-we-build-our-metrics/semantic-layer-5-refactor-a-mart"
 ---
 
 ## How to build metrics
diff --git a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-5-refactor-a-mart.md b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-5-refactor-a-mart.md
index dfdba2941e9..68b42ee6aa4 100644
--- a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-5-refactor-a-mart.md
+++ b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-5-refactor-a-mart.md
@@ -2,6 +2,7 @@
 title: "Refactor an existing mart"
 description: Getting started with the dbt and MetricFlow
 hoverSnippet: Learn how to get started with the dbt and MetricFlow
+pagination_next: "docs/best-practices/how-we-build-our-metrics/semantic-layer-6-advanced-metrics"
 ---
 
 ## A new approach
diff --git a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-6-advanced-metrics.md b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-6-advanced-metrics.md
index fe7438b5800..92ab444172a 100644
--- a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-6-advanced-metrics.md
+++ b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-6-advanced-metrics.md
@@ -2,6 +2,7 @@
 title: "More advanced metrics"
 description: Getting started with the dbt and MetricFlow
 hoverSnippet: Learn how to get started with the dbt and MetricFlow
+pagination_next: "docs/best-practices/how-we-build-our-metrics/semantic-layer-7-conclusion"
 ---
 
 ## More advanced metric types
diff --git a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-7-conclusion.md b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-7-conclusion.md
index a1062721177..1870b6b77e4 100644
--- a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-7-conclusion.md
+++ b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-7-conclusion.md
@@ -2,6 +2,7 @@
 title: "Best practices"
 description: Getting started with the dbt and MetricFlow
 hoverSnippet: Learn how to get started with the dbt and MetricFlow
+pagination_next: null
 ---
 
 ## Putting it all together

From b75662226b7b48c64cfac86169173aab5b72329e Mon Sep 17 00:00:00 2001
From: Jordan Stein
Date: Thu, 21 Dec 2023 13:40:35 -0800
Subject: [PATCH 121/143] update release notes

---
 .../release-notes/74-Dec-2023/dec-sl-updates.md | 17 +----------------
 1 file changed, 1 insertion(+), 16 deletions(-)

diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
index be8e5e8e61f..7491d17c039 100644
--- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
+++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
@@ -7,7 +7,7 @@ date: 2023-12-22
 ---
 
 The dbt Labs team continues to work on adding new features, fixing bugs, and increasing reliability for the dbt Semantic Layer and MetricFlow.
-Refer to the following updates and fixes for December 2023.
+Refer to the following updates and fixes for December 2023:
 
 ## Bug fixes
 
@@ -20,24 +20,9 @@ The following are updates for the dbt Semantic Layer and MetricFlow:
 - Memory leak — Fixed a memory leak in the JDBC API that would previously lead to intermittent errors when querying it.
 - Data conversion support — Added support for converting various Redshift and Postgres-specific data types. Previously, the driver would throw an error when encountering columns with those types.
 
-**MetricFlow**
-
-- Time offset for nested metrics — Implemented time offset for nested derived and ratio metrics. ([MetricFlow Issue #882](https://github.com/dbt-labs/metricflow/issues/882))
-- SQL column name rendering: — Fixed incorrect SQL column name rendering in `WhereConstraintNode`. ([MetricFlow Issue #908](https://github.com/dbt-labs/metricflow/issues/908))
-- Cumulative metrics query error — Fixed the `Unable To Satisfy Query` error with cumulative metrics in Saved Queries. ([MetricFlow Issue #917](https://github.com/dbt-labs/metricflow/issues/917))
-- Dimension-only query — Fixed a bug in dimension-only queries where the filter column is removed before the filter has been applied. ([MetricFlow Issue #923](https://github.com/dbt-labs/metricflow/issues/923))
-- Where constraint column — Ensured retention of the where constraint column until used for nested derived offset metric queries. ([MetricFlow Issue #930](https://github.com/dbt-labs/metricflow/issues/930))
 
 ## Improvements
 - Deprecation — We deprecated [dbt Metrics and the legacy dbt Semantic Layer](/docs/dbt-versions/release-notes/Dec-2023/legacy-sl), both supported on dbt version 1.5 or lower. This change came into effect on December 15th, 2023.
 - Improved dbt converter tool — The [dbt converter tool](https://github.com/dbt-labs/dbt-converter) can now help automate some of the work in converting from LookML (Looker's modeling language) for those who are migrating. Previously this wasn’t available.
 
-## New features
-
-- Simplified group-by-item requests. We updated the way the MetricFlow query resolver finds queryable dimensions for metrics. The main improvements ares:
-  - If the grain of a time dimension in a query is not specified, then the grain of the requested time dimension is resolved to be the finest grain that is available for the queried metrics. For example, say you have two metrics; revenue which has a weekly grain and orders which has a daily grain. If you query these metrics like this: `dbt sl query --metrics revenue,orders --group-by metric_time` metricflow will automatically query these metrics at a weekly grain.
-
-- Assumes time dimension grain: When using a metric filter, if an ambiguous time dimension doesn't specify the grain, and all used semantic models define this time dimension with the same grain, MetricFlow now automatically assumes the time dimension to be of that grain.
-  - For example, if you have two daily metrics: `revenue` and `users` — you can now query these metrics without specifying the time dimension grain in the filter: `mf query --metrics users,revenue --group-by metric_time --where "{{ TimeDimension('metric_time') }} = '2017-07-30' "`
-

From b0bb375f2883bf58bdf91c3c353d3034ed2b8656 Mon Sep 17 00:00:00 2001
From: mirnawong1
Date: Thu, 21 Dec 2023 16:42:41 -0500
Subject: [PATCH 122/143] fi broken links

---
 .../how-we-build-our-metrics/semantic-layer-1-intro.md         | 2 +-
 .../how-we-build-our-metrics/semantic-layer-2-setup.md         | 2 +-
 .../semantic-layer-3-build-semantic-models.md                  | 2 +-
 .../how-we-build-our-metrics/semantic-layer-4-build-metrics.md | 2 +-
 .../semantic-layer-5-refactor-a-mart.md                        | 2 +-
 .../semantic-layer-6-advanced-metrics.md                       | 2 +-
 6 files changed, 6 insertions(+), 6 deletions(-)

diff --git a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-1-intro.md b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-1-intro.md
index 59bdc41a705..e50542a446c 100644
--- a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-1-intro.md
+++ b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-1-intro.md
@@ -2,7 +2,7 @@
 title: "Intro to MetricFlow"
 description: Getting started with the dbt and MetricFlow
 hoverSnippet: Learn how to get started with the dbt and MetricFlow
-pagination_next: "docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup"
+pagination_next: "
"best-practices/how-we-build-our-metrics/semantic-layer-2-setup" pagination_prev: null --- diff --git a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md index 20643391a82..470445891dc 100644 --- a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md +++ b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md @@ -2,7 +2,7 @@ title: "Set up MetricFlow" description: Getting started with the dbt and MetricFlow hoverSnippet: Learn how to get started with the dbt and MetricFlow -pagination_next: "docs/best-practices/how-we-build-our-metrics/semantic-layer-3-build-semantic-models" +pagination_next: "best-practices/how-we-build-our-metrics/semantic-layer-3-build-semantic-models" --- ## Getting started diff --git a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-3-build-semantic-models.md b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-3-build-semantic-models.md index 3c33c08874c..9c710b286ef 100644 --- a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-3-build-semantic-models.md +++ b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-3-build-semantic-models.md @@ -2,7 +2,7 @@ title: "Building semantic models" description: Getting started with the dbt and MetricFlow hoverSnippet: Learn how to get started with the dbt and MetricFlow -pagination_next: "docs/best-practices/how-we-build-our-metrics/semantic-layer-4-build-metrics" +pagination_next: "best-practices/how-we-build-our-metrics/semantic-layer-4-build-metrics" --- ## How to build a semantic model diff --git a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-4-build-metrics.md b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-4-build-metrics.md index 9f7849299b9..003eff9de40 100644 --- 
a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-4-build-metrics.md +++ b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-4-build-metrics.md @@ -2,7 +2,7 @@ title: "Building metrics" description: Getting started with the dbt and MetricFlow hoverSnippet: Learn how to get started with the dbt and MetricFlow -pagination_next: "docs/best-practices/how-we-build-our-metrics/semantic-layer-5-refactor-a-mart" +pagination_next: "best-practices/how-we-build-our-metrics/semantic-layer-5-refactor-a-mart" --- ## How to build metrics diff --git a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-5-refactor-a-mart.md b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-5-refactor-a-mart.md index 68b42ee6aa4..9ae80cbcd29 100644 --- a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-5-refactor-a-mart.md +++ b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-5-refactor-a-mart.md @@ -2,7 +2,7 @@ title: "Refactor an existing mart" description: Getting started with the dbt and MetricFlow hoverSnippet: Learn how to get started with the dbt and MetricFlow -pagination_next: "docs/best-practices/how-we-build-our-metrics/semantic-layer-6-advanced-metrics" +pagination_next: "best-practices/how-we-build-our-metrics/semantic-layer-6-advanced-metrics" --- ## A new approach diff --git a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-6-advanced-metrics.md b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-6-advanced-metrics.md index 92ab444172a..e5c6e452dac 100644 --- a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-6-advanced-metrics.md +++ b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-6-advanced-metrics.md @@ -2,7 +2,7 @@ title: "More advanced metrics" description: Getting started with the dbt and MetricFlow hoverSnippet: Learn how to get started with the dbt and MetricFlow -pagination_next: 
"docs/best-practices/how-we-build-our-metrics/semantic-layer-7-conclusion" +pagination_next: "best-practices/how-we-build-our-metrics/semantic-layer-7-conclusion" --- ## More advanced metric types From 03ec70050b076a4cbcb3c58690b1c6b247129328 Mon Sep 17 00:00:00 2001 From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> Date: Thu, 21 Dec 2023 16:49:37 -0500 Subject: [PATCH 123/143] Update website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md --- .../dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md index 7491d17c039..598908dc921 100644 --- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md +++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md @@ -1,5 +1,5 @@ --- -title: "dbt Semantic Layer and MetricFlow updates for December 2023" +title: "dbt Semantic Layer updates for December 2023" description: "December 2023: Enhanced Tableau integration, BIGINT support, LookML to MetricFlow conversion, and deprecation of legacy features." 
sidebar_label: "Update and fixes: dbt Semantic Layer and MetricFlow" sidebar_position: 08 From 31339d2dbd19d93d33a552691b6187812fdb3174 Mon Sep 17 00:00:00 2001 From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> Date: Thu, 21 Dec 2023 16:49:52 -0500 Subject: [PATCH 124/143] Update website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md --- .../dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md index 598908dc921..63ecd37fb4b 100644 --- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md +++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md @@ -1,7 +1,7 @@ --- title: "dbt Semantic Layer updates for December 2023" description: "December 2023: Enhanced Tableau integration, BIGINT support, LookML to MetricFlow conversion, and deprecation of legacy features." 
-sidebar_label: "Update and fixes: dbt Semantic Layer and MetricFlow" +sidebar_label: "Update and fixes: dbt Semantic Layer" sidebar_position: 08 date: 2023-12-22 --- From 4f9e6c061010edf298a1eb631a0779ebb0b6e961 Mon Sep 17 00:00:00 2001 From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> Date: Thu, 21 Dec 2023 16:50:04 -0500 Subject: [PATCH 125/143] Update website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md --- .../dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md index 63ecd37fb4b..7213962b58f 100644 --- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md +++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md @@ -5,7 +5,7 @@ sidebar_label: "Update and fixes: dbt Semantic Layer" sidebar_position: 08 date: 2023-12-22 --- -The dbt Labs team continues to work on adding new features, fixing bugs, and increasing reliability for the dbt Semantic Layer and MetricFlow. +The dbt Labs team continues to work on adding new features, fixing bugs, and increasing reliability for the dbt Semantic Layer. 
Refer to the following updates and fixes for December 2023: From 2811e4353e0d1a25e00f4c99a6f5616b7d4c3ebb Mon Sep 17 00:00:00 2001 From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> Date: Thu, 21 Dec 2023 16:50:16 -0500 Subject: [PATCH 126/143] Update website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md --- .../dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md index 7213962b58f..7bb44b44724 100644 --- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md +++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md @@ -11,7 +11,7 @@ Refer to the following updates and fixes for December 2023: ## Bug fixes -The following are updates for the dbt Semantic Layer and MetricFlow: +The following are updates for the dbt Semantic Layer: **dbt Semantic Layer** From 0feec8522c89b80c327211a479c7b0c4bbc7fe6d Mon Sep 17 00:00:00 2001 From: Pat Kearns Date: Fri, 22 Dec 2023 15:56:30 +1100 Subject: [PATCH 127/143] Update website/docs/docs/cloud/connect-data-platform/connect-snowflake.md Co-authored-by: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> --- .../docs/docs/cloud/connect-data-platform/connect-snowflake.md | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md b/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md index 34b69f56c27..68dbd2f8a42 100644 --- a/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md +++ b/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md @@ -42,7 +42,8 @@ alter user jsmith set rsa_public_key='MIIBIjANBgkqh...'; ``` 2. 
Finally, set the **Private Key** and **Private Key Passphrase** fields in the **Credentials** page to finish configuring dbt Cloud to authenticate with Snowflake using a key pair. - **Note:** Since dbt 0.16.0, unencrypted private keys are allowed. Only add the passphrase if necessary. + +**Note:** From dbt version 0.16.0 onwards, unencrypted private keys are permitted. Use a passphrase only if needed. Starting from dbt 1.5.0, you have the option to use a private_key string instead of a private_key_path. The private_key string should be in either Base64-encoded DER format, representing the key bytes, or a plain-text PEM format. Refer to Snowflake documentation for more info on how they generate the key. 4. To successfully fill in the Private Key field, you **must** include commented lines. If you're receiving a `Could not deserialize key data` or `JWT token` error, refer to [Troubleshooting](#troubleshooting) for more info. From 00c20182e7a46dd83ed7fa3268d4a97715f23d8b Mon Sep 17 00:00:00 2001 From: Pat Kearns Date: Fri, 22 Dec 2023 15:56:51 +1100 Subject: [PATCH 128/143] Update website/docs/docs/cloud/connect-data-platform/connect-snowflake.md Co-authored-by: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> --- .../docs/docs/cloud/connect-data-platform/connect-snowflake.md | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md b/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md index 68dbd2f8a42..7dd4c86b59b 100644 --- a/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md +++ b/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md @@ -44,7 +44,8 @@ alter user jsmith set rsa_public_key='MIIBIjANBgkqh...'; 2. Finally, set the **Private Key** and **Private Key Passphrase** fields in the **Credentials** page to finish configuring dbt Cloud to authenticate with Snowflake using a key pair. 
**Note:** From dbt version 0.16.0 onwards, unencrypted private keys are permitted. Use a passphrase only if needed. - Starting from dbt 1.5.0, you have the option to use a private_key string instead of a private_key_path. The private_key string should be in either Base64-encoded DER format, representing the key bytes, or a plain-text PEM format. Refer to Snowflake documentation for more info on how they generate the key. +As of dbt version 1.5.0, you can use a `private_key` string in place of `private_key_path`. This `private_key` string can be either Base64-encoded DER format for the key bytes or plain-text PEM format. For more details on key generation, refer to the [Snowflake documentation](https://community.snowflake.com/s/article/How-to-configure-Snowflake-key-pair-authentication-fields-in-dbt-connection). + 4. To successfully fill in the Private Key field, you **must** include commented lines. If you're receiving a `Could not deserialize key data` or `JWT token` error, refer to [Troubleshooting](#troubleshooting) for more info. 
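The key pair workflow referenced in the patches above can be sketched with OpenSSL. This is an illustrative sketch, not part of the patch series: the filenames are placeholders, and the unencrypted `-nocrypt` variant is shown only for brevity — use an encrypted key plus a passphrase if your security policy requires it.

```shell
# Generate an unencrypted private key in PKCS#8 format
# (swap -nocrypt for a cipher option and passphrase if encryption is required).
openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -nocrypt -out rsa_key.p8

# Derive the matching public key; its body (between the BEGIN/END lines)
# is what gets registered in Snowflake via ALTER USER ... SET RSA_PUBLIC_KEY='...'.
openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub
```

The private key file (`rsa_key.p8` here) is what the **Private Key** field or `private_key_path` setting points at.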
From ff9b70f70c493d50f650315cbe7758e0b789f40f Mon Sep 17 00:00:00 2001 From: Pat Kearns Date: Fri, 22 Dec 2023 15:57:08 +1100 Subject: [PATCH 129/143] Update website/docs/docs/core/connect-data-platform/snowflake-setup.md Co-authored-by: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> --- website/docs/docs/core/connect-data-platform/snowflake-setup.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/core/connect-data-platform/snowflake-setup.md b/website/docs/docs/core/connect-data-platform/snowflake-setup.md index d9d4aa6f3cb..8d42ec523f2 100644 --- a/website/docs/docs/core/connect-data-platform/snowflake-setup.md +++ b/website/docs/docs/core/connect-data-platform/snowflake-setup.md @@ -98,7 +98,7 @@ Along with adding the `authenticator` parameter, be sure to run `alter account s ### Key Pair Authentication -To use key pair authentication, omit a `password` and instead provide a `private_key_path` and, optionally, a `private_key_passphrase`. +To use key pair authentication, skip the `password` and provide a `private_key_path`. If needed, you can also add a `private_key_passphrase`. **Note:** Versions of dbt before 0.16.0 required that private keys were encrypted and a `private_key_passphrase` was provided. Since dbt 0.16.0, unencrypted private keys are allowed. Only add the passphrase if necessary. Starting from [dbt v1.5.0](/docs/dbt-versions/core), you have the option to use a `private_key` string instead of a `private_key_path`. The `private_key` string should be in either Base64-encoded DER format, representing the key bytes, or a plain-text PEM format. Refer to [Snowflake documentation](https://docs.snowflake.com/developer-guide/python-connector/python-connector-example#using-key-pair-authentication-key-pair-rotation) for more info on how they generate the key. 
From 868170e34fd0b3cf01399d18cd26dda6e6e2f389 Mon Sep 17 00:00:00 2001 From: Pat Kearns Date: Fri, 22 Dec 2023 15:57:22 +1100 Subject: [PATCH 130/143] Update website/docs/docs/core/connect-data-platform/snowflake-setup.md Co-authored-by: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> --- website/docs/docs/core/connect-data-platform/snowflake-setup.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/core/connect-data-platform/snowflake-setup.md b/website/docs/docs/core/connect-data-platform/snowflake-setup.md index 8d42ec523f2..ce021f8013c 100644 --- a/website/docs/docs/core/connect-data-platform/snowflake-setup.md +++ b/website/docs/docs/core/connect-data-platform/snowflake-setup.md @@ -99,7 +99,7 @@ Along with adding the `authenticator` parameter, be sure to run `alter account s ### Key Pair Authentication To use key pair authentication, skip the `password` and provide a `private_key_path`. If needed, you can also add a `private_key_passphrase`. -**Note:** Versions of dbt before 0.16.0 required that private keys were encrypted and a `private_key_passphrase` was provided. Since dbt 0.16.0, unencrypted private keys are allowed. Only add the passphrase if necessary. +**Note**: In dbt versions before 0.16.0, private keys needed encryption and a `private_key_passphrase`. From dbt version 0.16.0 onwards, unencrypted private keys are accepted, so add a passphrase only if necessary. Starting from [dbt v1.5.0](/docs/dbt-versions/core), you have the option to use a `private_key` string instead of a `private_key_path`. The `private_key` string should be in either Base64-encoded DER format, representing the key bytes, or a plain-text PEM format. Refer to [Snowflake documentation](https://docs.snowflake.com/developer-guide/python-connector/python-connector-example#using-key-pair-authentication-key-pair-rotation) for more info on how they generate the key. 
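The two key pair options the patches above describe for dbt Core — `private_key_path` versus an inline `private_key` string (dbt v1.5.0+) — can be sketched as a `profiles.yml` target. This is a hypothetical example, not from the patches: the account, user, role, warehouse, and path values are all placeholders.

```yaml
my-snowflake-db:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: my_account          # placeholder
      user: jsmith                 # placeholder
      role: transformer
      database: analytics
      warehouse: transforming
      schema: dbt_jsmith
      threads: 4
      # Option 1: point at a key file; add the passphrase only if the key is encrypted
      private_key_path: /path/to/rsa_key.p8
      # private_key_passphrase: my_passphrase
      # Option 2 (dbt v1.5.0+): inline the key instead of a path,
      # as Base64-encoded DER or plain-text PEM
      # private_key: "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----"
```

Note that `password` is omitted entirely when key pair authentication is used.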
From f7dc8306ea5b936fa4d4d44b4a2164cf2200ee35 Mon Sep 17 00:00:00 2001 From: sachinthakur96 Date: Fri, 22 Dec 2023 13:01:02 +0530 Subject: [PATCH 131/143] Adding Oauth access --- website/docs/docs/core/connect-data-platform/vertica-setup.md | 3 +++ 1 file changed, 3 insertions(+) diff --git a/website/docs/docs/core/connect-data-platform/vertica-setup.md b/website/docs/docs/core/connect-data-platform/vertica-setup.md index 525e1be86fc..b7bb85a537b 100644 --- a/website/docs/docs/core/connect-data-platform/vertica-setup.md +++ b/website/docs/docs/core/connect-data-platform/vertica-setup.md @@ -68,10 +68,12 @@ your-profile: username: [your username] password: [your password] database: [database name] + oauth_access_token: [access token] schema: [dbt schema] connection_load_balance: True backup_server_node: [list of backup hostnames or IPs] retries: [1 or more] + threads: [1 or more] target: dev ``` @@ -92,6 +94,7 @@ your-profile: | username | The username to use to connect to the server. | Yes | None | dbadmin| password |The password to use for authenticating to the server. |Yes|None|my_password| database |The name of the database running on the server. |Yes | None | my_db | +| oauth_access_token | To authenticate via OAuth, provide an OAuth Access Token that authorizes a user to the database. | No | "" | Default: "" | schema| The schema to build models into.| No| None |VMart| connection_load_balance| A Boolean value that indicates whether the connection can be redirected to a host in the database other than host.| No| True |True| backup_server_node| List of hosts to connect to if the primary host specified in the connection (host, port) is unreachable. Each item in the list should be either a host string (using default port 5433) or a (host, port) tuple. 
A host can be a host name or an IP address.| No| None |['123.123.123.123','www.abc.com',('123.123.123.124',5433)]| From 30cebdc9b3509012587232311edd918f47fa74a9 Mon Sep 17 00:00:00 2001 From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> Date: Fri, 22 Dec 2023 08:01:03 -0500 Subject: [PATCH 132/143] Update keyboard-shortcuts.md add shift option click option per slack convo with user https://getdbt.slack.com/archives/C03SAHKKG2Z/p1703249576071509?thread_ts=1703191150.925439&cid=C03SAHKKG2Z --- .../docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md b/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md index 121cab68ce7..d00a5a7d939 100644 --- a/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md +++ b/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md @@ -13,14 +13,14 @@ Use this dbt Cloud IDE page to help you quickly reference some common operation |--------|----------------|------------------| | View a full list of editor shortcuts | Fn-F1 | Fn-F1 | | Select a file to open | Command-O | Control-O | -| Open the command palette to invoke dbt commands and actions | Command-P or Command-Shift-P | Control-P or Control-Shift-P | -| Multi-edit by selecting multiple lines | Option-click or Shift-Option-Command | Hold Alt and click | +| Close currently active editor tab | Option-W | Alt-W | | Preview code | Command-Enter | Control-Enter | | Compile code | Command-Shift-Enter | Control-Shift-Enter | -| Reveal a list of dbt functions | Enter two underscores `__` | Enter two underscores `__` | +| Reveal a list of dbt functions in the editor | Enter two underscores `__` | Enter two underscores `__` | +| Open the command palette to invoke dbt commands and actions | Command-P
<br /> Command-Shift-P | Control-P <br />
Control-Shift-P | +| Multi-edit in the editor by selecting multiple lines | Option-Click
<br /> Shift-Option-Command <br />
Shift-Option-Click | Hold Alt and Click | | Toggle open the [Invocation history drawer](/docs/cloud/dbt-cloud-ide/ide-user-interface#invocation-history) located on the bottom of the IDE. | Control-backtick (or Control + `) | Control-backtick (or Ctrl + `) | -| Add a block comment to selected code. SQL files will use the Jinja syntax `({# #})` rather than the SQL one `(/* */)`.
<br /> <br />
Markdown files will use the Markdown syntax `()` | Command-Option-/ | Control-Alt-/ | -| Close the currently active editor tab | Option-W | Alt-W | +| Add a block comment to the selected code. SQL files will use the Jinja syntax `({# #})` rather than the SQL one `(/* */)`.
<br /> <br />
Markdown files will use the Markdown syntax `()` | Command-Option-/ | Control-Alt-/ | ## Related docs From 2c47b81211b84381e0c710c4d945733f88da09bc Mon Sep 17 00:00:00 2001 From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> Date: Fri, 22 Dec 2023 08:24:49 -0500 Subject: [PATCH 133/143] Update keyboard-shortcuts.md --- website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md b/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md index d00a5a7d939..9332a116de0 100644 --- a/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md +++ b/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md @@ -17,8 +17,8 @@ Use this dbt Cloud IDE page to help you quickly reference some common operation | Preview code | Command-Enter | Control-Enter | | Compile code | Command-Shift-Enter | Control-Shift-Enter | | Reveal a list of dbt functions in the editor | Enter two underscores `__` | Enter two underscores `__` | -| Open the command palette to invoke dbt commands and actions | Command-P
<br /> Command-Shift-P | Control-P <br />
Control-Shift-P | -| Multi-edit in the editor by selecting multiple lines | Option-Click
<br /> Shift-Option-Command <br />
Shift-Option-Click | Hold Alt and Click | +| Open the command palette to invoke dbt commands and actions | - Command-P
<br /> - Command-Shift-P | Control-P <br />
Control-Shift-P | +| Multi-edit in the editor by selecting multiple lines | - Option-Click
<br /> - Shift-Option-Command <br />
- Shift-Option-Click | Hold Alt and Click | | Toggle open the [Invocation history drawer](/docs/cloud/dbt-cloud-ide/ide-user-interface#invocation-history) located on the bottom of the IDE. | Control-backtick (or Control + `) | Control-backtick (or Ctrl + `) | | Add a block comment to the selected code. SQL files will use the Jinja syntax `({# #})` rather than the SQL one `(/* */)`.
<br /> <br />
Markdown files will use the Markdown syntax `()` | Command-Option-/ | Control-Alt-/ | From f48745c5e387b44ab151781c5b51defcafba6437 Mon Sep 17 00:00:00 2001 From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> Date: Fri, 22 Dec 2023 08:40:35 -0500 Subject: [PATCH 134/143] Update website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md --- website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md b/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md index 9332a116de0..daf417dc4cf 100644 --- a/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md +++ b/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md @@ -18,7 +18,7 @@ Use this dbt Cloud IDE page to help you quickly reference some common operation | Compile code | Command-Shift-Enter | Control-Shift-Enter | | Reveal a list of dbt functions in the editor | Enter two underscores `__` | Enter two underscores `__` | | Open the command palette to invoke dbt commands and actions | - Command-P
<br /> - Command-Shift-P | Control-P <br />
Control-Shift-P | -| Multi-edit in the editor by selecting multiple lines | - Option-Click
<br /> - Shift-Option-Command <br />
- Shift-Option-Click | Hold Alt and Click | +| Multi-edit in the editor by selecting multiple lines | Option-Click / Shift-Option-Command / Shift-Option-Click | Hold Alt and Click | | Toggle open the [Invocation history drawer](/docs/cloud/dbt-cloud-ide/ide-user-interface#invocation-history) located on the bottom of the IDE. | Control-backtick (or Control + `) | Control-backtick (or Ctrl + `) | | Add a block comment to the selected code. SQL files will use the Jinja syntax `({# #})` rather than the SQL one `(/* */)`.
<br /> <br />
Markdown files will use the Markdown syntax `()` | Command-Option-/ | Control-Alt-/ | From f87032929ada21fa0014914e7b860c7f736088ec Mon Sep 17 00:00:00 2001 From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> Date: Fri, 22 Dec 2023 08:41:15 -0500 Subject: [PATCH 135/143] Update website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md --- website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md b/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md index daf417dc4cf..1e847e0a4f2 100644 --- a/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md +++ b/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md @@ -17,7 +17,7 @@ Use this dbt Cloud IDE page to help you quickly reference some common operation | Preview code | Command-Enter | Control-Enter | | Compile code | Command-Shift-Enter | Control-Shift-Enter | | Reveal a list of dbt functions in the editor | Enter two underscores `__` | Enter two underscores `__` | -| Open the command palette to invoke dbt commands and actions | - Command-P
<br /> - Command-Shift-P | Control-P <br />
Control-Shift-P | +| Open the command palette to invoke dbt commands and actions | Command-P / Command-Shift-P | Control-P / Control-Shift-P | | Multi-edit in the editor by selecting multiple lines | Option-Click / Shift-Option-Command / Shift-Option-Click | Hold Alt and Click | | Toggle open the [Invocation history drawer](/docs/cloud/dbt-cloud-ide/ide-user-interface#invocation-history) located on the bottom of the IDE. | Control-backtick (or Control + `) | Control-backtick (or Ctrl + `) | | Add a block comment to the selected code. SQL files will use the Jinja syntax `({# #})` rather than the SQL one `(/* */)`.
<br /> <br />
Markdown files will use the Markdown syntax `()` | Command-Option-/ | Control-Alt-/ | From 51b5a4ce0db9fe450104e39ae5d934ac21eebea6 Mon Sep 17 00:00:00 2001 From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> Date: Fri, 22 Dec 2023 09:09:05 -0500 Subject: [PATCH 136/143] Update website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md --- website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md b/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md index 1e847e0a4f2..de456e52655 100644 --- a/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md +++ b/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md @@ -19,7 +19,7 @@ Use this dbt Cloud IDE page to help you quickly reference some common operation | Reveal a list of dbt functions in the editor | Enter two underscores `__` | Enter two underscores `__` | | Open the command palette to invoke dbt commands and actions | Command-P / Command-Shift-P | Control-P / Control-Shift-P | | Multi-edit in the editor by selecting multiple lines | Option-Click / Shift-Option-Command / Shift-Option-Click | Hold Alt and Click | -| Toggle open the [Invocation history drawer](/docs/cloud/dbt-cloud-ide/ide-user-interface#invocation-history) located on the bottom of the IDE. | Control-backtick (or Control + `) | Control-backtick (or Ctrl + `) | +| Open the [Invocation history drawer](/docs/cloud/dbt-cloud-ide/ide-user-interface#invocation-history) located at the bottom of the IDE. | Control-backtick (or Control + `) | Control-backtick (or Ctrl + `) | | Add a block comment to the selected code. SQL files will use the Jinja syntax `({# #})` rather than the SQL one `(/* */)`.
<br /> <br />
Markdown files will use the Markdown syntax `()` | Command-Option-/ | Control-Alt-/ | ## Related docs From 45e719247438a25a30d78287057b071308305012 Mon Sep 17 00:00:00 2001 From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> Date: Fri, 22 Dec 2023 09:16:06 -0500 Subject: [PATCH 137/143] Update website/docs/docs/core/connect-data-platform/vertica-setup.md --- website/docs/docs/core/connect-data-platform/vertica-setup.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/core/connect-data-platform/vertica-setup.md b/website/docs/docs/core/connect-data-platform/vertica-setup.md index 113b71c05d1..8e499d68b3e 100644 --- a/website/docs/docs/core/connect-data-platform/vertica-setup.md +++ b/website/docs/docs/core/connect-data-platform/vertica-setup.md @@ -6,7 +6,7 @@ meta: authors: 'Vertica (Former authors: Matthew Carter, Andy Regan, Andrew Hedengren)' github_repo: 'vertica/dbt-vertica' pypi_package: 'dbt-vertica' - min_core_version: 'v1.7.0 and newer' + min_core_version: 'v1.7.0' cloud_support: 'Not Supported' min_supported_version: 'Vertica 23.4.0' slack_channel_name: 'n/a' From 13d1fcf10cbf7c81218a6fafaf7f11b146c25616 Mon Sep 17 00:00:00 2001 From: Amy Chen Date: Fri, 22 Dec 2023 09:18:07 -0500 Subject: [PATCH 138/143] update author --- website/blog/authors.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/blog/authors.yml b/website/blog/authors.yml index 82cc300bdc8..a3548575b6e 100644 --- a/website/blog/authors.yml +++ b/website/blog/authors.yml @@ -1,6 +1,6 @@ amy_chen: image_url: /img/blog/authors/achen.png - job_title: Product Partnerships Manager + job_title: Product Ecosystem Manager links: - icon: fa-linkedin url: https://www.linkedin.com/in/yuanamychen/ From 28e9d4c3d394566e66b65c9ca46f93fc1a30f111 Mon Sep 17 00:00:00 2001 From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> Date: Fri, 22 Dec 2023 09:20:09 -0500 Subject: [PATCH 139/143] Update 
 website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md

---
 website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md b/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md
index de456e52655..61fe47a235a 100644
--- a/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md
+++ b/website/docs/docs/cloud/dbt-cloud-ide/keyboard-shortcuts.md
@@ -19,7 +19,7 @@ Use this dbt Cloud IDE page to help you quickly reference some common operation
 | Reveal a list of dbt functions in the editor | Enter two underscores `__` | Enter two underscores `__` |
 | Open the command palette to invoke dbt commands and actions | Command-P / Command-Shift-P | Control-P / Control-Shift-P |
 | Multi-edit in the editor by selecting multiple lines | Option-Click / Shift-Option-Command / Shift-Option-Click | Hold Alt and Click |
-| Open the [Invocation history drawer](/docs/cloud/dbt-cloud-ide/ide-user-interface#invocation-history) located at the bottom of the IDE. | Control-backtick (or Control + `) | Control-backtick (or Ctrl + `) |
+| Open the [**Invocation History Drawer**](/docs/cloud/dbt-cloud-ide/ide-user-interface#invocation-history) located at the bottom of the IDE. | Control-backtick (or Control + `) | Control-backtick (or Ctrl + `) |
 | Add a block comment to the selected code. SQL files will use the Jinja syntax `({# #})` rather than the SQL one `(/* */)`.

 Markdown files will use the Markdown syntax `()` | Command-Option-/ | Control-Alt-/ |

 ## Related docs

From cc0ca46db3c792b9ecc5bdab96b380214eeba4cb Mon Sep 17 00:00:00 2001
From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com>
Date: Fri, 22 Dec 2023 09:34:15 -0500
Subject: [PATCH 140/143] Update dec-sl-updates.md

---
 .../release-notes/74-Dec-2023/dec-sl-updates.md | 8 +-------
 1 file changed, 1 insertion(+), 7 deletions(-)

diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
index 7bb44b44724..401b43fb333 100644
--- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
+++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/dec-sl-updates.md
@@ -5,16 +5,10 @@ sidebar_label: "Update and fixes: dbt Semantic Layer"
 sidebar_position: 08
 date: 2023-12-22
 ---
-The dbt Labs team continues to work on adding new features, fixing bugs, and increasing reliability for the dbt Semantic Layer.
-
-Refer to the following updates and fixes for December 2023:
+The dbt Labs team continues to work on adding new features, fixing bugs, and increasing reliability for the dbt Semantic Layer. The following list explains the updates and fixes for December 2023 in more detail.
 
 ## Bug fixes
 
-The following are updates for the dbt Semantic Layer:
-
-**dbt Semantic Layer**
-
 - Tableau integration — The dbt Semantic Layer integration with Tableau now supports queries that resolve to a "NOT IN" clause. This applies to using "exclude" in the filtering user interface. Previously it wasn’t supported.
 - `BIGINT` support — The dbt Semantic Layer can now support `BIGINT` values with precision greater than 18. Previously it would return an error.
 - Memory leak — Fixed a memory leak in the JDBC API that would previously lead to intermittent errors when querying it.
From f5a2a3ed964378917d783f297eee0ae953f76101 Mon Sep 17 00:00:00 2001
From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
Date: Fri, 22 Dec 2023 11:39:57 -0500
Subject: [PATCH 141/143] Update
 website/docs/docs/cloud/connect-data-platform/connect-snowflake.md

---
 .../docs/docs/cloud/connect-data-platform/connect-snowflake.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md b/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md
index 7dd4c86b59b..eb6aba0c260 100644
--- a/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md
+++ b/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md
@@ -47,7 +47,7 @@ alter user jsmith set rsa_public_key='MIIBIjANBgkqh...';
 
 As of dbt version 1.5.0, you can use a `private_key` string in place of `private_key_path`. This `private_key` string can be either Base64-encoded DER format for the key bytes or plain-text PEM format. For more details on key generation, refer to the [Snowflake documentation](https://community.snowflake.com/s/article/How-to-configure-Snowflake-key-pair-authentication-fields-in-dbt-connection).
 
-4. To successfully fill in the Private Key field, you **must** include commented lines. If you're receiving a `Could not deserialize key data` or `JWT token` error, refer to [Troubleshooting](#troubleshooting) for more info.
+4. To successfully fill in the Private Key field, you _must_ include commented lines. If you receive a `Could not deserialize key data` or `JWT token` error, refer to [Troubleshooting](#troubleshooting) for more info.
 **Example:**

From eada5eed7173880c28aea88529bb31f5a8a908b0 Mon Sep 17 00:00:00 2001
From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
Date: Fri, 22 Dec 2023 11:42:33 -0500
Subject: [PATCH 142/143] Update
 website/docs/docs/core/connect-data-platform/snowflake-setup.md

---
 website/docs/docs/core/connect-data-platform/snowflake-setup.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/docs/core/connect-data-platform/snowflake-setup.md b/website/docs/docs/core/connect-data-platform/snowflake-setup.md
index ce021f8013c..2ab5e64e36a 100644
--- a/website/docs/docs/core/connect-data-platform/snowflake-setup.md
+++ b/website/docs/docs/core/connect-data-platform/snowflake-setup.md
@@ -99,7 +99,7 @@ Along with adding the `authenticator` parameter, be sure to run `alter account s
 ### Key Pair Authentication
 
 To use key pair authentication, skip the `password` and provide a `private_key_path`. If needed, you can also add a `private_key_passphrase`.
-**Note**: In dbt versions before 0.16.0, private keys needed encryption and a `private_key_passphrase`. From dbt version 0.16.0 onwards, unencrypted private keys are accepted, so add a passphrase only if necessary.
+**Note**: Unencrypted private keys are accepted, so add a passphrase only if necessary.
 
 Starting from [dbt v1.5.0](/docs/dbt-versions/core), you have the option to use a `private_key` string instead of a `private_key_path`. The `private_key` string should be in either Base64-encoded DER format, representing the key bytes, or a plain-text PEM format. Refer to [Snowflake documentation](https://docs.snowflake.com/developer-guide/python-connector/python-connector-example#using-key-pair-authentication-key-pair-rotation) for more info on how they generate the key.
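
[Reviewer note] Patch 142 documents the `private_key_path` and `private_key_passphrase` fields for Snowflake key pair auth. A minimal `profiles.yml` target using those fields might look like the sketch below; the account, user, database, warehouse, and path values are placeholder assumptions, not values from the patch:

```yaml
my_snowflake_profile:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: abc12345.us-east-1   # placeholder account locator
      user: dbt_user                # placeholder user
      role: transformer
      database: analytics
      warehouse: transforming
      schema: dbt_dev
      threads: 4
      # Key pair auth: skip `password` and point at the key instead.
      private_key_path: /path/to/rsa_key.p8
      private_key_passphrase: "{{ env_var('SNOWFLAKE_KEY_PASSPHRASE') }}"
```

Per the note this patch rewrites, the passphrase line can be dropped entirely when the key is unencrypted.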
From bbf7027add290376b3b3a1323e7ab64bfa03f366 Mon Sep 17 00:00:00 2001
From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
Date: Fri, 22 Dec 2023 11:43:20 -0500
Subject: [PATCH 143/143] Update
 website/docs/docs/cloud/connect-data-platform/connect-snowflake.md

---
 .../docs/docs/cloud/connect-data-platform/connect-snowflake.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md b/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md
index eb6aba0c260..c265529fb49 100644
--- a/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md
+++ b/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md
@@ -43,7 +43,7 @@ alter user jsmith set rsa_public_key='MIIBIjANBgkqh...';
 
 2. Finally, set the **Private Key** and **Private Key Passphrase** fields in the **Credentials** page to finish configuring dbt Cloud to authenticate with Snowflake using a key pair.
 
-**Note:** From dbt version 0.16.0 onwards, unencrypted private keys are permitted. Use a passphrase only if needed.
+**Note:** Unencrypted private keys are permitted. Use a passphrase only if needed.
 
 As of dbt version 1.5.0, you can use a `private_key` string in place of `private_key_path`. This `private_key` string can be either Base64-encoded DER format for the key bytes or plain-text PEM format. For more details on key generation, refer to the [Snowflake documentation](https://community.snowflake.com/s/article/How-to-configure-Snowflake-key-pair-authentication-fields-in-dbt-connection).
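
[Reviewer note] The note touched by patches 141–143 says `private_key` accepts either Base64-encoded DER or plain-text PEM. For an unencrypted PKCS#8 key, the PEM body is itself the Base64 encoding of the DER bytes, so converting between the two forms needs only the standard library. A sketch (the function name is mine, not from dbt or Snowflake):

```python
import base64
import re


def pem_to_der_b64(pem_text: str) -> str:
    """Collapse an unencrypted PKCS#8 PEM key into the single Base64 DER
    string form: strip the BEGIN/END armor lines, then join the body with
    all whitespace removed. The body of an unencrypted PKCS#8 PEM is
    already the Base64 of the DER bytes, so no re-encoding is needed."""
    body = re.sub(r"-----(BEGIN|END) PRIVATE KEY-----", "", pem_text)
    der_b64 = "".join(body.split())  # drop newlines and spaces
    # Sanity check that the result is still valid Base64.
    base64.b64decode(der_b64, validate=True)
    return der_b64
```

This only applies to unencrypted `PRIVATE KEY` blocks; an encrypted key (`ENCRYPTED PRIVATE KEY`) would need a real crypto library to decrypt first.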