                        From d2dadb1cb2eedabdf6e2bf0ccb9e38b552f377f8 Mon Sep 17 00:00:00 2001
                        From: mirnawong1
                        Date: Thu, 15 Jun 2023 14:32:18 +0100
                        Subject: [PATCH 01/25] clarify setup page content/headers as part of
                         references, changing setup pages and clarifying content so its more
                         understandable and scalable.

                        ---
                         .../docs/core/connect-data-platform/bigquery-setup.md | 10 +++-------
                         1 file changed, 3 insertions(+), 7 deletions(-)

                        diff --git a/website/docs/docs/core/connect-data-platform/bigquery-setup.md b/website/docs/docs/core/connect-data-platform/bigquery-setup.md
                        index 8df69d2f7e3..7ebec8916ae 100644
                        --- a/website/docs/docs/core/connect-data-platform/bigquery-setup.md
                        +++ b/website/docs/docs/core/connect-data-platform/bigquery-setup.md
                        @@ -15,7 +15,6 @@ meta:
                           config_page: '/reference/resource-configs/bigquery-configs'
                         ---
                         
                        -
                         <h2> Overview of {frontMatter.meta.pypi_package} </h2>
                         
                        @@ -31,21 +30,18 @@ meta:
                             <li>Minimum data platform version: {frontMatter.meta.min_supported_version}</li>
                             </ul>
                         
                        -<h3> Installing {frontMatter.meta.pypi_package} </h3>
                        +<h2> Installing {frontMatter.meta.pypi_package} </h2>
                         
                        -pip is the easiest way to install the adapter:
                        +Use `pip` to install the adapter, which automatically installs `dbt-core` and any additional dependencies. Use the following command for installation:
                         
                         pip install {frontMatter.meta.pypi_package}
                         
                        -<p> Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies. </p>
                        -
                         <h3> Configuring {frontMatter.meta.pypi_package} </h3>
                         
                        -<p> For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration </p>
                        +<p> To optimize performance and for {frontMatter.meta.platform_name}-specific configuration please refer to {frontMatter.meta.platform_name} configs </p>
                         
                         <p> For further info, refer to the GitHub repository: {frontMatter.meta.github_repo} </p>
                        
                        -
                         ## Authentication Methods
                         
                         BigQuery targets can be specified using one of four methods:

                        From c5dc689f83146112bb05cb629fa9c9f4f0b03dd9 Mon Sep 17 00:00:00 2001
                        From: mirnawong1
                        Date: Thu, 15 Jun 2023 14:34:37 +0100
                        Subject: [PATCH 02/25] add snippet

                        ---
                         .../core/connect-data-platform/bigquery-setup.md | 12 +-----------
                         website/snippets/setup-pages-intro.md            | 11 +++++++++++
                         2 files changed, 12 insertions(+), 11 deletions(-)
                         create mode 100644 website/snippets/setup-pages-intro.md

                        diff --git a/website/docs/docs/core/connect-data-platform/bigquery-setup.md b/website/docs/docs/core/connect-data-platform/bigquery-setup.md
                        index 7ebec8916ae..57afc24dec6 100644
                        --- a/website/docs/docs/core/connect-data-platform/bigquery-setup.md
                        +++ b/website/docs/docs/core/connect-data-platform/bigquery-setup.md
                        @@ -30,17 +30,7 @@ meta:
                             <li>Minimum data platform version: {frontMatter.meta.min_supported_version}</li>
                             </ul>
                         
                        -<h2> Installing {frontMatter.meta.pypi_package} </h2>
                        -
                        -Use `pip` to install the adapter, which automatically installs `dbt-core` and any additional dependencies. Use the following command for installation:
                        -
                        -pip install {frontMatter.meta.pypi_package}
                        -
                        -<h3> Configuring {frontMatter.meta.pypi_package} </h3>
                        -
                        -<p> To optimize performance and for {frontMatter.meta.platform_name}-specific configuration please refer to {frontMatter.meta.platform_name} configs </p>
                        -
                        -<p> For further info, refer to the GitHub repository: {frontMatter.meta.github_repo} </p>
                        +
                         
                         ## Authentication Methods

                        diff --git a/website/snippets/setup-pages-intro.md b/website/snippets/setup-pages-intro.md
                        new file mode 100644
                        index 00000000000..bae87933f1d
                        --- /dev/null
                        +++ b/website/snippets/setup-pages-intro.md
                        @@ -0,0 +1,11 @@
                        +<h2> Installing {frontMatter.meta.pypi_package} </h2>
                        +
                        +Use `pip` to install the adapter, which automatically installs `dbt-core` and any additional dependencies. Use the following command for installation:
                        +
                        +pip install {frontMatter.meta.pypi_package}
                        +
                        +<h3> Configuring {frontMatter.meta.pypi_package} </h3>
                        +
                        +<p> To optimize performance and for {frontMatter.meta.platform_name}-specific configuration please refer to {frontMatter.meta.platform_name} configs </p>
                        +
                        +<p> For further info, refer to the GitHub repository: {frontMatter.meta.github_repo} </p>
                        
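                        The snippet above templates the install instructions per adapter. As a concrete sketch only (the `dbt-bigquery` package name and the version check are illustrative examples, not part of the patch), the rendered command resolves to something like:

                        ```bash
                        # Install one adapter; dbt-core and other dependencies are pulled in automatically
                        python -m pip install dbt-bigquery

                        # Confirm dbt can see the installed adapter plugin
                        dbt --version
                        ```
                        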
                        From 858dfddcfc73f531ab084bde3c2c9f64ac1116cc Mon Sep 17 00:00:00 2001
                        From: mirnawong1
                        Date: Thu, 15 Jun 2023 20:05:45 +0100
                        Subject: [PATCH 03/25] adding partial

                        ---
                         .../connect-data-platform/bigquery-setup.md | 49 +++++++++----------
                         website/snippets/_setup-pages-intro.md      | 23 +++++++++
                         website/snippets/setup-pages-intro.md       | 11 -----
                         3 files changed, 46 insertions(+), 37 deletions(-)
                         create mode 100644 website/snippets/_setup-pages-intro.md
                         delete mode 100644 website/snippets/setup-pages-intro.md

                        diff --git a/website/docs/docs/core/connect-data-platform/bigquery-setup.md b/website/docs/docs/core/connect-data-platform/bigquery-setup.md
                        index 57afc24dec6..af4a5ea886f 100644
                        --- a/website/docs/docs/core/connect-data-platform/bigquery-setup.md
                        +++ b/website/docs/docs/core/connect-data-platform/bigquery-setup.md
                        @@ -17,31 +17,36 @@ meta:
                         
                        -<h2> Overview of {frontMatter.meta.pypi_package} </h2>
                        +## Test
                         
                        -<ul>
                        -    <li>Maintained by: {frontMatter.meta.maintained_by}</li>
                        -    <li>Authors: {frontMatter.meta.authors}</li>
                        -    <li>GitHub repo: {frontMatter.meta.github_repo}</li>
                        -    <li>PyPI package: {frontMatter.meta.pypi_package}</li>
                        -    <li>Slack channel: {frontMatter.meta.slack_channel_name}</li>
                        -    <li>Supported dbt Core version: {frontMatter.meta.min_core_version} and newer</li>
                        -    <li>dbt Cloud support: {frontMatter.meta.cloud_support}</li>
                        -    <li>Minimum data platform version: {frontMatter.meta.min_supported_version}</li>
                        -    </ul>
                        -
                        
                        +import SetUpPages from '/snippets/_setup-pages-intro.md';
                         
                        -
                        +<SetUpPages meta={frontMatter.meta} />
                         
                        -## Authentication Methods
                        +## Prerequisites
                         
                        -BigQuery targets can be specified using one of four methods:
                        +You need to have the required [BigQuery permissions](https://cloud.google.com/bigquery/docs/access-control) to create adapter-specific configurations in your dbt project. BigQuery's permission model differs from that of more conventional databases like Snowflake and Redshift. The following permissions are required for dbt user accounts:
                         
                        -1. [oauth via `gcloud`](#oauth-via-gcloud)
                        -2. [oauth token-based](#oauth-token-based)
                        -3. [service account file](#service-account-file)
                        -4. [service account json](#service-account-json)
                        + - BigQuery Data Editor
                        + - BigQuery User
                        +
                        +This set of permissions will permit dbt users to read from and create tables and views in a BigQuery project.
                        +
                        +## Authentication methods
                        +
                        +You can specify BigQuery targets using one of four methods:
                        +
                        +| Auth method | Description | Supported |
                        +| ----------- | ----------- | --------- |
                        +| OAuth via gcloud | Recommended for local development | Yes |
                        +| OAuth token-based | OAuth using a pre-generated refresh token | Yes |
                        +| Service account file | Recommended when scheduling dbt on a server | Yes |
                        +| Service account JSON | Recommended when scheduling dbt on a server | Yes |
                        +
                        +1. [OAuth via `gcloud`](#oauth-via-gcloud)
                        +2. [OAuth token-based](#oauth-token-based)
                        +3. [Service account file](#service-account-file)
                        +4. [Service account JSON](#service-account-json)
                        +
                        +:::tip
                        +For local development, we recommend using the oauth method. If you're scheduling dbt on a server, use the service account auth method instead.
                        +:::
                         
                         BigQuery targets should be set up using the following configuration in your `profiles.yml` file. There are a number of [optional configurations](#optional-configurations) you may specify as well.
                         
                        @@ -495,14 +500,6 @@ my-profile:
                         
                        -## Required permissions
                        -
                        -BigQuery's permission model is dissimilar from more conventional databases like Snowflake and Redshift. The following permissions are required for dbt user accounts:
                        -- BigQuery Data Editor
                        -- BigQuery User
                        -
                        -This set of permissions will permit dbt users to read from and create tables and views in a BigQuery project.
                        -
                         ## Local OAuth gcloud setup
                         
                         To connect to BigQuery using the `oauth` method, follow these steps:

                        diff --git a/website/snippets/_setup-pages-intro.md b/website/snippets/_setup-pages-intro.md
                        new file mode 100644
                        index 00000000000..79bf558afc2
                        --- /dev/null
                        +++ b/website/snippets/_setup-pages-intro.md
                        @@ -0,0 +1,23 @@
                        
                        +<ul>
                        +    <li>Maintained by: {props.meta.maintained_by}</li>
                        +    <li>Authors: {props.meta.authors}</li>
                        +    <li>GitHub repo: {props.meta.github_repo}</li>
                        +    <li>PyPI package: {props.meta.pypi_package}</li>
                        +    <li>Slack channel: {props.meta.slack_channel_name}</li>
                        +    <li>Supported dbt Core version: {props.meta.min_core_version} and newer</li>
                        +    <li>dbt Cloud support: {props.meta.cloud_support}</li>
                        +    <li>Minimum data platform version: {props.meta.min_supported_version}</li>
                        +    </ul>
                        +
                        +<h2> Installing {props.meta.pypi_package} </h2>
                        +
                        +Use `pip` to install the adapter, which automatically installs `dbt-core` and any additional dependencies. Use the following command for installation:
                        +
                        +pip install {props.meta.pypi_package}
                        +
                        +<h3> Configuring {props.meta.pypi_package} </h3>
                        +
                        +<p> For {props.meta.platform_name}-specific configuration, please refer to {props.meta.platform_name} configs. </p>
                        +
                        

                        +<p> For further info, refer to the GitHub repository: {props.meta.github_repo} </p>

                        diff --git a/website/snippets/setup-pages-intro.md b/website/snippets/setup-pages-intro.md
                        deleted file mode 100644
                        index bae87933f1d..00000000000
                        --- a/website/snippets/setup-pages-intro.md
                        +++ /dev/null
                        @@ -1,11 +0,0 @@
                        -<h2> Installing {frontMatter.meta.pypi_package} </h2>
                        -
                        -Use `pip` to install the adapter, which automatically installs `dbt-core` and any additional dependencies. Use the following command for installation:
                        -
                        -pip install {frontMatter.meta.pypi_package}
                        -
                        -<h3> Configuring {frontMatter.meta.pypi_package} </h3>
                        -
                        -<p> To optimize performance and for {frontMatter.meta.platform_name}-specific configuration please refer to {frontMatter.meta.platform_name} configs </p>
                        -
                        -<p> For further info, refer to the GitHub repository: {frontMatter.meta.github_repo} </p>
                        
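                        The Prerequisites and Local OAuth gcloud setup sections introduced in the patch above name the BigQuery Data Editor and BigQuery User roles and the `oauth` profile method. A minimal sketch of that local setup, assuming the gcloud CLI is installed and using placeholder project and user names (`my-gcp-project`, `dbt-user@example.com`):

                        ```bash
                        # Create application-default credentials for the oauth profile method
                        gcloud auth application-default login

                        # Grant the two roles listed under Prerequisites (names below are placeholders)
                        gcloud projects add-iam-policy-binding my-gcp-project \
                          --member="user:dbt-user@example.com" \
                          --role="roles/bigquery.user"

                        gcloud projects add-iam-policy-binding my-gcp-project \
                          --member="user:dbt-user@example.com" \
                          --role="roles/bigquery.dataEditor"
                        ```
                        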
                        From f149cdcbbe3b490e32f530e371f2f18d15e4a0dd Mon Sep 17 00:00:00 2001
                        From: mirnawong1
                        Date: Thu, 15 Jun 2023 20:06:32 +0100
                        Subject: [PATCH 04/25] adding partial

                        ---
                         website/docs/docs/core/connect-data-platform/bigquery-setup.md | 2 --
                         1 file changed, 2 deletions(-)

                        diff --git a/website/docs/docs/core/connect-data-platform/bigquery-setup.md b/website/docs/docs/core/connect-data-platform/bigquery-setup.md
                        index af4a5ea886f..863f9125250 100644
                        --- a/website/docs/docs/core/connect-data-platform/bigquery-setup.md
                        +++ b/website/docs/docs/core/connect-data-platform/bigquery-setup.md
                        @@ -17,8 +17,6 @@ meta:
                         
                        -## Test
                        -
                         import SetUpPages from '/snippets/_setup-pages-intro.md';
                         

                        From fac6f8031d189deba64e27c6b77ea0dc54c8663f Mon Sep 17 00:00:00 2001
                        From: mirnawong1
                        Date: Fri, 16 Jun 2023 14:04:28 +0100
                        Subject: [PATCH 05/25] add partial code

                        ---
                         .../connect-data-platform/alloydb-setup.md    | 14 ++------
                         .../connect-data-platform/athena-setup.md     | 27 ++-------------
                         .../azuresynapse-setup.md                     | 27 ++-------------
                         .../connect-data-platform/bigquery-setup.md   |  2 ++
                         .../connect-data-platform/clickhouse-setup.md | 29 ++--------------
                         .../connect-data-platform/databend-setup.md   | 29 ++--------------
                         .../connect-data-platform/databricks-setup.md | 30 ++---------------
                         .../connect-data-platform/decodable-setup.md  | 30 ++---------------
                         .../core/connect-data-platform/doris-setup.md | 28 ++--------------
                         .../connect-data-platform/dremio-setup.md     | 30 ++---------------
                         .../connect-data-platform/duckdb-setup.md     | 28 ++--------------
                         .../connect-data-platform/exasol-setup.md     | 29 ++--------------
                         .../connect-data-platform/fabric-setup.md     | 27 ++-------------
                         .../core/connect-data-platform/fal-setup.md   | 31 ++---------------
                         .../connect-data-platform/firebolt-setup.md   | 29 ++--------------
                         .../core/connect-data-platform/glue-setup.md  | 29 ++--------------
                         .../connect-data-platform/greenplum-setup.md  | 29 ++--------------
                         .../core/connect-data-platform/hive-setup.md  | 29 ++--------------
                         .../connect-data-platform/ibmdb2-setup.md     | 29 ++--------------
                         .../connect-data-platform/impala-setup.md     | 29 ++--------------
                         .../core/connect-data-platform/infer-setup.md | 27 ++-------------
                         .../connect-data-platform/iomete-setup.md     | 30 ++---------------
                         .../core/connect-data-platform/layer-setup.md | 29 ++--------------
                         .../materialize-setup.md                      | 27 ++-------------
                         .../connect-data-platform/mindsdb-setup.md    | 30 ++---------------
                         .../core/connect-data-platform/mssql-setup.md | 28 ++--------------
                         .../core/connect-data-platform/mysql-setup.md | 27 ++-------------
                         .../connect-data-platform/redshift-setup.md   | 28 ++--------------
                         .../connect-data-platform/rockset-setup.md    | 28 ++--------------
                         .../singlestore-setup.md                      | 30 ++---------------
                         .../connect-data-platform/snowflake-setup.md  | 28 ++--------------
                         .../core/connect-data-platform/spark-setup.md | 28 +++-------------
                         .../connect-data-platform/sqlite-setup.md     | 29 ++--------------
                         .../connect-data-platform/teradata-setup.md   | 28 ++--------------
                         .../core/connect-data-platform/tidb-setup.md  | 29 ++--------------
                         .../core/connect-data-platform/trino-setup.md | 33 ++-----------------
                         .../connect-data-platform/vertica-setup.md    | 26 ++-------------
                         website/snippets/_setup-pages-intro.md        |  3 +-
                         38 files changed, 80 insertions(+), 943 deletions(-)

                        diff --git a/website/docs/docs/core/connect-data-platform/alloydb-setup.md b/website/docs/docs/core/connect-data-platform/alloydb-setup.md
                        index c3f3ee9cfca..db2117a13ee 100644
                        --- a/website/docs/docs/core/connect-data-platform/alloydb-setup.md
                        +++ b/website/docs/docs/core/connect-data-platform/alloydb-setup.md
                        @@ -14,18 +14,10 @@ meta:
                           config_page: '/reference/resource-configs/postgres-configs'
                         ---
                         
                        -## Overview of AlloyDB support
                        +import SetUpPages from '/snippets/_setup-pages-intro.md';
                        +
                        +<SetUpPages meta={frontMatter.meta} />
                         
                        
                        -<ul>
                        -    <li>Maintained by: {frontMatter.meta.maintained_by}</li>
                        -    <li>Authors: {frontMatter.meta.authors}</li>
                        -    <li>GitHub repo: {frontMatter.meta.github_repo}</li>
                        -    <li>PyPI package: {frontMatter.meta.pypi_package}</li>
                        -    <li>Slack channel: {frontMatter.meta.slack_channel_name}</li>
                        -    <li>Supported dbt Core version: {frontMatter.meta.min_core_version} and newer</li>
                        -    <li>dbt Cloud support: {frontMatter.meta.cloud_support}</li>
                        -    <li>Minimum data platform version: {frontMatter.meta.min_supported_version}</li>
                        -    </ul>
                        
    ## Profile Configuration diff --git a/website/docs/docs/core/connect-data-platform/athena-setup.md b/website/docs/docs/core/connect-data-platform/athena-setup.md index db218110dc1..468ba7a7847 100644 --- a/website/docs/docs/core/connect-data-platform/athena-setup.md +++ b/website/docs/docs/core/connect-data-platform/athena-setup.md @@ -15,32 +15,11 @@ meta: config_page: '/reference/resource-configs/no-configs' --- -

    Overview of {frontMatter.meta.pypi_package}

    + -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    +import SetUpPages from '/snippets/_setup-pages-intro.md'; -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    + ## Connecting to Athena with dbt-athena diff --git a/website/docs/docs/core/connect-data-platform/azuresynapse-setup.md b/website/docs/docs/core/connect-data-platform/azuresynapse-setup.md index 073e95530c1..8a4d6b61004 100644 --- a/website/docs/docs/core/connect-data-platform/azuresynapse-setup.md +++ b/website/docs/docs/core/connect-data-platform/azuresynapse-setup.md @@ -24,32 +24,11 @@ Refer to [Microsoft Fabric Synapse Data Warehouse](/docs/core/connect-data-platf ::: -

    Overview of {frontMatter.meta.pypi_package}

    + -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    +import SetUpPages from '/snippets/_setup-pages-intro.md'; -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    + :::info Dedicated SQL only diff --git a/website/docs/docs/core/connect-data-platform/bigquery-setup.md b/website/docs/docs/core/connect-data-platform/bigquery-setup.md index 8512feeeb83..6e634fc744c 100644 --- a/website/docs/docs/core/connect-data-platform/bigquery-setup.md +++ b/website/docs/docs/core/connect-data-platform/bigquery-setup.md @@ -17,6 +17,8 @@ meta: + + import SetUpPages from '/snippets/_setup-pages-intro.md'; diff --git a/website/docs/docs/core/connect-data-platform/clickhouse-setup.md b/website/docs/docs/core/connect-data-platform/clickhouse-setup.md index fb0965398a2..fce367be812 100644 --- a/website/docs/docs/core/connect-data-platform/clickhouse-setup.md +++ b/website/docs/docs/core/connect-data-platform/clickhouse-setup.md @@ -17,34 +17,9 @@ meta: Some core functionality may be limited. If you're interested in contributing, check out the source code for each repository listed below. +import SetUpPages from '/snippets/_setup-pages-intro.md'; -

    Overview of {frontMatter.meta.pypi_package}

    - -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    + ## Connecting to ClickHouse with **dbt-clickhouse** diff --git a/website/docs/docs/core/connect-data-platform/databend-setup.md b/website/docs/docs/core/connect-data-platform/databend-setup.md index daccd14f6c3..5442327fb27 100644 --- a/website/docs/docs/core/connect-data-platform/databend-setup.md +++ b/website/docs/docs/core/connect-data-platform/databend-setup.md @@ -22,34 +22,9 @@ If you're interested in contributing, check out the source code repository liste ::: -

    Overview of {frontMatter.meta.pypi_package}

    - -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; + ## Connecting to Databend Cloud with **dbt-databend-cloud** diff --git a/website/docs/docs/core/connect-data-platform/databricks-setup.md b/website/docs/docs/core/connect-data-platform/databricks-setup.md index eef6522a8f5..4792dceab3b 100644 --- a/website/docs/docs/core/connect-data-platform/databricks-setup.md +++ b/website/docs/docs/core/connect-data-platform/databricks-setup.md @@ -18,35 +18,9 @@ meta: -

    Overview of {frontMatter.meta.pypi_package}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - -## Installation and Distribution - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    + `dbt-databricks` is the recommend adapter for Databricks diff --git a/website/docs/docs/core/connect-data-platform/decodable-setup.md b/website/docs/docs/core/connect-data-platform/decodable-setup.md index b43521732d4..6c3cb487885 100644 --- a/website/docs/docs/core/connect-data-platform/decodable-setup.md +++ b/website/docs/docs/core/connect-data-platform/decodable-setup.md @@ -21,35 +21,9 @@ meta: Some core functionality may be limited. If you're interested in contributing, see the source code for the repository listed below. ::: -

    Overview of {frontMatter.meta.pypi_package}

    - -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version}
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -dbt-decodable is also available on PyPI. pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -
    -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration.

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; + ## Connecting to Decodable with **dbt-decodable** Do the following steps to connect to Decodable with dbt. diff --git a/website/docs/docs/core/connect-data-platform/doris-setup.md b/website/docs/docs/core/connect-data-platform/doris-setup.md index a7e2ba1ba3e..882e6c3ba25 100644 --- a/website/docs/docs/core/connect-data-platform/doris-setup.md +++ b/website/docs/docs/core/connect-data-platform/doris-setup.md @@ -15,33 +15,9 @@ meta: config_page: '/reference/resource-configs/doris-configs' --- -

    Overview of {frontMatter.meta.pypi_package}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    + ## Connecting to Doris/SelectDB with **dbt-doris** diff --git a/website/docs/docs/core/connect-data-platform/dremio-setup.md b/website/docs/docs/core/connect-data-platform/dremio-setup.md index 4d10464400f..a6ac60490cb 100644 --- a/website/docs/docs/core/connect-data-platform/dremio-setup.md +++ b/website/docs/docs/core/connect-data-platform/dremio-setup.md @@ -20,36 +20,10 @@ meta: Some core functionality may be limited. If you're interested in contributing, check out the source code for each repository listed below. ::: +import SetUpPages from '/snippets/_setup-pages-intro.md'; -

    Overview of {frontMatter.meta.pypi_package}

    + -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    - -Follow the repository's link for os dependencies. ## Prerequisites for Dremio Cloud Before connecting from project to Dremio Cloud, follow these prerequisite steps: diff --git a/website/docs/docs/core/connect-data-platform/duckdb-setup.md b/website/docs/docs/core/connect-data-platform/duckdb-setup.md index 7896e4abeae..3792a806e51 100644 --- a/website/docs/docs/core/connect-data-platform/duckdb-setup.md +++ b/website/docs/docs/core/connect-data-platform/duckdb-setup.md @@ -21,33 +21,9 @@ Some core functionality may be limited. If you're interested in contributing, ch ::: -

    Overview of {frontMatter.meta.pypi_package}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    + ## Connecting to DuckDB with dbt-duckdb diff --git a/website/docs/docs/core/connect-data-platform/exasol-setup.md b/website/docs/docs/core/connect-data-platform/exasol-setup.md index 2bf4cd7ffac..509ccd67e84 100644 --- a/website/docs/docs/core/connect-data-platform/exasol-setup.md +++ b/website/docs/docs/core/connect-data-platform/exasol-setup.md @@ -21,34 +21,9 @@ Some core functionality may be limited. If you're interested in contributing, ch ::: -

    Overview of {frontMatter.meta.pypi_package}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    - dbt-exasol + ### Connecting to Exasol with **dbt-exasol** diff --git a/website/docs/docs/core/connect-data-platform/fabric-setup.md b/website/docs/docs/core/connect-data-platform/fabric-setup.md index aa7784d96ec..84641753f88 100644 --- a/website/docs/docs/core/connect-data-platform/fabric-setup.md +++ b/website/docs/docs/core/connect-data-platform/fabric-setup.md @@ -21,31 +21,8 @@ To learn how to set up dbt with Azure Synapse Dedicated Pools, see [Microsoft Az ::: -

    Overview of {frontMatter.meta.pypi_package}

    - -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; + ### Prerequisites diff --git a/website/docs/docs/core/connect-data-platform/fal-setup.md b/website/docs/docs/core/connect-data-platform/fal-setup.md index ef4998e8c1b..24fe7033d89 100644 --- a/website/docs/docs/core/connect-data-platform/fal-setup.md +++ b/website/docs/docs/core/connect-data-platform/fal-setup.md @@ -21,35 +21,8 @@ Some core functionality may be limited. If you're interested in contributing, ch ::: -

    Overview of {frontMatter.meta.pypi_package}

    - -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package}[<sql-adapter>] - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    You must install the adapter for SQL transformations and data storage independently from dbt-fal.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; + ## Setting up fal with other adapter diff --git a/website/docs/docs/core/connect-data-platform/firebolt-setup.md b/website/docs/docs/core/connect-data-platform/firebolt-setup.md index c7a5a543512..6500805c094 100644 --- a/website/docs/docs/core/connect-data-platform/firebolt-setup.md +++ b/website/docs/docs/core/connect-data-platform/firebolt-setup.md @@ -19,33 +19,8 @@ meta: Some core functionality may be limited. If you're interested in contributing, check out the source code for the repository listed below. -

    Overview of {frontMatter.meta.pypi_package}

    - -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; + For other information including Firebolt feature support, see the [GitHub README](https://github.com/firebolt-db/dbt-firebolt/blob/main/README.md) and the [changelog](https://github.com/firebolt-db/dbt-firebolt/blob/main/CHANGELOG.md). diff --git a/website/docs/docs/core/connect-data-platform/glue-setup.md b/website/docs/docs/core/connect-data-platform/glue-setup.md index e0fb9556853..42f80046f46 100644 --- a/website/docs/docs/core/connect-data-platform/glue-setup.md +++ b/website/docs/docs/core/connect-data-platform/glue-setup.md @@ -22,33 +22,8 @@ Some core functionality may be limited. If you're interested in contributing, ch ::: -

    Overview of {frontMatter.meta.pypi_package}

    - -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; + For further (and more likely up-to-date) info, see the [README](https://github.com/aws-samples/dbt-glue#readme) diff --git a/website/docs/docs/core/connect-data-platform/greenplum-setup.md b/website/docs/docs/core/connect-data-platform/greenplum-setup.md index 06ada19a1e9..e9ea421b2f5 100644 --- a/website/docs/docs/core/connect-data-platform/greenplum-setup.md +++ b/website/docs/docs/core/connect-data-platform/greenplum-setup.md @@ -16,33 +16,8 @@ meta: config_page: '/reference/resource-configs/greenplum-configs' --- -

    Overview of {frontMatter.meta.pypi_package}

    - -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; + For further (and more likely up-to-date) info, see the [README](https://github.com/markporoshin/dbt-greenplum#README.md) diff --git a/website/docs/docs/core/connect-data-platform/hive-setup.md b/website/docs/docs/core/connect-data-platform/hive-setup.md index 61a929c58da..f180a33662f 100644 --- a/website/docs/docs/core/connect-data-platform/hive-setup.md +++ b/website/docs/docs/core/connect-data-platform/hive-setup.md @@ -16,33 +16,8 @@ meta: config_page: '/reference/resource-configs/hive-configs' --- -

    Overview of {frontMatter.meta.pypi_package}

    - -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; + ## Connection Methods diff --git a/website/docs/docs/core/connect-data-platform/ibmdb2-setup.md b/website/docs/docs/core/connect-data-platform/ibmdb2-setup.md index cb6c7459418..12028fde046 100644 --- a/website/docs/docs/core/connect-data-platform/ibmdb2-setup.md +++ b/website/docs/docs/core/connect-data-platform/ibmdb2-setup.md @@ -22,33 +22,8 @@ Some core functionality may be limited. If you're interested in contributing, ch ::: -## Overview of dbt-ibmdb2 - -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; + This is an experimental plugin: diff --git a/website/docs/docs/core/connect-data-platform/impala-setup.md b/website/docs/docs/core/connect-data-platform/impala-setup.md index 0a0f1b955a1..da0ddffa05b 100644 --- a/website/docs/docs/core/connect-data-platform/impala-setup.md +++ b/website/docs/docs/core/connect-data-platform/impala-setup.md @@ -16,33 +16,8 @@ meta: config_page: '/reference/resource-configs/impala-configs' --- -

    Overview of {frontMatter.meta.pypi_package}

    - -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; + ## Connection Methods diff --git a/website/docs/docs/core/connect-data-platform/infer-setup.md b/website/docs/docs/core/connect-data-platform/infer-setup.md index 430c5e47f85..2969e871609 100644 --- a/website/docs/docs/core/connect-data-platform/infer-setup.md +++ b/website/docs/docs/core/connect-data-platform/infer-setup.md @@ -16,31 +16,8 @@ meta: min_supported_version: n/a --- -

    Overview of {frontMatter.meta.pypi_package}

    - -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; + ## Connecting to Infer with **dbt-infer** diff --git a/website/docs/docs/core/connect-data-platform/iomete-setup.md b/website/docs/docs/core/connect-data-platform/iomete-setup.md index bc015141c85..66ef526ec8c 100644 --- a/website/docs/docs/core/connect-data-platform/iomete-setup.md +++ b/website/docs/docs/core/connect-data-platform/iomete-setup.md @@ -16,35 +16,9 @@ meta: config_page: '/reference/resource-configs/no-configs' --- -

    Overview of {frontMatter.meta.pypi_package}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - -## Installation and Distribution - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    + Set up a iomete Target diff --git a/website/docs/docs/core/connect-data-platform/layer-setup.md b/website/docs/docs/core/connect-data-platform/layer-setup.md index f065c0c7313..051094297a2 100644 --- a/website/docs/docs/core/connect-data-platform/layer-setup.md +++ b/website/docs/docs/core/connect-data-platform/layer-setup.md @@ -17,34 +17,9 @@ meta: --- -

    Overview of {frontMatter.meta.pypi_package}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    + ### Profile Configuration diff --git a/website/docs/docs/core/connect-data-platform/materialize-setup.md b/website/docs/docs/core/connect-data-platform/materialize-setup.md index c8777c29490..ec0034dcd37 100644 --- a/website/docs/docs/core/connect-data-platform/materialize-setup.md +++ b/website/docs/docs/core/connect-data-platform/materialize-setup.md @@ -22,32 +22,9 @@ Certain core functionality may vary. If you would like to report a bug, request ::: -

    Overview of {frontMatter.meta.pypi_package}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration, please refer to {frontMatter.meta.platform_name} Configuration.

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    + ## Connecting to Materialize diff --git a/website/docs/docs/core/connect-data-platform/mindsdb-setup.md b/website/docs/docs/core/connect-data-platform/mindsdb-setup.md index e6b8c5decaa..47d9d311ff9 100644 --- a/website/docs/docs/core/connect-data-platform/mindsdb-setup.md +++ b/website/docs/docs/core/connect-data-platform/mindsdb-setup.md @@ -19,35 +19,9 @@ meta: The dbt-mindsdb package allows dbt to connect to [MindsDB](https://github.com/mindsdb/mindsdb). -

    Overview of {frontMatter.meta.pypi_package}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - -## Installation - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    + ## Configurations diff --git a/website/docs/docs/core/connect-data-platform/mssql-setup.md b/website/docs/docs/core/connect-data-platform/mssql-setup.md index 5efcc454823..f58827c3554 100644 --- a/website/docs/docs/core/connect-data-platform/mssql-setup.md +++ b/website/docs/docs/core/connect-data-platform/mssql-setup.md @@ -22,33 +22,9 @@ Some core functionality may be limited. If you're interested in contributing, ch ::: -

    Overview of {frontMatter.meta.pypi_package}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    + :::tip Default settings change in dbt-sqlserver v1.2 / ODBC Driver 18 diff --git a/website/docs/docs/core/connect-data-platform/mysql-setup.md b/website/docs/docs/core/connect-data-platform/mysql-setup.md index 1df6e205272..4b9224e0a0d 100644 --- a/website/docs/docs/core/connect-data-platform/mysql-setup.md +++ b/website/docs/docs/core/connect-data-platform/mysql-setup.md @@ -22,32 +22,9 @@ Some core functionality may be limited. If you're interested in contributing, ch ::: -

    Overview of {frontMatter.meta.pypi_package}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    + This is an experimental plugin: - It has not been tested extensively. diff --git a/website/docs/docs/core/connect-data-platform/redshift-setup.md b/website/docs/docs/core/connect-data-platform/redshift-setup.md index 7d5fdbf7a97..3ff959199e0 100644 --- a/website/docs/docs/core/connect-data-platform/redshift-setup.md +++ b/website/docs/docs/core/connect-data-platform/redshift-setup.md @@ -18,33 +18,9 @@ meta: -

    Overview of {frontMatter.meta.pypi_package}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specific configuration, refer to {frontMatter.meta.platform_name} Configuration.

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}.

    + ## Authentication Methods diff --git a/website/docs/docs/core/connect-data-platform/rockset-setup.md b/website/docs/docs/core/connect-data-platform/rockset-setup.md index 4a146829a03..372a6c0c538 100644 --- a/website/docs/docs/core/connect-data-platform/rockset-setup.md +++ b/website/docs/docs/core/connect-data-platform/rockset-setup.md @@ -22,33 +22,9 @@ Certain core functionality may vary. If you would like to report a bug, request ::: -

    Overview of {frontMatter.meta.pypi_package}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

For {frontMatter.meta.platform_name}-specific configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    + ## Connecting to Rockset with **dbt-rockset** diff --git a/website/docs/docs/core/connect-data-platform/singlestore-setup.md b/website/docs/docs/core/connect-data-platform/singlestore-setup.md index a63466542a9..285c41bafc9 100644 --- a/website/docs/docs/core/connect-data-platform/singlestore-setup.md +++ b/website/docs/docs/core/connect-data-platform/singlestore-setup.md @@ -22,35 +22,9 @@ Certain core functionality may vary. If you would like to report a bug, request ::: -

    Overview of {frontMatter.meta.pypi_package}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - -## Installation and Distribution - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

For {frontMatter.meta.platform_name}-specific configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    + ### Set up a SingleStore Target diff --git a/website/docs/docs/core/connect-data-platform/snowflake-setup.md b/website/docs/docs/core/connect-data-platform/snowflake-setup.md index 147cfb87867..9bda93d2187 100644 --- a/website/docs/docs/core/connect-data-platform/snowflake-setup.md +++ b/website/docs/docs/core/connect-data-platform/snowflake-setup.md @@ -18,34 +18,10 @@ meta: -

    Overview of {frontMatter.meta.pypi_package}

    -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

For {frontMatter.meta.platform_name}-specific configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; + ## Authentication Methods diff --git a/website/docs/docs/core/connect-data-platform/spark-setup.md b/website/docs/docs/core/connect-data-platform/spark-setup.md index 00de640ee05..818cf9dfb2f 100644 --- a/website/docs/docs/core/connect-data-platform/spark-setup.md +++ b/website/docs/docs/core/connect-data-platform/spark-setup.md @@ -24,28 +24,14 @@ meta: See [Databricks setup](#databricks-setup) for the Databricks version of this page. ::: -

    Overview of {frontMatter.meta.pypi_package}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    + -

    Installing {frontMatter.meta.pypi_package}

    -pip is the easiest way to install the adapter: +A note about connecting using an ODBC driver -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -If connecting to Databricks via ODBC driver, it requires `pyodbc`. Depending on your system, you can install it seperately or via pip. See the [`pyodbc` wiki](https://github.com/mkleehammer/pyodbc/wiki/Install) for OS-specific installation details. +If connecting to Databricks via ODBC driver, it requires `pyodbc`. Depending on your system, you can install it separately or via pip. See the [`pyodbc` wiki](https://github.com/mkleehammer/pyodbc/wiki/Install) for OS-specific installation details. If connecting to a Spark cluster via the generic thrift or http methods, it requires `PyHive`. @@ -66,12 +52,6 @@ $ pip install "dbt-spark[session]" -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specific configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
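For the ODBC, thrift/http, and session connection methods noted above, the optional drivers can be pulled in as pip extras. A rough sketch follows; the extra names (`ODBC`, `PyHive`, `session`) follow the dbt-spark README and may differ between releases.

```bash
# Pick the extra that matches your connection method (names per the dbt-spark README)
python -m pip install "dbt-spark[ODBC]"     # Databricks via the ODBC driver (requires pyodbc)
python -m pip install "dbt-spark[PyHive]"   # generic thrift or http connections
python -m pip install "dbt-spark[session]"  # local Spark session
```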

    - ## Connection Methods dbt-spark can connect to Spark clusters by three different methods: diff --git a/website/docs/docs/core/connect-data-platform/sqlite-setup.md b/website/docs/docs/core/connect-data-platform/sqlite-setup.md index 3da902a6f80..20897ea90d7 100644 --- a/website/docs/docs/core/connect-data-platform/sqlite-setup.md +++ b/website/docs/docs/core/connect-data-platform/sqlite-setup.md @@ -22,34 +22,9 @@ Some core functionality may be limited. If you're interested in contributing, ch ::: -

    Overview of {frontMatter.meta.pypi_package}

    - -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

For {frontMatter.meta.platform_name}-specific configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; + Starting with the release of dbt-core 1.0.0, versions of dbt-sqlite are aligned to the same major+minor [version](https://semver.org/) of dbt-core. - versions 1.1.x of this adapter work with dbt-core 1.1.x diff --git a/website/docs/docs/core/connect-data-platform/teradata-setup.md b/website/docs/docs/core/connect-data-platform/teradata-setup.md index 1fe33ff8929..60ced3ac436 100644 --- a/website/docs/docs/core/connect-data-platform/teradata-setup.md +++ b/website/docs/docs/core/connect-data-platform/teradata-setup.md @@ -19,33 +19,9 @@ meta: Some core functionality may be limited. If you're interested in contributing, check out the source code for the repository listed below. -

    Overview of {frontMatter.meta.pypi_package}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

For {frontMatter.meta.platform_name}-specific configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    + ### Connecting to Teradata diff --git a/website/docs/docs/core/connect-data-platform/tidb-setup.md b/website/docs/docs/core/connect-data-platform/tidb-setup.md index e2205c4665e..253497b37ba 100644 --- a/website/docs/docs/core/connect-data-platform/tidb-setup.md +++ b/website/docs/docs/core/connect-data-platform/tidb-setup.md @@ -24,34 +24,9 @@ If you're interested in contributing, check out the source code repository liste ::: -

    Overview of {frontMatter.meta.pypi_package}

    - -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

For {frontMatter.meta.platform_name}-specific configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; + ## Connecting to TiDB with **dbt-tidb** diff --git a/website/docs/docs/core/connect-data-platform/trino-setup.md b/website/docs/docs/core/connect-data-platform/trino-setup.md index 711e735ab6d..918ecdaba74 100644 --- a/website/docs/docs/core/connect-data-platform/trino-setup.md +++ b/website/docs/docs/core/connect-data-platform/trino-setup.md @@ -18,38 +18,9 @@ meta: -

    Overview of {frontMatter.meta.pypi_package}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - -:::info Vendor-supported plugin - -Certain core functionality may vary. If you would like to report a bug, request a feature, or contribute, you can check out the linked repository and open an issue. - -::: - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

For {frontMatter.meta.platform_name}-specific configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    + ## Connecting to Starburst/Trino diff --git a/website/docs/docs/core/connect-data-platform/vertica-setup.md b/website/docs/docs/core/connect-data-platform/vertica-setup.md index fbb8de6b301..dda242393b5 100644 --- a/website/docs/docs/core/connect-data-platform/vertica-setup.md +++ b/website/docs/docs/core/connect-data-platform/vertica-setup.md @@ -21,31 +21,9 @@ If you're interested in contributing, check out the source code for each reposit ::: -

    Overview of {frontMatter.meta.pypi_package}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.pypi_package} specific configuration please refer to {frontMatter.meta.platform_name} Configuration.

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}.

    +

    Connecting to {frontMatter.meta.platform_name} with {frontMatter.meta.pypi_package}

    diff --git a/website/snippets/_setup-pages-intro.md b/website/snippets/_setup-pages-intro.md index 79bf558afc2..cc68aac913f 100644 --- a/website/snippets/_setup-pages-intro.md +++ b/website/snippets/_setup-pages-intro.md @@ -1,3 +1,4 @@ +
    • Maintained by: {props.meta.maintained_by}
    • Authors: {props.meta.authors}
    • @@ -9,11 +10,9 @@
    • Minimum data platform version: {props.meta.min_supported_version}
    -

    Installing {props.meta.pypi_package}

    Use `pip` to install the adapter, which automatically installs `dbt-core` and any additional dependencies. Use the following command for installation: - pip install {props.meta.pypi_package}

    Configuring {props.meta.pypi_package}

    From a07a8839afdeeaef959bac866a0be6672a35b2d2 Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Fri, 16 Jun 2023 14:08:17 +0100 Subject: [PATCH 06/25] add partial --- .../connect-data-platform/fabric-setup.md | 2 ++ .../core/connect-data-platform/fal-setup.md | 2 ++ .../connect-data-platform/firebolt-setup.md | 2 ++ .../core/connect-data-platform/glue-setup.md | 2 ++ .../connect-data-platform/greenplum-setup.md | 2 ++ .../core/connect-data-platform/hive-setup.md | 2 ++ .../connect-data-platform/ibmdb2-setup.md | 2 ++ .../connect-data-platform/impala-setup.md | 1 + .../core/connect-data-platform/infer-setup.md | 2 ++ .../connect-data-platform/iomete-setup.md | 1 + .../connect-data-platform/oracle-setup.md | 29 ++----------------- .../connect-data-platform/postgres-setup.md | 28 ++---------------- 12 files changed, 22 insertions(+), 53 deletions(-) diff --git a/website/docs/docs/core/connect-data-platform/fabric-setup.md b/website/docs/docs/core/connect-data-platform/fabric-setup.md index 84641753f88..6d0a455d1f8 100644 --- a/website/docs/docs/core/connect-data-platform/fabric-setup.md +++ b/website/docs/docs/core/connect-data-platform/fabric-setup.md @@ -22,8 +22,10 @@ To learn how to set up dbt with Azure Synapse Dedicated Pools, see [Microsoft Az ::: import SetUpPages from '/snippets/_setup-pages-intro.md'; + + ### Prerequisites On Debian/Ubuntu make sure you have the ODBC header files before installing diff --git a/website/docs/docs/core/connect-data-platform/fal-setup.md b/website/docs/docs/core/connect-data-platform/fal-setup.md index 24fe7033d89..76539d67c54 100644 --- a/website/docs/docs/core/connect-data-platform/fal-setup.md +++ b/website/docs/docs/core/connect-data-platform/fal-setup.md @@ -22,9 +22,11 @@ Some core functionality may be limited. If you're interested in contributing, ch ::: import SetUpPages from '/snippets/_setup-pages-intro.md'; + + ## Setting up fal with other adapter [fal](http://github.com/fal-ai/fal) offers a Python runtime independent from what database you are using and integrates seamlessly with dbt. It works by downloading the data as a Pandas DataFrame, transforming it in a local Python runtime and uploading it to the database. The only configuration change you need to do is adding it to the `profiles.yml` and setting the `db_profile` property as the database profile you are already using. diff --git a/website/docs/docs/core/connect-data-platform/firebolt-setup.md b/website/docs/docs/core/connect-data-platform/firebolt-setup.md index 6500805c094..8fb91dea299 100644 --- a/website/docs/docs/core/connect-data-platform/firebolt-setup.md +++ b/website/docs/docs/core/connect-data-platform/firebolt-setup.md @@ -20,9 +20,11 @@ Some core functionality may be limited. If you're interested in contributing, ch import SetUpPages from '/snippets/_setup-pages-intro.md'; + + For other information including Firebolt feature support, see the [GitHub README](https://github.com/firebolt-db/dbt-firebolt/blob/main/README.md) and the [changelog](https://github.com/firebolt-db/dbt-firebolt/blob/main/CHANGELOG.md). diff --git a/website/docs/docs/core/connect-data-platform/glue-setup.md b/website/docs/docs/core/connect-data-platform/glue-setup.md index 42f80046f46..ca7d5503d57 100644 --- a/website/docs/docs/core/connect-data-platform/glue-setup.md +++ b/website/docs/docs/core/connect-data-platform/glue-setup.md @@ -23,8 +23,10 @@ Some core functionality may be limited. 
If you're interested in contributing, ch ::: import SetUpPages from '/snippets/_setup-pages-intro.md'; + + For further (and more likely up-to-date) info, see the [README](https://github.com/aws-samples/dbt-glue#readme) diff --git a/website/docs/docs/core/connect-data-platform/greenplum-setup.md b/website/docs/docs/core/connect-data-platform/greenplum-setup.md index e9ea421b2f5..523a503b128 100644 --- a/website/docs/docs/core/connect-data-platform/greenplum-setup.md +++ b/website/docs/docs/core/connect-data-platform/greenplum-setup.md @@ -17,8 +17,10 @@ meta: --- import SetUpPages from '/snippets/_setup-pages-intro.md'; + + For further (and more likely up-to-date) info, see the [README](https://github.com/markporoshin/dbt-greenplum#README.md) diff --git a/website/docs/docs/core/connect-data-platform/hive-setup.md b/website/docs/docs/core/connect-data-platform/hive-setup.md index f180a33662f..92210162324 100644 --- a/website/docs/docs/core/connect-data-platform/hive-setup.md +++ b/website/docs/docs/core/connect-data-platform/hive-setup.md @@ -17,9 +17,11 @@ meta: --- import SetUpPages from '/snippets/_setup-pages-intro.md'; + + ## Connection Methods dbt-hive can connect to Apache Hive and Cloudera Data Platform clusters. The [Impyla](https://github.com/cloudera/impyla/) library is used to establish connections to Hive. diff --git a/website/docs/docs/core/connect-data-platform/ibmdb2-setup.md b/website/docs/docs/core/connect-data-platform/ibmdb2-setup.md index 12028fde046..692342466b0 100644 --- a/website/docs/docs/core/connect-data-platform/ibmdb2-setup.md +++ b/website/docs/docs/core/connect-data-platform/ibmdb2-setup.md @@ -23,9 +23,11 @@ Some core functionality may be limited. If you're interested in contributing, ch ::: import SetUpPages from '/snippets/_setup-pages-intro.md'; + + This is an experimental plugin: - We have not tested it extensively - Tested with [dbt-adapter-tests](https://pypi.org/project/pytest-dbt-adapter/) and DB2 LUW on Mac OS+RHEL8 diff --git a/website/docs/docs/core/connect-data-platform/impala-setup.md b/website/docs/docs/core/connect-data-platform/impala-setup.md index da0ddffa05b..df82cab6563 100644 --- a/website/docs/docs/core/connect-data-platform/impala-setup.md +++ b/website/docs/docs/core/connect-data-platform/impala-setup.md @@ -17,6 +17,7 @@ meta: --- import SetUpPages from '/snippets/_setup-pages-intro.md'; + diff --git a/website/docs/docs/core/connect-data-platform/infer-setup.md b/website/docs/docs/core/connect-data-platform/infer-setup.md index 2969e871609..7642c553cc4 100644 --- a/website/docs/docs/core/connect-data-platform/infer-setup.md +++ b/website/docs/docs/core/connect-data-platform/infer-setup.md @@ -17,9 +17,11 @@ meta: --- import SetUpPages from '/snippets/_setup-pages-intro.md'; + + ## Connecting to Infer with **dbt-infer** Infer allows you to perform advanced ML Analytics within SQL as if native to your data warehouse. diff --git a/website/docs/docs/core/connect-data-platform/iomete-setup.md b/website/docs/docs/core/connect-data-platform/iomete-setup.md index 66ef526ec8c..2f2d18b1e47 100644 --- a/website/docs/docs/core/connect-data-platform/iomete-setup.md +++ b/website/docs/docs/core/connect-data-platform/iomete-setup.md @@ -21,6 +21,7 @@ import SetUpPages from '/snippets/_setup-pages-intro.md'; + Set up a iomete Target iomete targets should be set up using the following configuration in your profiles.yml file. 
diff --git a/website/docs/docs/core/connect-data-platform/oracle-setup.md b/website/docs/docs/core/connect-data-platform/oracle-setup.md index 0d677c1c90b..380ab9cf4a2 100644 --- a/website/docs/docs/core/connect-data-platform/oracle-setup.md +++ b/website/docs/docs/core/connect-data-platform/oracle-setup.md @@ -16,35 +16,10 @@ meta: config_page: '/reference/resource-configs/oracle-configs' --- -

    Overview of {frontMatter.meta.pypi_package}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    + -## Installation - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    ### Configure the Python driver mode diff --git a/website/docs/docs/core/connect-data-platform/postgres-setup.md b/website/docs/docs/core/connect-data-platform/postgres-setup.md index a6948e6f1ad..2ed0fb9336d 100644 --- a/website/docs/docs/core/connect-data-platform/postgres-setup.md +++ b/website/docs/docs/core/connect-data-platform/postgres-setup.md @@ -18,33 +18,9 @@ meta: -

    Overview of {frontMatter.meta.pypi_package}

    +import SetUpPages from '/snippets/_setup-pages-intro.md'; -
      -
    • Maintained by: {frontMatter.meta.maintained_by}
    • -
    • Authors: {frontMatter.meta.authors}
    • -
    • GitHub repo: {frontMatter.meta.github_repo}
    • -
    • PyPI package: {frontMatter.meta.pypi_package}
    • -
    • Slack channel: {frontMatter.meta.slack_channel_name}
    • -
    • Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
    • -
    • dbt Cloud support: {frontMatter.meta.cloud_support}
    • -
    • Minimum data platform version: {frontMatter.meta.min_supported_version}
    • -
    - - -

    Installing {frontMatter.meta.pypi_package}

    - -pip is the easiest way to install the adapter: - -pip install {frontMatter.meta.pypi_package} - -

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    - -

    Configuring {frontMatter.meta.pypi_package}

    - -

    For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration

    - -

    For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}

    + ## Profile Configuration From 2fca17706d2d5c8f036de7a8dc367d79cc18ed10 Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Fri, 16 Jun 2023 17:09:08 +0100 Subject: [PATCH 07/25] frontmatter --- .../docs/docs/core/connect-data-platform/bigquery-setup.md | 2 +- .../docs/docs/core/connect-data-platform/snowflake-setup.md | 4 ++-- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/website/docs/docs/core/connect-data-platform/bigquery-setup.md b/website/docs/docs/core/connect-data-platform/bigquery-setup.md index 6e634fc744c..45b35487e5e 100644 --- a/website/docs/docs/core/connect-data-platform/bigquery-setup.md +++ b/website/docs/docs/core/connect-data-platform/bigquery-setup.md @@ -17,7 +17,7 @@ meta: - + import SetUpPages from '/snippets/_setup-pages-intro.md'; diff --git a/website/docs/docs/core/connect-data-platform/snowflake-setup.md b/website/docs/docs/core/connect-data-platform/snowflake-setup.md index 9bda93d2187..3826bfae4e9 100644 --- a/website/docs/docs/core/connect-data-platform/snowflake-setup.md +++ b/website/docs/docs/core/connect-data-platform/snowflake-setup.md @@ -18,10 +18,10 @@ meta: - import SetUpPages from '/snippets/_setup-pages-intro.md'; - + + ## Authentication Methods From 29367b20e208ace8fb127b09a2ed014f863991eb Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Tue, 20 Jun 2023 16:14:28 +0100 Subject: [PATCH 08/25] bigquery --- .../connect-data-platform/bigquery-setup.md | 54 +++++++++++-------- 1 file changed, 31 insertions(+), 23 deletions(-) diff --git a/website/docs/docs/core/connect-data-platform/bigquery-setup.md b/website/docs/docs/core/connect-data-platform/bigquery-setup.md index 45b35487e5e..a4cf813499e 100644 --- a/website/docs/docs/core/connect-data-platform/bigquery-setup.md +++ b/website/docs/docs/core/connect-data-platform/bigquery-setup.md @@ -13,10 +13,12 @@ meta: slack_channel_link: 'https://getdbt.slack.com/archives/C99SNSRTK' platform_name: 'Big Query' config_page: '/reference/resource-configs/bigquery-configs' + addl_frontmatter: 'hello world' --- + import SetUpPages from '/snippets/_setup-pages-intro.md'; @@ -33,25 +35,16 @@ You need to have the required [BigQuery permissions](https://cloud.google.com/bi This set of permissions will permit dbt users to read from and create tables and views in a BigQuery project. ## Authentication methods -You can specify BigQuery targets using one of four methods: - -| Auth method | Description | Supported | -| ----------- | ----------- | --------- | -| OAuth via gcloud | Recommended for local development - +You can specify BigQuery targets using four methods. BigQuery targets should be set up using the following configuration in your `profiles.yml` file. There are a number of [optional configurations](#optional-configurations) you may specify as well. 1. [OAuth via `gcloud`](#oauth-via-gcloud) -2. [OAuth token-based](#oauth-token-based) -3. [service account file](#service-account-file) -4. [service account json](#service-account-json) +2. [OAuth token-based](#oauth-token-based) +3. [Service account file](#service-account-file) +4. [Service account json](#service-account-json) :::tip -For local development, we recommend using the oauth method. If you're scheduling dbt on a server, you should use the service account auth method instead. +For local development, we recommend using the OAuth method. If you're scheduling dbt on a server, you should use the service account auth method instead. ::: - - -BigQuery targets should be set up using the following configuration in your `profiles.yml` file. 
There are a number of [optional configurations](#optional-configurations) you may specify as well. - ### OAuth via gcloud This connection method requires [local OAuth via `gcloud`](#local-oauth-gcloud-setup). @@ -79,7 +72,7 @@ my-bigquery-db: New in dbt v0.19.0 -If you do not specify a `project`/`database` and are using the `oauth` method, dbt will use the default `project` associated with your user, as defined by `gcloud config set`. +If you do not specify a `project`/`database` and are using the `OAuth` method, dbt will use the default `project` associated with your user, as defined by `gcloud config set`. ### OAuth Token-Based @@ -208,6 +201,17 @@ my-bigquery-db: ## Optional configurations +Use the following optional configurations to specify BigQuery targets in your `profiles.yml` file: + +- [**Priority**](#priority) — Configure the priority of dbt's BigQuery jobs using the `priority` configuration in your BigQuery profile. +- [**Timeouts and retries**](#timeouts-and-retries) — The dbt-bigquery plugin utilizes the BigQuery Python client library to submit queries, which involves job creation and execution. +- [**Dataset locations**](#dataset-locations) — Configure the location of BigQuery datasets using the location configuration in a BigQuery profile, specifying either a multi-regional location. +- [**Maximum bytes billed**](#maximum-bytes-billed) — Set maximum_bytes_billed value in a BigQuery profile to ensure queries don't exceed the configured threshold. +- [**OAuth 2.0 scopes for Google APIs**](#oauth-20-scopes-for-google-apis) — Use the scopes profile configuration to set up your own OAuth scopes for dbt +- [**Service Account impersonation**](#service-account-impersonation) — Authenticate with local OAuth to access to BigQuery resources based on the service account permissions. +- [**Execution project**](#execution-project) — You can optionally specify an execution_project for query execution billing. +- [**Running Python models on Dataproc**](#running-python-models-on-dataproc) — Utilize the integrated services of Dataproc and Cloud Storage to run dbt Python models in GCP. + ### Priority The `priority` for the BigQuery jobs that dbt executes can be configured with the `priority` configuration in your BigQuery profile. The `priority` field can be set to one of `batch` or `interactive`. For more information on query priority, consult the [BigQuery documentation](https://cloud.google.com/bigquery/docs/running-queries). @@ -234,7 +238,8 @@ The `dbt-bigquery` plugin uses the BigQuery Python client library to submit quer Some queries inevitably fail, at different points in process. To handle these cases, dbt supports fine-grained configuration for query timeouts and retries. -#### job_execution_timeout_seconds + + Use the `job_execution_timeout_seconds` configuration to set the number of seconds dbt should wait for queries to complete, after being submitted successfully. Of the four configurations that control timeout and retries, this one is the most common to use. @@ -263,14 +268,14 @@ my-profile: dataset: my_dataset job_execution_timeout_seconds: 600 # 10 minutes ``` - -#### job_creation_timeout_seconds + + It is also possible for a query job to fail to submit in the first place. You can configure the maximum timeout for the job creation step by configuring `job_creation_timeout_seconds`. No timeout is set by default. In the job creation step, dbt is simply submitting a query job to BigQuery's `Jobs.Insert` API, and receiving a query job ID in return. 
It should take a few seconds at most. In some rare situations, it could take longer. - -#### job_retries + + Google's BigQuery Python client has native support for retrying query jobs that time out, or queries that run into transient errors and are likely to succeed if run again. You can configure the maximum number of retries by configuring `job_retries`. @@ -282,7 +287,8 @@ In older versions of `dbt-bigquery`, the `job_retries` config was just called `r The default value is 1, meaning that dbt will retry failing queries exactly once. You can set the configuration to 0 to disable retries entirely. -#### job_retry_deadline_seconds + + After a query job times out, or encounters a transient error, dbt will wait one second before retrying the same query. In cases where queries are repeatedly timing out, this can add up to a long wait. You can set the `job_retry_deadline_seconds` configuration to set the total number of seconds you're willing to wait ("deadline") while retrying the same query. If dbt hits the deadline, it will give up and return an error. @@ -305,9 +311,11 @@ my-profile: job_retry_deadline_seconds: 1200 ``` - + + + @@ -504,7 +512,7 @@ my-profile: ## Local OAuth gcloud setup -To connect to BigQuery using the `oauth` method, follow these steps: +To connect to BigQuery using the `OAuth` method, follow these steps: 1. Make sure the `gcloud` command is [installed on your computer](https://cloud.google.com/sdk/downloads) 2. Activate the application-default account with From f6ed2831e4de4fc93087cb2796b2677a80289d42 Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Thu, 29 Jun 2023 15:33:40 +0100 Subject: [PATCH 09/25] test --- .../docs/docs/core/connect-data-platform/bigquery-setup.md | 5 ++++- 1 file changed, 4 insertions(+), 1 deletion(-) diff --git a/website/docs/docs/core/connect-data-platform/bigquery-setup.md b/website/docs/docs/core/connect-data-platform/bigquery-setup.md index a4cf813499e..6c947edc684 100644 --- a/website/docs/docs/core/connect-data-platform/bigquery-setup.md +++ b/website/docs/docs/core/connect-data-platform/bigquery-setup.md @@ -13,11 +13,12 @@ meta: slack_channel_link: 'https://getdbt.slack.com/archives/C99SNSRTK' platform_name: 'Big Query' config_page: '/reference/resource-configs/bigquery-configs' - addl_frontmatter: 'hello world' + addl_frontmatter: 'a link world' --- +to join this channel, go to {frontMatter.meta.addl_frontmatter} for more info @@ -527,3 +528,5 @@ https://www.googleapis.com/auth/iam.test A browser window should open, and you should be prompted to log into your Google account. Once you've done that, dbt will use your OAuth'd credentials to connect to BigQuery! This command uses the `--scopes` flag to request access to Google Sheets. This makes it possible to transform data in Google Sheets using dbt. If your dbt project does not transform data in Google Sheets, then you may omit the `--scopes` flag. 
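When you don't transform data in Google Sheets, the activation step above is simpler; roughly:

```bash
# Activate application-default credentials with the default scopes
# (the --scopes flag is only needed for Google Sheets access)
gcloud auth application-default login

# Optional: confirm which project the credentials will default to
gcloud config list project
```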
+ + From 54701730cad1fa6fd08a97ccc9cd66736f1f581c Mon Sep 17 00:00:00 2001 From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com> Date: Thu, 16 Nov 2023 15:16:47 -0500 Subject: [PATCH 10/25] Updating pip instructions --- .../2021-11-29-open-source-community-growth.md | 2 +- website/blog/2022-04-14-add-ci-cd-to-bitbucket.md | 4 ++-- ...king-dbt-cloud-api-calls-using-dbt-cloud-cli.md | 4 ++-- ...uilding-a-kimball-dimensional-model-with-dbt.md | 4 ++-- .../semantic-layer-2-setup.md | 4 ++-- website/docs/docs/build/metricflow-commands.md | 4 ++-- website/docs/docs/cloud/cloud-cli-installation.md | 10 +++++----- website/docs/docs/connect-adapters.md | 2 +- .../core/connect-data-platform/athena-setup.md | 2 +- .../connect-data-platform/azuresynapse-setup.md | 2 +- .../core/connect-data-platform/bigquery-setup.md | 2 +- .../core/connect-data-platform/clickhouse-setup.md | 2 +- .../core/connect-data-platform/databend-setup.md | 2 +- .../core/connect-data-platform/databricks-setup.md | 2 +- .../core/connect-data-platform/decodable-setup.md | 2 +- .../docs/core/connect-data-platform/doris-setup.md | 2 +- .../core/connect-data-platform/dremio-setup.md | 2 +- .../core/connect-data-platform/duckdb-setup.md | 2 +- .../core/connect-data-platform/exasol-setup.md | 2 +- .../core/connect-data-platform/fabric-setup.md | 2 +- .../docs/core/connect-data-platform/fal-setup.md | 2 +- .../core/connect-data-platform/firebolt-setup.md | 2 +- .../docs/core/connect-data-platform/glue-setup.md | 4 ++-- .../core/connect-data-platform/greenplum-setup.md | 2 +- .../docs/core/connect-data-platform/hive-setup.md | 4 ++-- .../core/connect-data-platform/ibmdb2-setup.md | 2 +- .../core/connect-data-platform/impala-setup.md | 2 +- .../docs/core/connect-data-platform/infer-setup.md | 2 +- .../core/connect-data-platform/iomete-setup.md | 2 +- .../docs/core/connect-data-platform/layer-setup.md | 2 +- .../connect-data-platform/materialize-setup.md | 2 +- .../core/connect-data-platform/mindsdb-setup.md | 2 +- .../docs/core/connect-data-platform/mssql-setup.md | 2 +- .../docs/core/connect-data-platform/mysql-setup.md | 2 +- .../core/connect-data-platform/oracle-setup.md | 2 +- .../core/connect-data-platform/postgres-setup.md | 2 +- .../core/connect-data-platform/redshift-setup.md | 2 +- .../core/connect-data-platform/rockset-setup.md | 2 +- .../connect-data-platform/singlestore-setup.md | 2 +- .../core/connect-data-platform/snowflake-setup.md | 2 +- .../docs/core/connect-data-platform/spark-setup.md | 8 ++++---- .../core/connect-data-platform/sqlite-setup.md | 2 +- .../core/connect-data-platform/starrocks-setup.md | 2 +- .../core/connect-data-platform/teradata-setup.md | 2 +- .../docs/core/connect-data-platform/tidb-setup.md | 2 +- .../docs/core/connect-data-platform/trino-setup.md | 4 ++-- .../core/connect-data-platform/upsolver-setup.md | 2 +- .../core/connect-data-platform/vertica-setup.md | 2 +- website/docs/docs/core/docker-install.md | 2 +- website/docs/docs/core/pip-install.md | 14 +++++++------- website/docs/docs/core/source-install.md | 8 ++++---- .../core-upgrade/08-upgrading-to-v1.0.md | 2 +- .../docs/faqs/Core/install-pip-best-practices.md | 2 +- website/docs/faqs/Core/install-pip-os-prereqs.md | 2 +- website/docs/guides/adapter-creation.md | 2 +- website/docs/guides/airflow-and-dbt-cloud.md | 2 +- website/docs/guides/codespace-qs.md | 2 +- website/docs/guides/custom-cicd-pipelines.md | 6 +++--- website/docs/guides/set-up-ci.md | 6 +++--- website/docs/guides/sl-migration.md | 4 ++-- 
website/snippets/_sl-test-and-query-metrics.md | 4 ++-- 61 files changed, 91 insertions(+), 91 deletions(-) diff --git a/website/blog/2021-11-29-open-source-community-growth.md b/website/blog/2021-11-29-open-source-community-growth.md index 8a71a504875..98b64cefa3d 100644 --- a/website/blog/2021-11-29-open-source-community-growth.md +++ b/website/blog/2021-11-29-open-source-community-growth.md @@ -57,7 +57,7 @@ For starters, I want to know how much conversation is occurring across the vario There are a ton of metrics that can be tracked in any GitHub project — committers, pull requests, forks, releases — but I started pretty simple. For each of the projects we participate in, I just want to know how the number of GitHub stars grows over time, and whether the growth is accelerating or flattening out. This has become a key performance indicator for open source communities, for better or for worse, and keeping track of it isn't optional. -Finally, I want to know how much Marquez and OpenLineage are being used. It used to be that when you wanted to consume a bit of tech, you'd download a file. Folks like me who study user behavior would track download counts as if they were stock prices. This is no longer the case; today, our tech is increasingly distributed through package managers and image repositories. Docker Hub and PyPI metrics have therefore become good indicators of consumption. Docker image pulls and runs of `pip install` are the modern day download and, as noisy as these metrics are, they indicate a similar level of user commitment. +Finally, I want to know how much Marquez and OpenLineage are being used. It used to be that when you wanted to consume a bit of tech, you'd download a file. Folks like me who study user behavior would track download counts as if they were stock prices. This is no longer the case; today, our tech is increasingly distributed through package managers and image repositories. Docker Hub and PyPI metrics have therefore become good indicators of consumption. Docker image pulls and runs of `python -m pip install` are the modern day download and, as noisy as these metrics are, they indicate a similar level of user commitment. To summarize, here are the metrics I decided to track (for now, anyway): - Slack messages (by user/ by community) diff --git a/website/blog/2022-04-14-add-ci-cd-to-bitbucket.md b/website/blog/2022-04-14-add-ci-cd-to-bitbucket.md index 451013b1572..44346e93741 100644 --- a/website/blog/2022-04-14-add-ci-cd-to-bitbucket.md +++ b/website/blog/2022-04-14-add-ci-cd-to-bitbucket.md @@ -159,7 +159,7 @@ pipelines: artifacts: # Save the dbt run artifacts for the next step (upload) - target/*.json script: - - pip install -r requirements.txt + - python -m pip install -r requirements.txt - mkdir ~/.dbt - cp .ci/profiles.yml ~/.dbt/profiles.yml - dbt deps @@ -208,7 +208,7 @@ pipelines: # Set up dbt environment + dbt packages. 
Rather than passing # profiles.yml to dbt commands explicitly, we'll store it where dbt # expects it: - - pip install -r requirements.txt + - python -m pip install -r requirements.txt - mkdir ~/.dbt - cp .ci/profiles.yml ~/.dbt/profiles.yml - dbt deps diff --git a/website/blog/2022-05-03-making-dbt-cloud-api-calls-using-dbt-cloud-cli.md b/website/blog/2022-05-03-making-dbt-cloud-api-calls-using-dbt-cloud-cli.md index 2ee774d4f1d..6758a28638c 100644 --- a/website/blog/2022-05-03-making-dbt-cloud-api-calls-using-dbt-cloud-cli.md +++ b/website/blog/2022-05-03-making-dbt-cloud-api-calls-using-dbt-cloud-cli.md @@ -59,7 +59,7 @@ You probably agree that the latter example is definitely more elegant and easier In addition to CLI commands that interact with a single dbt Cloud API endpoint there are composite helper commands that call one or more API endpoints and perform more complex operations. One example of composite commands are `dbt-cloud job export` and `dbt-cloud job import` where, under the hood, the export command performs a `dbt-cloud job get` and writes the job metadata to a file and the import command reads job parameters from a JSON file and calls `dbt-cloud job create`. The export and import commands can be used in tandem to move dbt Cloud jobs between projects. Another example is the `dbt-cloud job delete-all` which fetches a list of all jobs using `dbt-cloud job list` and then iterates over the list prompting the user if they want to delete the job. For each job that the user agrees to delete a `dbt-cloud job delete` is performed. -To install the CLI in your Python environment run `pip install dbt-cloud-cli` and you’re all set. You can use it locally in your development environment or e.g. in a GitHub actions workflow. +To install the CLI in your Python environment run `python -m pip install dbt-cloud-cli` and you’re all set. You can use it locally in your development environment or e.g. in a GitHub actions workflow. ## How the project came to be @@ -310,7 +310,7 @@ The `CatalogExploreCommand.execute` method implements the interactive exploratio I’ve included the app in the latest version of dbt-cloud-cli so you can test it out yourself! 
To use the app you need install dbt-cloud-cli with extra dependencies: ```bash -pip install dbt-cloud-cli[demo] +python -m pip install dbt-cloud-cli[demo] ``` Now you can the run app: diff --git a/website/blog/2023-04-18-building-a-kimball-dimensional-model-with-dbt.md b/website/blog/2023-04-18-building-a-kimball-dimensional-model-with-dbt.md index 3ca1f6ac2a9..ab364749eff 100644 --- a/website/blog/2023-04-18-building-a-kimball-dimensional-model-with-dbt.md +++ b/website/blog/2023-04-18-building-a-kimball-dimensional-model-with-dbt.md @@ -79,12 +79,12 @@ Depending on which database you’ve chosen, install the relevant database adapt ```text # install adaptor for duckdb -pip install dbt-duckdb +python -m pip install dbt-duckdb # OR # install adaptor for postgresql -pip install dbt-postgres +python -m pip install dbt-postgres ``` ### Step 4: Setup dbt profile diff --git a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md index ffbd78b939c..6e9153a3780 100644 --- a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md +++ b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-2-setup.md @@ -23,8 +23,8 @@ We'll use pip to install MetricFlow and our dbt adapter: python -m venv [virtual environment name] source [virtual environment name]/bin/activate # install dbt and MetricFlow -pip install "dbt-metricflow[adapter name]" -# e.g. pip install "dbt-metricflow[snowflake]" +python -m pip install "dbt-metricflow[adapter name]" +# e.g. python -m pip install "dbt-metricflow[snowflake]" ``` Lastly, to get to the pre-Semantic Layer starting state, checkout the `start-here` branch. diff --git a/website/docs/docs/build/metricflow-commands.md b/website/docs/docs/build/metricflow-commands.md index 4d2477ad2ed..67589c07836 100644 --- a/website/docs/docs/build/metricflow-commands.md +++ b/website/docs/docs/build/metricflow-commands.md @@ -17,7 +17,7 @@ MetricFlow is compatible with Python versions 3.8, 3.9, 3.10, and 3.11. MetricFlow is a dbt package that allows you to define and query metrics in your dbt project. You can use MetricFlow to query metrics in your dbt project in the dbt Cloud CLI, dbt Cloud IDE, or dbt Core. -**Note** — MetricFlow commands aren't supported in dbt Cloud jobs yet. However, you can add MetricFlow validations with your git provider (such as GitHub Actions) by installing MetricFlow (`pip install metricflow`). This allows you to run MetricFlow commands as part of your continuous integration checks on PRs. +**Note** — MetricFlow commands aren't supported in dbt Cloud jobs yet. However, you can add MetricFlow validations with your git provider (such as GitHub Actions) by installing MetricFlow (`python -m pip install metricflow`). This allows you to run MetricFlow commands as part of your continuous integration checks on PRs. @@ -54,7 +54,7 @@ You can install [MetricFlow](https://github.com/dbt-labs/metricflow#getting-star 1. Create or activate your virtual environment `python -m venv venv` 2. Run `pip install dbt-metricflow` - * You can install MetricFlow using PyPI as an extension of your dbt adapter in the command line. To install the adapter, run `pip install "dbt-metricflow[your_adapter_name]"` and add the adapter name at the end of the command. For example, for a Snowflake adapter run `pip install "dbt-metricflow[snowflake]"` + * You can install MetricFlow using PyPI as an extension of your dbt adapter in the command line. 
To install the adapter, run `python -m pip install "dbt-metricflow[your_adapter_name]"` and add the adapter name at the end of the command. For example, for a Snowflake adapter run `python -m pip install "dbt-metricflow[snowflake]"` **Note**, you'll need to manage versioning between dbt Core, your adapter, and MetricFlow. diff --git a/website/docs/docs/cloud/cloud-cli-installation.md b/website/docs/docs/cloud/cloud-cli-installation.md index 896b3c92f75..6c11a2250a9 100644 --- a/website/docs/docs/cloud/cloud-cli-installation.md +++ b/website/docs/docs/cloud/cloud-cli-installation.md @@ -155,13 +155,13 @@ If you already have dbt Core installed, the dbt Cloud CLI may conflict. Here are - Uninstall the dbt Cloud CLI using the command: `pip uninstall dbt` - Reinstall dbt Core using the following command, replacing "adapter_name" with the appropriate adapter name: ```shell - pip install dbt-adapter_name --force-reinstall + python -m pip install dbt-adapter_name --force-reinstall ``` - For example, if I used Snowflake as an adapter, I would run: `pip install dbt-snowflake --force-reinstall` + For example, if I used Snowflake as an adapter, I would run: `python -m pip install dbt-snowflake --force-reinstall` -------- -Before installing the dbt Cloud CLI, make sure you have Python installed and your virtual environment venv or pyenv . If you already have a Python environment configured, you can skip to the [pip installation step](#install-dbt-cloud-cli-in-pip). +Before installing the dbt Cloud CLI, make sure you have Python installed and your virtual environment venv or pyenv . If you already have a Python environment configured, you can skip to the [python -m pip installation step](#install-dbt-cloud-cli-in-pip). ### Install a virtual environment @@ -200,7 +200,7 @@ We recommend using virtual environments (venv) to namespace `cloud-cli`. ```bash pip3 uninstall dbt-core dbt - pip install dbt-adapter_name --force-reinstall + python -m pip install dbt-adapter_name --force-reinstall ``` 4. Clone your repository to your local computer using `git clone`. For example, to clone a GitHub repo using HTTPS format, run `git clone https://github.com/YOUR-USERNAME/YOUR-REPOSITORY`. @@ -243,7 +243,7 @@ To update, follow the same process explained in [Windows](/docs/cloud/cloud-cli- To update: - Make sure you're in your virtual environment -- Run `pip install --upgrade dbt`. +- Run `python -m pip install --upgrade dbt`. diff --git a/website/docs/docs/connect-adapters.md b/website/docs/docs/connect-adapters.md index e301cfc237e..6ccc1b4f376 100644 --- a/website/docs/docs/connect-adapters.md +++ b/website/docs/docs/connect-adapters.md @@ -15,7 +15,7 @@ Explore the fastest and most reliable way to deploy dbt using dbt Cloud, a hoste Install dbt Core, an open-source tool, locally using the command line. dbt communicates with a number of different data platforms by using a dedicated adapter plugin for each. When you install dbt Core, you'll also need to install the specific adapter for your database, [connect to dbt Core](/docs/core/about-core-setup), and set up a `profiles.yml` file. -With a few exceptions [^1], you can install all [Verified adapters](/docs/supported-data-platforms) from PyPI using `pip install adapter-name`. For example to install Snowflake, use the command `pip install dbt-snowflake`. The installation will include `dbt-core` and any other required dependencies, which may include both other dependencies and even other adapter plugins. Read more about [installing dbt](/docs/core/installation). 
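Taken together, the dbt Cloud CLI conflict-recovery steps described above look roughly like this, with `dbt-snowflake` as a stand-in adapter:

```bash
# Remove the conflicting dbt Cloud CLI entry point
pip uninstall dbt

# Reinstall dbt Core with your adapter (dbt-snowflake is only an example)
python -m pip install dbt-snowflake --force-reinstall

# Verify which dbt is now on PATH
dbt --version
```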
+With a few exceptions [^1], you can install all [Verified adapters](/docs/supported-data-platforms) from PyPI using `python -m pip install adapter-name`. For example to install Snowflake, use the command `python -m pip install dbt-snowflake`. The installation will include `dbt-core` and any other required dependencies, which may include both other dependencies and even other adapter plugins. Read more about [installing dbt](/docs/core/installation). [^1]: Here are the two different adapters. Use the PyPI package name when installing with `pip` diff --git a/website/docs/docs/core/connect-data-platform/athena-setup.md b/website/docs/docs/core/connect-data-platform/athena-setup.md index db218110dc1..9ee64e38009 100644 --- a/website/docs/docs/core/connect-data-platform/athena-setup.md +++ b/website/docs/docs/core/connect-data-platform/athena-setup.md @@ -32,7 +32,7 @@ meta: pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/azuresynapse-setup.md b/website/docs/docs/core/connect-data-platform/azuresynapse-setup.md index 073e95530c1..b78f583f073 100644 --- a/website/docs/docs/core/connect-data-platform/azuresynapse-setup.md +++ b/website/docs/docs/core/connect-data-platform/azuresynapse-setup.md @@ -41,7 +41,7 @@ Refer to [Microsoft Fabric Synapse Data Warehouse](/docs/core/connect-data-platf pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/bigquery-setup.md b/website/docs/docs/core/connect-data-platform/bigquery-setup.md index 96eafadea3b..f352bed8ace 100644 --- a/website/docs/docs/core/connect-data-platform/bigquery-setup.md +++ b/website/docs/docs/core/connect-data-platform/bigquery-setup.md @@ -35,7 +35,7 @@ meta: pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/clickhouse-setup.md b/website/docs/docs/core/connect-data-platform/clickhouse-setup.md index fb0965398a2..83356c01404 100644 --- a/website/docs/docs/core/connect-data-platform/clickhouse-setup.md +++ b/website/docs/docs/core/connect-data-platform/clickhouse-setup.md @@ -36,7 +36,7 @@ Some core functionality may be limited. If you're interested in contributing, ch pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/databend-setup.md b/website/docs/docs/core/connect-data-platform/databend-setup.md index daccd14f6c3..70237dfea07 100644 --- a/website/docs/docs/core/connect-data-platform/databend-setup.md +++ b/website/docs/docs/core/connect-data-platform/databend-setup.md @@ -40,7 +40,7 @@ If you're interested in contributing, check out the source code repository liste pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/databricks-setup.md b/website/docs/docs/core/connect-data-platform/databricks-setup.md index caf52d09de3..b9b68290578 100644 --- a/website/docs/docs/core/connect-data-platform/databricks-setup.md +++ b/website/docs/docs/core/connect-data-platform/databricks-setup.md @@ -36,7 +36,7 @@ meta: pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/decodable-setup.md b/website/docs/docs/core/connect-data-platform/decodable-setup.md index b43521732d4..17d1ff9a6fd 100644 --- a/website/docs/docs/core/connect-data-platform/decodable-setup.md +++ b/website/docs/docs/core/connect-data-platform/decodable-setup.md @@ -39,7 +39,7 @@ Some core functionality may be limited. If you're interested in contributing, se dbt-decodable is also available on PyPI. pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/doris-setup.md b/website/docs/docs/core/connect-data-platform/doris-setup.md index a7e2ba1ba3e..92526c6d786 100644 --- a/website/docs/docs/core/connect-data-platform/doris-setup.md +++ b/website/docs/docs/core/connect-data-platform/doris-setup.md @@ -33,7 +33,7 @@ meta: pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/dremio-setup.md b/website/docs/docs/core/connect-data-platform/dremio-setup.md index fa6ca154fcd..39ab959789e 100644 --- a/website/docs/docs/core/connect-data-platform/dremio-setup.md +++ b/website/docs/docs/core/connect-data-platform/dremio-setup.md @@ -39,7 +39,7 @@ Some core functionality may be limited. If you're interested in contributing, ch pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/duckdb-setup.md b/website/docs/docs/core/connect-data-platform/duckdb-setup.md index a3fee5a5164..0b1fd57ce86 100644 --- a/website/docs/docs/core/connect-data-platform/duckdb-setup.md +++ b/website/docs/docs/core/connect-data-platform/duckdb-setup.md @@ -39,7 +39,7 @@ Some core functionality may be limited. If you're interested in contributing, ch pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/exasol-setup.md b/website/docs/docs/core/connect-data-platform/exasol-setup.md index 2bf4cd7ffac..c8f7a50c3bf 100644 --- a/website/docs/docs/core/connect-data-platform/exasol-setup.md +++ b/website/docs/docs/core/connect-data-platform/exasol-setup.md @@ -39,7 +39,7 @@ Some core functionality may be limited. If you're interested in contributing, ch pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/fabric-setup.md b/website/docs/docs/core/connect-data-platform/fabric-setup.md index aa7784d96ec..ce7a41b3e29 100644 --- a/website/docs/docs/core/connect-data-platform/fabric-setup.md +++ b/website/docs/docs/core/connect-data-platform/fabric-setup.md @@ -37,7 +37,7 @@ To learn how to set up dbt with Azure Synapse Dedicated Pools, see [Microsoft Az pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/fal-setup.md b/website/docs/docs/core/connect-data-platform/fal-setup.md index ef4998e8c1b..79c269f7403 100644 --- a/website/docs/docs/core/connect-data-platform/fal-setup.md +++ b/website/docs/docs/core/connect-data-platform/fal-setup.md @@ -39,7 +39,7 @@ Some core functionality may be limited. If you're interested in contributing, ch pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package}[<sql-adapter>] +python -m pip install {frontMatter.meta.pypi_package}[<sql-adapter>]

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/firebolt-setup.md b/website/docs/docs/core/connect-data-platform/firebolt-setup.md index c7a5a543512..6bccfb8dcfe 100644 --- a/website/docs/docs/core/connect-data-platform/firebolt-setup.md +++ b/website/docs/docs/core/connect-data-platform/firebolt-setup.md @@ -37,7 +37,7 @@ Some core functionality may be limited. If you're interested in contributing, ch pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/glue-setup.md b/website/docs/docs/core/connect-data-platform/glue-setup.md index e56e5bcd902..8d379afae58 100644 --- a/website/docs/docs/core/connect-data-platform/glue-setup.md +++ b/website/docs/docs/core/connect-data-platform/glue-setup.md @@ -40,7 +40,7 @@ Some core functionality may be limited. If you're interested in contributing, ch pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    @@ -210,7 +210,7 @@ Configure a Python virtual environment to isolate package version and code depen $ sudo yum install git $ python3 -m venv dbt_venv $ source dbt_venv/bin/activate -$ python3 -m pip install --upgrade pip +$ python3 -m python -m pip install --upgrade pip ``` Configure the last version of AWS CLI diff --git a/website/docs/docs/core/connect-data-platform/greenplum-setup.md b/website/docs/docs/core/connect-data-platform/greenplum-setup.md index 06ada19a1e9..29050e879e3 100644 --- a/website/docs/docs/core/connect-data-platform/greenplum-setup.md +++ b/website/docs/docs/core/connect-data-platform/greenplum-setup.md @@ -34,7 +34,7 @@ meta: pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/hive-setup.md b/website/docs/docs/core/connect-data-platform/hive-setup.md index 61a929c58da..9433b830592 100644 --- a/website/docs/docs/core/connect-data-platform/hive-setup.md +++ b/website/docs/docs/core/connect-data-platform/hive-setup.md @@ -34,7 +34,7 @@ meta: pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    @@ -154,7 +154,7 @@ you must install the `dbt-hive` plugin. The following commands will install the latest version of `dbt-hive` as well as the requisite version of `dbt-core` and `impyla` driver used for connections. ``` -pip install dbt-hive +python -m pip install dbt-hive ``` ### Supported Functionality diff --git a/website/docs/docs/core/connect-data-platform/ibmdb2-setup.md b/website/docs/docs/core/connect-data-platform/ibmdb2-setup.md index cb6c7459418..ff3ab1236aa 100644 --- a/website/docs/docs/core/connect-data-platform/ibmdb2-setup.md +++ b/website/docs/docs/core/connect-data-platform/ibmdb2-setup.md @@ -40,7 +40,7 @@ Some core functionality may be limited. If you're interested in contributing, ch pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/impala-setup.md b/website/docs/docs/core/connect-data-platform/impala-setup.md index 0a0f1b955a1..d9df9c89742 100644 --- a/website/docs/docs/core/connect-data-platform/impala-setup.md +++ b/website/docs/docs/core/connect-data-platform/impala-setup.md @@ -34,7 +34,7 @@ meta: pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/infer-setup.md b/website/docs/docs/core/connect-data-platform/infer-setup.md index 430c5e47f85..47fe2aa5647 100644 --- a/website/docs/docs/core/connect-data-platform/infer-setup.md +++ b/website/docs/docs/core/connect-data-platform/infer-setup.md @@ -34,7 +34,7 @@ meta: pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/iomete-setup.md b/website/docs/docs/core/connect-data-platform/iomete-setup.md index bc015141c85..1c71df577a8 100644 --- a/website/docs/docs/core/connect-data-platform/iomete-setup.md +++ b/website/docs/docs/core/connect-data-platform/iomete-setup.md @@ -36,7 +36,7 @@ meta: pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/layer-setup.md b/website/docs/docs/core/connect-data-platform/layer-setup.md index f065c0c7313..4b65a58f5b5 100644 --- a/website/docs/docs/core/connect-data-platform/layer-setup.md +++ b/website/docs/docs/core/connect-data-platform/layer-setup.md @@ -36,7 +36,7 @@ meta: pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/materialize-setup.md b/website/docs/docs/core/connect-data-platform/materialize-setup.md index c8777c29490..b2b9c870f86 100644 --- a/website/docs/docs/core/connect-data-platform/materialize-setup.md +++ b/website/docs/docs/core/connect-data-platform/materialize-setup.md @@ -39,7 +39,7 @@ Certain core functionality may vary. If you would like to report a bug, request pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/mindsdb-setup.md b/website/docs/docs/core/connect-data-platform/mindsdb-setup.md index e6b8c5decaa..d1b1fd87499 100644 --- a/website/docs/docs/core/connect-data-platform/mindsdb-setup.md +++ b/website/docs/docs/core/connect-data-platform/mindsdb-setup.md @@ -39,7 +39,7 @@ The dbt-mindsdb package allows dbt to connect to [MindsDB](https://github.com/mi pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/mssql-setup.md b/website/docs/docs/core/connect-data-platform/mssql-setup.md index 5efcc454823..3fa6cc22738 100644 --- a/website/docs/docs/core/connect-data-platform/mssql-setup.md +++ b/website/docs/docs/core/connect-data-platform/mssql-setup.md @@ -40,7 +40,7 @@ Some core functionality may be limited. If you're interested in contributing, ch pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/mysql-setup.md b/website/docs/docs/core/connect-data-platform/mysql-setup.md index 1df6e205272..2cf439aa17a 100644 --- a/website/docs/docs/core/connect-data-platform/mysql-setup.md +++ b/website/docs/docs/core/connect-data-platform/mysql-setup.md @@ -39,7 +39,7 @@ Some core functionality may be limited. If you're interested in contributing, ch pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/oracle-setup.md b/website/docs/docs/core/connect-data-platform/oracle-setup.md index b1195fbd0a0..1209df207af 100644 --- a/website/docs/docs/core/connect-data-platform/oracle-setup.md +++ b/website/docs/docs/core/connect-data-platform/oracle-setup.md @@ -36,7 +36,7 @@ meta: pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/postgres-setup.md b/website/docs/docs/core/connect-data-platform/postgres-setup.md index f56d3f22576..8f9ed2f7915 100644 --- a/website/docs/docs/core/connect-data-platform/postgres-setup.md +++ b/website/docs/docs/core/connect-data-platform/postgres-setup.md @@ -36,7 +36,7 @@ meta: pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/redshift-setup.md b/website/docs/docs/core/connect-data-platform/redshift-setup.md index 175d5f6a715..7692ebfbc78 100644 --- a/website/docs/docs/core/connect-data-platform/redshift-setup.md +++ b/website/docs/docs/core/connect-data-platform/redshift-setup.md @@ -36,7 +36,7 @@ meta: pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/rockset-setup.md b/website/docs/docs/core/connect-data-platform/rockset-setup.md index 4a146829a03..9f1ddc51da3 100644 --- a/website/docs/docs/core/connect-data-platform/rockset-setup.md +++ b/website/docs/docs/core/connect-data-platform/rockset-setup.md @@ -40,7 +40,7 @@ Certain core functionality may vary. If you would like to report a bug, request pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/singlestore-setup.md b/website/docs/docs/core/connect-data-platform/singlestore-setup.md index a63466542a9..5673c640cc5 100644 --- a/website/docs/docs/core/connect-data-platform/singlestore-setup.md +++ b/website/docs/docs/core/connect-data-platform/singlestore-setup.md @@ -42,7 +42,7 @@ Certain core functionality may vary. If you would like to report a bug, request pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/snowflake-setup.md b/website/docs/docs/core/connect-data-platform/snowflake-setup.md index 98bcf447fed..0315547ad53 100644 --- a/website/docs/docs/core/connect-data-platform/snowflake-setup.md +++ b/website/docs/docs/core/connect-data-platform/snowflake-setup.md @@ -36,7 +36,7 @@ meta: pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/spark-setup.md b/website/docs/docs/core/connect-data-platform/spark-setup.md index 895f0559953..6646f37a770 100644 --- a/website/docs/docs/core/connect-data-platform/spark-setup.md +++ b/website/docs/docs/core/connect-data-platform/spark-setup.md @@ -41,7 +41,7 @@ See [Databricks setup](#databricks-setup) for the Databricks version of this pag pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    @@ -51,15 +51,15 @@ If connecting to a Spark cluster via the generic thrift or http methods, it requ ```zsh # odbc connections -$ pip install "dbt-spark[ODBC]" +$ python -m pip install "dbt-spark[ODBC]" # thrift or http connections -$ pip install "dbt-spark[PyHive]" +$ python -m pip install "dbt-spark[PyHive]" ``` ```zsh # session connections -$ pip install "dbt-spark[session]" +$ python -m pip install "dbt-spark[session]" ```

    Configuring {frontMatter.meta.pypi_package}

    diff --git a/website/docs/docs/core/connect-data-platform/sqlite-setup.md b/website/docs/docs/core/connect-data-platform/sqlite-setup.md index 3da902a6f80..2912e26bd07 100644 --- a/website/docs/docs/core/connect-data-platform/sqlite-setup.md +++ b/website/docs/docs/core/connect-data-platform/sqlite-setup.md @@ -40,7 +40,7 @@ Some core functionality may be limited. If you're interested in contributing, ch pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/starrocks-setup.md b/website/docs/docs/core/connect-data-platform/starrocks-setup.md index e5c1abac037..485e1d18fb7 100644 --- a/website/docs/docs/core/connect-data-platform/starrocks-setup.md +++ b/website/docs/docs/core/connect-data-platform/starrocks-setup.md @@ -34,7 +34,7 @@ meta: pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/teradata-setup.md b/website/docs/docs/core/connect-data-platform/teradata-setup.md index 1ba8e506b88..539ba94e38e 100644 --- a/website/docs/docs/core/connect-data-platform/teradata-setup.md +++ b/website/docs/docs/core/connect-data-platform/teradata-setup.md @@ -37,7 +37,7 @@ Some core functionality may be limited. If you're interested in contributing, ch pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/tidb-setup.md b/website/docs/docs/core/connect-data-platform/tidb-setup.md index e2205c4665e..c7820ea005d 100644 --- a/website/docs/docs/core/connect-data-platform/tidb-setup.md +++ b/website/docs/docs/core/connect-data-platform/tidb-setup.md @@ -42,7 +42,7 @@ If you're interested in contributing, check out the source code repository liste pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/trino-setup.md b/website/docs/docs/core/connect-data-platform/trino-setup.md index 39d8ed8ab3f..8a609bbb34d 100644 --- a/website/docs/docs/core/connect-data-platform/trino-setup.md +++ b/website/docs/docs/core/connect-data-platform/trino-setup.md @@ -41,7 +41,7 @@ Certain core functionality may vary. If you would like to report a bug, request pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    @@ -284,7 +284,7 @@ The only authentication parameter to set for OAuth 2.0 is `method: oauth`. If yo For more information, refer to both [OAuth 2.0 authentication](https://trino.io/docs/current/security/oauth2.html) in the Trino docs and the [README](https://github.com/trinodb/trino-python-client#oauth2-authentication) for the Trino Python client. -It's recommended that you install `keyring` to cache the OAuth 2.0 token over multiple dbt invocations by running `pip install 'trino[external-authentication-token-cache]'`. The `keyring` package is not installed by default. +It's recommended that you install `keyring` to cache the OAuth 2.0 token over multiple dbt invocations by running `python -m pip install 'trino[external-authentication-token-cache]'`. The `keyring` package is not installed by default. #### Example profiles.yml for OAuth diff --git a/website/docs/docs/core/connect-data-platform/upsolver-setup.md b/website/docs/docs/core/connect-data-platform/upsolver-setup.md index 6b2f410fc07..8e4203e0b0c 100644 --- a/website/docs/docs/core/connect-data-platform/upsolver-setup.md +++ b/website/docs/docs/core/connect-data-platform/upsolver-setup.md @@ -33,7 +33,7 @@ pagination_next: null pip is the easiest way to install the adapter: -pip install {frontMatter.meta.pypi_package} +python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/connect-data-platform/vertica-setup.md b/website/docs/docs/core/connect-data-platform/vertica-setup.md index fbb8de6b301..e3f3dc4f9f3 100644 --- a/website/docs/docs/core/connect-data-platform/vertica-setup.md +++ b/website/docs/docs/core/connect-data-platform/vertica-setup.md @@ -37,7 +37,7 @@ If you're interested in contributing, check out the source code for each reposit

    Installing {frontMatter.meta.pypi_package}

    -pip is the easiest way to install the adapter: pip install {frontMatter.meta.pypi_package} +pip is the easiest way to install the adapter: python -m pip install {frontMatter.meta.pypi_package}

    Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.

    diff --git a/website/docs/docs/core/docker-install.md b/website/docs/docs/core/docker-install.md index dfb2a669e34..8de3bcb5c06 100644 --- a/website/docs/docs/core/docker-install.md +++ b/website/docs/docs/core/docker-install.md @@ -5,7 +5,7 @@ description: "You can use Docker to install dbt and adapter plugins from the com dbt Core and all adapter plugins maintained by dbt Labs are available as [Docker](https://docs.docker.com/) images, and distributed via [GitHub Packages](https://docs.github.com/en/packages/learn-github-packages/introduction-to-github-packages) in a [public registry](https://github.com/dbt-labs/dbt-core/pkgs/container/dbt-core). -Using a prebuilt Docker image to install dbt Core in production has a few benefits: it already includes dbt-core, one or more database adapters, and pinned versions of all their dependencies. By contrast, `pip install dbt-core dbt-` takes longer to run, and will always install the latest compatible versions of every dependency. +Using a prebuilt Docker image to install dbt Core in production has a few benefits: it already includes dbt-core, one or more database adapters, and pinned versions of all their dependencies. By contrast, `python -m pip install dbt-core dbt-` takes longer to run, and will always install the latest compatible versions of every dependency. You might also be able to use Docker to install and develop locally if you don't have a Python environment set up. Note that running dbt in this manner can be significantly slower if your operating system differs from the system that built the Docker image. If you're a frequent local developer, we recommend that you install dbt Core via [Homebrew](/docs/core/homebrew-install) or [pip](/docs/core/pip-install) instead. diff --git a/website/docs/docs/core/pip-install.md b/website/docs/docs/core/pip-install.md index 44fac00e493..622f5f4e876 100644 --- a/website/docs/docs/core/pip-install.md +++ b/website/docs/docs/core/pip-install.md @@ -39,7 +39,7 @@ alias env_dbt='source /bin/activate' Once you know [which adapter](/docs/supported-data-platforms) you're using, you can install it as `dbt-`. For example, if using Postgres: ```shell -pip install dbt-postgres +python -m pip install dbt-postgres ``` This will install `dbt-core` and `dbt-postgres` _only_: @@ -62,7 +62,7 @@ All adapters build on top of `dbt-core`. Some also depend on other adapters: for To upgrade a specific adapter plugin: ```shell -pip install --upgrade dbt- +python -m pip install --upgrade dbt- ``` ### Install dbt-core only @@ -70,7 +70,7 @@ pip install --upgrade dbt- If you're building a tool that integrates with dbt Core, you may want to install the core library alone, without a database adapter. Note that you won't be able to use dbt as a CLI tool. ```shell -pip install dbt-core +python -m pip install dbt-core ``` ### Change dbt Core versions @@ -79,23 +79,23 @@ You can upgrade or downgrade versions of dbt Core by using the `--upgrade` optio To upgrade dbt to the latest version: ``` -pip install --upgrade dbt-core +python -m pip install --upgrade dbt-core ``` To downgrade to an older version, specify the version you want to use. This command can be useful when you're resolving package dependencies. As an example: ``` -pip install --upgrade dbt-core==0.19.0 +python -m pip install --upgrade dbt-core==0.19.0 ``` -### `pip install dbt` +### `python -m pip install dbt` Note that, as of v1.0.0, `pip install dbt` is no longer supported and will raise an explicit error. 
Since v0.13, the PyPI package named `dbt` was a simple "pass-through" of `dbt-core` and the four original database adapter plugins. For v1, we formalized that split. If you have workflows or integrations that relied on installing the package named `dbt`, you can achieve the same behavior going forward by installing the same five packages that it used: ```shell -pip install \ +python -m pip install \ dbt-core \ dbt-postgres \ dbt-redshift \ diff --git a/website/docs/docs/core/source-install.md b/website/docs/docs/core/source-install.md index 42086159c03..d17adc13c53 100644 --- a/website/docs/docs/core/source-install.md +++ b/website/docs/docs/core/source-install.md @@ -17,10 +17,10 @@ To install `dbt-core` from the GitHub code source: ```shell git clone https://github.com/dbt-labs/dbt-core.git cd dbt-core -pip install -r requirements.txt +python -m pip install -r requirements.txt ``` -This will install `dbt-core` and `dbt-postgres`. To install in editable mode (includes your local changes as you make them), use `pip install -e editable-requirements.txt` instead. +This will install `dbt-core` and `dbt-postgres`. To install in editable mode (includes your local changes as you make them), use `python -m pip install -e editable-requirements.txt` instead. ### Installing adapter plugins @@ -29,12 +29,12 @@ To install an adapter plugin from source, you will need to first locate its sour ```shell git clone https://github.com/dbt-labs/dbt-redshift.git cd dbt-redshift -pip install . +python -m pip install . ``` You do _not_ need to install `dbt-core` before installing an adapter plugin -- the plugin includes `dbt-core` among its dependencies, and it will install the latest compatible version automatically. -To install in editable mode, such as while contributing, use `pip install -e .` instead. +To install in editable mode, such as while contributing, use `python -m pip install -e .` instead. diff --git a/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md b/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md index 3f45e44076c..c0ba804cd78 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md @@ -45,7 +45,7 @@ Global project macros have been reorganized, and some old unused macros have bee ### Installation - [Installation docs](/docs/supported-data-platforms) reflects adapter-specific installations -- `pip install dbt` is no longer supported, and will raise an explicit error. Install the specific adapter plugin you need as `pip install dbt-`. +- `python -m pip install dbt` is no longer supported, and will raise an explicit error. Install the specific adapter plugin you need as `python -m pip install dbt-`. - `brew install dbt` is no longer supported. Install the specific adapter plugin you need (among Postgres, Redshift, Snowflake, or BigQuery) as `brew install dbt-`. 
- Removed official support for python 3.6, which is reaching end of life on December 23, 2021 diff --git a/website/docs/faqs/Core/install-pip-best-practices.md b/website/docs/faqs/Core/install-pip-best-practices.md index e36d58296ec..72360a52acc 100644 --- a/website/docs/faqs/Core/install-pip-best-practices.md +++ b/website/docs/faqs/Core/install-pip-best-practices.md @@ -30,6 +30,6 @@ Before installing dbt, make sure you have the latest versions: ```shell -pip install --upgrade pip wheel setuptools +python -m pip install --upgrade pip wheel setuptools ``` diff --git a/website/docs/faqs/Core/install-pip-os-prereqs.md b/website/docs/faqs/Core/install-pip-os-prereqs.md index 41a4e4ec60e..ab1a725f3a1 100644 --- a/website/docs/faqs/Core/install-pip-os-prereqs.md +++ b/website/docs/faqs/Core/install-pip-os-prereqs.md @@ -6,7 +6,7 @@ id: install-pip-os-prereqs.md --- -Your operating system may require pre-installation setup before installing dbt Core with pip. After downloading and installing any dependencies specific to your development environment, you can proceed with the [pip installation of dbt Core](/docs/core/pip-install). +Your operating system may require pre-installation setup before installing dbt Core with pip. After downloading and installing any dependencies specific to your development environment, you can proceed with the [python -m pip installation of dbt Core](/docs/core/pip-install). ### CentOS diff --git a/website/docs/guides/adapter-creation.md b/website/docs/guides/adapter-creation.md index 8a9145f0258..3c003a5b888 100644 --- a/website/docs/guides/adapter-creation.md +++ b/website/docs/guides/adapter-creation.md @@ -799,7 +799,7 @@ dbt-tests-adapter ```sh -pip install -r dev_requirements.txt +python -m pip install -r dev_requirements.txt ``` ### Set up and configure pytest diff --git a/website/docs/guides/airflow-and-dbt-cloud.md b/website/docs/guides/airflow-and-dbt-cloud.md index a3ff59af14e..e5df3f70308 100644 --- a/website/docs/guides/airflow-and-dbt-cloud.md +++ b/website/docs/guides/airflow-and-dbt-cloud.md @@ -51,7 +51,7 @@ In this guide, you'll learn how to: You’ll also gain a better understanding of how this will: - Reduce the cognitive load when building and maintaining pipelines -- Avoid dependency hell (think: `pip install` conflicts) +- Avoid dependency hell (think: `python -m pip install` conflicts) - Implement better recoveries from failures - Define clearer workflows so that data and analytics engineers work better, together ♥️ diff --git a/website/docs/guides/codespace-qs.md b/website/docs/guides/codespace-qs.md index 7712ed8f8e8..b28b0ddaacf 100644 --- a/website/docs/guides/codespace-qs.md +++ b/website/docs/guides/codespace-qs.md @@ -61,7 +61,7 @@ If you'd like to work with a larger selection of Jaffle Shop data, you can gener 1. Install the Python package called [jafgen](https://pypi.org/project/jafgen/). At the terminal's prompt, run: ```shell - /workspaces/test (main) $ pip install jafgen + /workspaces/test (main) $ python -m pip install jafgen ``` 1. 
When installation is done, run: diff --git a/website/docs/guides/custom-cicd-pipelines.md b/website/docs/guides/custom-cicd-pipelines.md index 672c6e6dab8..bd6d7617623 100644 --- a/website/docs/guides/custom-cicd-pipelines.md +++ b/website/docs/guides/custom-cicd-pipelines.md @@ -336,7 +336,7 @@ lint-project: rules: - if: $CI_PIPELINE_SOURCE == "push" && $CI_COMMIT_BRANCH != 'main' script: - - pip install sqlfluff==0.13.1 + - python -m pip install sqlfluff==0.13.1 - sqlfluff lint models --dialect snowflake --rules L019,L020,L021,L022 # this job calls the dbt Cloud API to run a job @@ -379,7 +379,7 @@ steps: displayName: 'Use Python 3.7' - script: | - pip install requests + python -m pip install requests displayName: 'Install python dependencies' - script: | @@ -434,7 +434,7 @@ pipelines: - step: name: Lint dbt project script: - - pip install sqlfluff==0.13.1 + - python -m pip install sqlfluff==0.13.1 - sqlfluff lint models --dialect snowflake --rules L019,L020,L021,L022 'main': # override if your default branch doesn't run on a branch named "main" diff --git a/website/docs/guides/set-up-ci.md b/website/docs/guides/set-up-ci.md index 83362094ec6..89d7c5a14fa 100644 --- a/website/docs/guides/set-up-ci.md +++ b/website/docs/guides/set-up-ci.md @@ -167,7 +167,7 @@ jobs: with: python-version: "3.9" - name: Install SQLFluff - run: "pip install sqlfluff" + run: "python -m pip install sqlfluff" - name: Lint project run: "sqlfluff lint models --dialect snowflake" @@ -204,7 +204,7 @@ lint-project: rules: - if: $CI_PIPELINE_SOURCE == "push" && $CI_COMMIT_BRANCH != 'main' script: - - pip install sqlfluff + - python -m pip install sqlfluff - sqlfluff lint models --dialect snowflake ``` @@ -235,7 +235,7 @@ pipelines: - step: name: Lint dbt project script: - - pip install sqlfluff==0.13.1 + - python -m pip install sqlfluff==0.13.1 - sqlfluff lint models --dialect snowflake --rules L019,L020,L021,L022 'main': # override if your default branch doesn't run on a branch named "main" diff --git a/website/docs/guides/sl-migration.md b/website/docs/guides/sl-migration.md index 0cfde742af2..c3cca81f68e 100644 --- a/website/docs/guides/sl-migration.md +++ b/website/docs/guides/sl-migration.md @@ -25,10 +25,10 @@ dbt Labs recommends completing these steps in a local dev environment (such as t 1. Create new Semantic Model configs as YAML files in your dbt project.* 1. Upgrade the metrics configs in your project to the new spec.* 1. Delete your old metrics file or remove the `.yml` file extension so they're ignored at parse time. Remove the `dbt-metrics` package from your project. Remove any macros that reference `dbt-metrics`, like `metrics.calculate()`. Make sure that any packages you’re using don't have references to the old metrics spec. -1. Install the CLI with `pip install "dbt-metricflow[your_adapter_name]"`. For example: +1. Install the CLI with `python -m pip install "dbt-metricflow[your_adapter_name]"`. For example: ```bash - pip install "dbt-metricflow[snowflake]" + python -m pip install "dbt-metricflow[snowflake]" ``` **Note** - The MetricFlow CLI is not available in the IDE at this time. Support is coming soon. diff --git a/website/snippets/_sl-test-and-query-metrics.md b/website/snippets/_sl-test-and-query-metrics.md index 43ebd929cb3..2e9490f089d 100644 --- a/website/snippets/_sl-test-and-query-metrics.md +++ b/website/snippets/_sl-test-and-query-metrics.md @@ -48,8 +48,8 @@ The dbt Cloud CLI is strongly recommended to define and query metrics for your d 1. 
Install [MetricFlow](/docs/build/metricflow-commands) as an extension of a dbt adapter from PyPI. 2. Create or activate your virtual environment with `python -m venv venv` or `source your-venv/bin/activate`. -3. Run `pip install dbt-metricflow`. - - You can install MetricFlow using PyPI as an extension of your dbt adapter in the command line. To install the adapter, run `pip install "dbt-metricflow[your_adapter_name]"` and add the adapter name at the end of the command. As an example for a Snowflake adapter, run `pip install "dbt-metricflow[snowflake]"`. +3. Run `python -m pip install dbt-metricflow`. + - You can install MetricFlow using PyPI as an extension of your dbt adapter in the command line. To install the adapter, run `python -m pip install "dbt-metricflow[your_adapter_name]"` and add the adapter name at the end of the command. As an example for a Snowflake adapter, run `python -m pip install "dbt-metricflow[snowflake]"`. - You'll need to manage versioning between dbt Core, your adapter, and MetricFlow. 4. Run `dbt parse`. This allows MetricFlow to build a semantic graph and generate a `semantic_manifest.json`. - This creates the file in your `/target` directory. If you're working from the Jaffle shop example, run `dbt seed && dbt run` before proceeding to ensure the data exists in your warehouse. From 2de9eb1d0810afee410b6ae2f73ee95976b09910 Mon Sep 17 00:00:00 2001 From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com> Date: Thu, 16 Nov 2023 15:32:50 -0500 Subject: [PATCH 11/25] Delete website/docs/guides/dremio-lakehouse.md --- website/docs/guides/dremio-lakehouse.md | 193 ------------------------ 1 file changed, 193 deletions(-) delete mode 100644 website/docs/guides/dremio-lakehouse.md diff --git a/website/docs/guides/dremio-lakehouse.md b/website/docs/guides/dremio-lakehouse.md deleted file mode 100644 index e140969aabd..00000000000 --- a/website/docs/guides/dremio-lakehouse.md +++ /dev/null @@ -1,193 +0,0 @@ ---- -title: Build a data lakehouse with dbt Core and Dremio Cloud -id: build-dremio-lakehouse -description: Learn how to build a data lakehouse with dbt Core and Dremio Cloud. -displayText: Build a data lakehouse with dbt Core and Dremio Cloud -hoverSnippet: Learn how to build a data lakehouse with dbt Core and Dremio Cloud -# time_to_complete: '30 minutes' commenting out until we test -platform: 'dbt-core' -icon: 'guides' -hide_table_of_contents: true -tags: ['Dremio', 'dbt Core'] -level: 'Intermediate' -recently_updated: true ---- -## Introduction - -This is a guide will demonstrate how to build a data lakehouse with dbt Core 1.5+ and Dremio Cloud. You can simplify and optimize your data infrastructure with dot’s robust transformation framework and Dremio’s open and easy data lakehouse. The integrated solution empowers companies to establish a strong data and analytics foundation, fostering self-service analytics and enhancing business insights while simplifying operations by eliminating the necessity to write complex ETL pipelines. - -## Prerequisites - -* You must have a [Dremio Cloud](https://docs.dremio.com/cloud/) account. -* You must have Python 3 installed. -* You must have dbt Core v1.5 or newer [installed](/docs/core/installation). -* You must have the Dremio adatper 1.5.0 or newer [installed and configured](/docs/core/connect-data-platform/dremio-setup) for Dremio Cloud. -* You must have basic working knowledge of Git and the command line interface (CLI). 
- -## Validate your environment - -Validate your environment by running the following commands in your CLI: - -```shell - -$ python3 --version -Python 3.11.4 # Must be Python 3 - -``` - -```shell - -$ dbt --version -Core: - - installed: 1.5.0 # Must be 1.5+ - - latest: 1.6.3 - Update available! - - Your version of dbt-core is out of date! - You can find instructions for upgrading here: - https://docs.getdbt.com/docs/installation - -Plugins: - - dremio: 1.5.0 - Up to date! # Must be 1.5+ - -``` - -## Getting started - -1. Clone the Dremio dbt Core sample project from the [github repo](https://github.com/dremio-brock/DremioDBTSample/tree/master/dremioSamples). - -2. Open the relation.py file in the Dremio adapter directory `$HOME/Library/Python/3.9/lib/python/site-packages/dbt/adapters/dremio/relation.py` in your IDE and locate lines 51 and 52. - -Update lines 51 and 52 if they don't have the following syntax: - -```python - -PATTERN = re.compile(r"""((?:[^."']|"[^"]*"|'[^']*')+)""") -return ".".join(PATTERN.split(identifier)[1::2]) - -``` - -The complete selection should look like this: - -```python -def quoted_by_component(self, identifier, componentName): - if componentName == ComponentName.Schema: - PATTERN = re.compile(r"""((?:[^."']|"[^"]*"|'[^']*')+)""") - return ".".join(PATTERN.split(identifier)[1::2]) - else: - return self.quoted(identifier) - -``` - -This is required because the plugin doesn’t support schema names in Dremio containing dots and spaces. - -## Build your pipeline - -1. Create a `profile.yml` file in the `$HOME/.dbt/profiles.yml` path and add the following configs: - -```yaml - -dremioSamples: - outputs: - cloud_dev: - dremio_space: dev - dremio_space_folder: no_schema - object_storage_path: dev - object_storage_source: $scratch - pat: - cloud_host: api.dremio.cloud - cloud_project_id: - threads: 1 - type: dremio - use_ssl: true - user: - target: dev - - ``` - - 2. Execute the transformation pipeline: - - ```shell - - $ dbt run -t cloud_dev - - ``` - - If the above configurations have been implemented, the output will look something like this: - -```shell - -17:24:16 Running with dbt=1.5.0 -17:24:17 Found 5 models, 0 tests, 0 snapshots, 0 analyses, 348 macros, 0 operations, 0 seed files, 2 sources, 0 exposures, 0 metrics, 0 groups -17:24:17 -17:24:29 Concurrency: 1 threads (target='cloud_dev') -17:24:29 -17:24:29 1 of 5 START sql view model Preparation.trips .................................. [RUN] -17:24:31 1 of 5 OK created sql view model Preparation. trips ............................. [OK in 2.61s] -17:24:31 2 of 5 START sql view model Preparation.weather ................................ [RUN] -17:24:34 2 of 5 OK created sql view model Preparation.weather ........................... [OK in 2.15s] -17:24:34 3 of 5 START sql view model Business.Transportation.nyc_trips .................. [RUN] -17:24:36 3 of 5 OK created sql view model Business.Transportation.nyc_trips ............. [OK in 2.18s] -17:24:36 4 of 5 START sql view model Business.Weather.nyc_weather ....................... [RUN] -17:24:38 4 of 5 OK created sql view model Business.Weather.nyc_weather .................. [OK in 2.09s] -17:24:38 5 of 5 START sql view model Application.nyc_trips_with_weather ................. [RUN] -17:24:41 5 of 5 OK created sql view model Application.nyc_trips_with_weather ............ [OK in 2.74s] -17:24:41 -17:24:41 Finished running 5 view models in 0 hours 0 minutes and 24.03 seconds (24.03s). -17:24:41 -17:24:41 Completed successfully -17:24:41 -17:24:41 Done. 
PASS=5 WARN=0 ERROR=0 SKIP=0 TOTAL=5 - -``` - -Now that you have a running environment and a completed job, you can view the data in Dremio and expand your code. This is a snapshot of the project structure in an IDE: - - - -## About the schema.yml - -The `schema.yml` file defines Dremio sources and models to be used and what data models are in scope. In this guides sample project, there are two data sources: - -1. The `NYC-weather.csv` stored in the **Samples** database and -2. The `sample_data` from the **Samples database**.** - -The models correspond to both weather & trip data respectively and will be joined for analysis. - -The sources can be found by navigating to the **Object Storage** section of the Dremio Cloud UI. - - - -## About the models - -**Preparation** — `preparation_trips.sql` and `preparation_weather.sql` are building views on top of the trips and weather data. - -**Business** — `business_transportation_nyc_trips.sql` applies some level of transformation on `preparation_trips.sql` view. `Business_weather_nyc.sql` has no transformation on the `preparation_weather.sql` view. - -**Application** — `application_nyc_trips_with_weather.sql` joins the output from the Business model. This is what your business users will consume. - -## The Job output - -When you run the dbt job, it will create a **dev** space folder that has all the data assets created. This is what you will see in Dremio Cloud UI. Spaces in Dremio is a way to organize data assets which map to business units or data products. - - - -Open the **Application folder** and you will see the output of the simple transformation we did using dbt. - - - -## Query the data - -Now that you have run the job and completed the transformation, it's time to query your data. Click on the `nyc_trips_with_weather` view. That will take you to the SQL Runner page. Click **Show SQL Pane** on the upper right corner of the page. Run the following query: - -```sql - -SELECT vendor_id, - AVG(tip_amount) -FROM dev.application."nyc_treips_with_weather" -GROUP BY vendor_id - -``` - - - -This completes the integration setup and data is ready for business consumption. 
\ No newline at end of file From bf32ef962484c19802779060366ff50c476b2b7a Mon Sep 17 00:00:00 2001 From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com> Date: Thu, 16 Nov 2023 15:38:00 -0500 Subject: [PATCH 12/25] Update website/docs/docs/core/connect-data-platform/glue-setup.md --- website/docs/docs/core/connect-data-platform/glue-setup.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/core/connect-data-platform/glue-setup.md b/website/docs/docs/core/connect-data-platform/glue-setup.md index 8d379afae58..9e8fff462f1 100644 --- a/website/docs/docs/core/connect-data-platform/glue-setup.md +++ b/website/docs/docs/core/connect-data-platform/glue-setup.md @@ -210,7 +210,7 @@ Configure a Python virtual environment to isolate package version and code depen $ sudo yum install git $ python3 -m venv dbt_venv $ source dbt_venv/bin/activate -$ python3 -m python -m pip install --upgrade pip +$ python3 -m pip install --upgrade pip ``` Configure the last version of AWS CLI From 999b1ba80825aed54e0b7b8ac9f53923018970bb Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Mon, 20 Nov 2023 10:59:40 +0000 Subject: [PATCH 13/25] fix links --- website/docs/docs/core/connect-data-platform/fabric-setup.md | 2 +- .../docs/docs/core/connect-data-platform/materialize-setup.md | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/website/docs/docs/core/connect-data-platform/fabric-setup.md b/website/docs/docs/core/connect-data-platform/fabric-setup.md index c7faf5e0c67..11a8cf6f98b 100644 --- a/website/docs/docs/core/connect-data-platform/fabric-setup.md +++ b/website/docs/docs/core/connect-data-platform/fabric-setup.md @@ -4,7 +4,7 @@ description: "Read this guide to learn about the Microsoft Fabric Synapse Data W id: fabric-setup meta: maintained_by: Microsoft - authors: '[Microsoft](https://github.com/Microsoft)' + authors: 'Microsoft' github_repo: 'Microsoft/dbt-fabric' pypi_package: 'dbt-fabric' min_core_version: '1.4.0' diff --git a/website/docs/docs/core/connect-data-platform/materialize-setup.md b/website/docs/docs/core/connect-data-platform/materialize-setup.md index ec0034dcd37..70505fe1d65 100644 --- a/website/docs/docs/core/connect-data-platform/materialize-setup.md +++ b/website/docs/docs/core/connect-data-platform/materialize-setup.md @@ -6,7 +6,7 @@ meta: maintained_by: Materialize Inc. pypi_package: 'dbt-materialize' authors: 'Materialize team' - github_repo: 'MaterializeInc/materialize/blob/main/misc/dbt-materialize' + github_repo: 'MaterializeInc/materialize' min_core_version: 'v0.18.1' min_supported_version: 'v0.28.0' cloud_support: Not Supported From 5ea8046fe765c83ec43b1e1edd79dedd4ab933dd Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Mon, 20 Nov 2023 11:08:01 +0000 Subject: [PATCH 14/25] add guidance in adapters guide --- website/docs/guides/adapter-creation.md | 10 +++++++++- 1 file changed, 9 insertions(+), 1 deletion(-) diff --git a/website/docs/guides/adapter-creation.md b/website/docs/guides/adapter-creation.md index 8a9145f0258..aa4819e73d0 100644 --- a/website/docs/guides/adapter-creation.md +++ b/website/docs/guides/adapter-creation.md @@ -1108,7 +1108,7 @@ The following subjects need to be addressed across three pages of this docs site | How To... 
| File to change within `/website/docs/` | Action | Info to Include | |----------------------|--------------------------------------------------------------|--------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| -| Connect | `/docs/core/connect-data-platform/{MY-DATA-PLATFORM}-setup.md` | Create | Give all information needed to define a target in `~/.dbt/profiles.yml` and get `dbt debug` to connect to the database successfully. All possible configurations should be mentioned. | +| Connect | `/docs/core/connect-data-platform/{MY-DATA-PLATFORM}-setup.md` | Create | Give all information needed to define a target in `~/.dbt/profiles.yml` and get `dbt debug` to connect to the database successfully. All possible configurations should be mentioned. | | Configure | `reference/resource-configs/{MY-DATA-PLATFORM}-configs.md` | Create | What options and configuration specific to your data platform do users need to know? e.g. table distribution and indexing options, column_quoting policy, which incremental strategies are supported | | Discover and Install | `docs/supported-data-platforms.md` | Modify | Is it a vendor- or community- supported adapter? How to install Python adapter package? Ideally with pip and PyPI hosted package, but can also use `git+` link to GitHub Repo | | Add link to sidebar | `website/sidebars.js` | Modify | Add the document id to the correct location in the sidebar menu | @@ -1123,6 +1123,14 @@ Below are some recent pull requests made by partners to document their data plat - [SingleStore](https://github.com/dbt-labs/docs.getdbt.com/pull/1044) - [Firebolt](https://github.com/dbt-labs/docs.getdbt.com/pull/941) +Note — Use the following re-usable component to auto-fill the frontmatter content on your new page: + +```markdown +import SetUpPages from '/snippets/_setup-pages-intro.md'; + + +``` + ## Promote a new adapter The most important thing here is recognizing that people are successful in the community when they join, first and foremost, to engage authentically. From 8ce6765c2ab6b4e76bddb88e9054cfb4e5571bc8 Mon Sep 17 00:00:00 2001 From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> Date: Mon, 20 Nov 2023 17:39:11 +0000 Subject: [PATCH 15/25] Update website/snippets/_setup-pages-intro.md Co-authored-by: Anders --- website/snippets/_setup-pages-intro.md | 1 - 1 file changed, 1 deletion(-) diff --git a/website/snippets/_setup-pages-intro.md b/website/snippets/_setup-pages-intro.md index cc68aac913f..44cbbb1a0c2 100644 --- a/website/snippets/_setup-pages-intro.md +++ b/website/snippets/_setup-pages-intro.md @@ -19,4 +19,3 @@ Use `pip` to install the adapter, which automatically installs `dbt-core` and an

    For {props.meta.platform_name}-specific configuration, please refer to {props.meta.platform_name} configs.

    -

    For further info, refer to the GitHub repository: {props.meta.github_repo}

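For a setup page that consumes this snippet, the only requirement is a frontmatter `meta:` block whose keys match the `props.meta` fields referenced above. A minimal sketch for a hypothetical `dbt-exampledb` adapter (every value below is a placeholder, not a real package):

```yaml
# Hypothetical frontmatter for a new adapter setup page; all values are placeholders.
meta:
  maintained_by: Example Corp
  authors: 'Example Corp data team'
  github_repo: 'examplecorp/dbt-exampledb'
  pypi_package: 'dbt-exampledb'
  min_core_version: 'v1.4.0'
  min_supported_version: 'n/a'
  cloud_support: Not Supported
  platform_name: 'ExampleDB'
```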
    From a8cfad3af9488cc7a235447a6dfbf4b15b7c5eef Mon Sep 17 00:00:00 2001 From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com> Date: Mon, 20 Nov 2023 14:12:02 -0500 Subject: [PATCH 16/25] Updating "Future Release" language --- website/docs/docs/dbt-versions/core-versions.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/dbt-versions/core-versions.md b/website/docs/docs/dbt-versions/core-versions.md index 2467f3c946b..304eb9db6a7 100644 --- a/website/docs/docs/dbt-versions/core-versions.md +++ b/website/docs/docs/dbt-versions/core-versions.md @@ -56,7 +56,7 @@ After a minor version reaches the end of its critical support period, one year a ### Future versions -We aim to release a new minor "feature" every 3 months. _This is an indicative timeline ONLY._ For the latest information about upcoming releases, including their planned release dates and which features and fixes might be included in each, always consult the [`dbt-core` repository milestones](https://github.com/dbt-labs/dbt-core/milestones). +For the latest information about upcoming releases, including their planned release dates and which features and fixes might be included in each, consult the [`dbt-core` repository milestones](https://github.com/dbt-labs/dbt-core/milestones) and [product roadmaps](https://github.com/dbt-labs/dbt-core/tree/main/docs/roadmap). ## Best practices for upgrading From 69979d76e70b454c7b155fda95974873df8a6fd0 Mon Sep 17 00:00:00 2001 From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com> Date: Mon, 20 Nov 2023 15:11:14 -0500 Subject: [PATCH 17/25] Adding package-lock to 1.7 guide --- .../docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md | 2 ++ 1 file changed, 2 insertions(+) diff --git a/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md b/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md index 9ebd3c64cf3..5155cdd3f0d 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md +++ b/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md @@ -32,6 +32,8 @@ This is a relatively small behavior change, but worth calling out in case you no - Don't add a `freshness:` block. - Explicitly set `freshness: null` +From v1.7 on, when `dbt deps` is run, it will create/update the `package-lock.yml` file in the _project_root_ where `packages.yml` is recorded. The `package-lock.yml` file contains a record of all packages installed and, if subsequent `dbt deps` runs contain no updated packages in `depenedencies.yml` or `packages.yml`, dbt-core installs from `package-lock.yml`. + ## New and changed features and functionality - [`dbt docs generate`](/reference/commands/cmd-docs) now supports `--select` to generate [catalog metadata](/reference/artifacts/catalog-json) for a subset of your project. Currently available for Snowflake and Postgres only, but other adapters are coming soon. 
From 707fdc38bc7ecc2b61382dc473f4f1bbcfc67d41 Mon Sep 17 00:00:00 2001
From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
Date: Mon, 20 Nov 2023 15:13:26 -0500
Subject: [PATCH 18/25] Update website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md

---
 .../docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md b/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md
index 5155cdd3f0d..764a0244279 100644
--- a/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md
+++ b/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md
@@ -32,7 +32,7 @@ This is a relatively small behavior change, but worth calling out in case you no
 - Don't add a `freshness:` block.
 - Explicitly set `freshness: null`

-From v1.7 on, when `dbt deps` is run, it will create/update the `package-lock.yml` file in the _project_root_ where `packages.yml` is recorded. The `package-lock.yml` file contains a record of all packages installed and, if subsequent `dbt deps` runs contain no updated packages in `depenedencies.yml` or `packages.yml`, dbt-core installs from `package-lock.yml`.
+From v1.7 on, when [`dbt deps`](/reference/commands/deps) is run, it will create or update the `package-lock.yml` file in the _project_root_ where `packages.yml` is recorded. The `package-lock.yml` file contains a record of all packages installed and, if subsequent `dbt deps` runs contain no updated packages in `depenedencies.yml` or `packages.yml`, dbt-core installs from `package-lock.yml`.

## New and changed features and functionality

From 2416d697ef2be9f1bd51e01273692a2712a1dc19 Mon Sep 17 00:00:00 2001
From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
Date: Mon, 20 Nov 2023 15:27:12 -0500
Subject: [PATCH 19/25] Update website/docs/docs/core/pip-install.md

Co-authored-by: Doug Beatty <44704949+dbeatty10@users.noreply.github.com>
---
 website/docs/docs/core/pip-install.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/docs/core/pip-install.md b/website/docs/docs/core/pip-install.md
index 622f5f4e876..e1a0e65312c 100644
--- a/website/docs/docs/core/pip-install.md
+++ b/website/docs/docs/core/pip-install.md
@@ -88,7 +88,7 @@ To downgrade to an older version, specify the version you want to use. This comm

python -m pip install --upgrade dbt-core==0.19.0
```

-### `python -m pip install dbt`
+### `pip install dbt`

Note that, as of v1.0.0, `pip install dbt` is no longer supported and will raise an explicit error. Since v0.13, the PyPI package named `dbt` was a simple "pass-through" of `dbt-core` and the four original database adapter plugins. For v1, we formalized that split.
From 4eeb35aaa6e9ef54d6b54b42cff8aa4b139191cc Mon Sep 17 00:00:00 2001
From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
Date: Mon, 20 Nov 2023 15:43:59 -0500
Subject: [PATCH 20/25] Update website/docs/guides/airflow-and-dbt-cloud.md

Co-authored-by: Doug Beatty <44704949+dbeatty10@users.noreply.github.com>
---
 website/docs/guides/airflow-and-dbt-cloud.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/guides/airflow-and-dbt-cloud.md b/website/docs/guides/airflow-and-dbt-cloud.md
index e5df3f70308..a3ff59af14e 100644
--- a/website/docs/guides/airflow-and-dbt-cloud.md
+++ b/website/docs/guides/airflow-and-dbt-cloud.md
@@ -51,7 +51,7 @@ In this guide, you'll learn how to:

You’ll also gain a better understanding of how this will:

- Reduce the cognitive load when building and maintaining pipelines
-- Avoid dependency hell (think: `python -m pip install` conflicts)
+- Avoid dependency hell (think: `pip install` conflicts)
- Implement better recoveries from failures
- Define clearer workflows so that data and analytics engineers work better, together ♥️

From 05c9d596ef90e9aeea8f1d7358b1bf2823eb05f2 Mon Sep 17 00:00:00 2001
From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
Date: Mon, 20 Nov 2023 15:44:42 -0500
Subject: [PATCH 21/25] Apply suggestions from code review

Co-authored-by: Doug Beatty <44704949+dbeatty10@users.noreply.github.com>
---
 website/docs/docs/cloud/cloud-cli-installation.md | 2 +-
 website/docs/faqs/Core/install-pip-os-prereqs.md | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/website/docs/docs/cloud/cloud-cli-installation.md b/website/docs/docs/cloud/cloud-cli-installation.md
index 6c11a2250a9..1e72706caec 100644
--- a/website/docs/docs/cloud/cloud-cli-installation.md
+++ b/website/docs/docs/cloud/cloud-cli-installation.md
@@ -161,7 +161,7 @@ If you already have dbt Core installed, the dbt Cloud CLI may conflict. Here are

--------

-Before installing the dbt Cloud CLI, make sure you have Python installed and your virtual environment venv or pyenv . If you already have a Python environment configured, you can skip to the [python -m pip installation step](#install-dbt-cloud-cli-in-pip).
+Before installing the dbt Cloud CLI, make sure you have Python installed and your virtual environment venv or pyenv . If you already have a Python environment configured, you can skip to the [pip installation step](#install-dbt-cloud-cli-in-pip).

### Install a virtual environment

diff --git a/website/docs/faqs/Core/install-pip-os-prereqs.md b/website/docs/faqs/Core/install-pip-os-prereqs.md
index ab1a725f3a1..41a4e4ec60e 100644
--- a/website/docs/faqs/Core/install-pip-os-prereqs.md
+++ b/website/docs/faqs/Core/install-pip-os-prereqs.md
@@ -6,7 +6,7 @@ id: install-pip-os-prereqs.md

---

-Your operating system may require pre-installation setup before installing dbt Core with pip. After downloading and installing any dependencies specific to your development environment, you can proceed with the [python -m pip installation of dbt Core](/docs/core/pip-install).
+Your operating system may require pre-installation setup before installing dbt Core with pip. After downloading and installing any dependencies specific to your development environment, you can proceed with the [pip installation of dbt Core](/docs/core/pip-install).
### CentOS

From 37f7d54f05ea5a74748972e2666640854c17598a Mon Sep 17 00:00:00 2001
From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
Date: Mon, 20 Nov 2023 16:06:16 -0500
Subject: [PATCH 22/25] Adding python -m

---
 website/snippets/_setup-pages-intro.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/snippets/_setup-pages-intro.md b/website/snippets/_setup-pages-intro.md
index 44cbbb1a0c2..5ded5ba5ebc 100644
--- a/website/snippets/_setup-pages-intro.md
+++ b/website/snippets/_setup-pages-intro.md
@@ -13,7 +13,7 @@

    Installing {props.meta.pypi_package}

    Use `pip` to install the adapter, which automatically installs `dbt-core` and any additional dependencies. Use the following command for installation:

-pip install {props.meta.pypi_package}
+python -m pip install {props.meta.pypi_package}

    Configuring {props.meta.pypi_package}

From 28d004ed9e4055406f0a72fb1c6395605a4449ba Mon Sep 17 00:00:00 2001
From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
Date: Mon, 20 Nov 2023 16:07:50 -0500
Subject: [PATCH 23/25] Update website/docs/docs/dbt-versions/core-versions.md

Co-authored-by: Leona B. Campbell <3880403+runleonarun@users.noreply.github.com>
---
 website/docs/docs/dbt-versions/core-versions.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/docs/dbt-versions/core-versions.md b/website/docs/docs/dbt-versions/core-versions.md
index 304eb9db6a7..c497401a17d 100644
--- a/website/docs/docs/dbt-versions/core-versions.md
+++ b/website/docs/docs/dbt-versions/core-versions.md
@@ -56,7 +56,7 @@ After a minor version reaches the end of its critical support period, one year a

### Future versions

-For the latest information about upcoming releases, including their planned release dates and which features and fixes might be included in each, consult the [`dbt-core` repository milestones](https://github.com/dbt-labs/dbt-core/milestones) and [product roadmaps](https://github.com/dbt-labs/dbt-core/tree/main/docs/roadmap).
+For the latest information about upcoming releases, including planned release dates and which features and fixes might be included, consult the [`dbt-core` repository milestones](https://github.com/dbt-labs/dbt-core/milestones) and [product roadmaps](https://github.com/dbt-labs/dbt-core/tree/main/docs/roadmap).

## Best practices for upgrading

From fb7c3107b42fa16a667ebcefd27f880a1e3eaa3b Mon Sep 17 00:00:00 2001
From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
Date: Mon, 20 Nov 2023 16:12:18 -0500
Subject: [PATCH 24/25] Update website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md

Co-authored-by: Leona B. Campbell <3880403+runleonarun@users.noreply.github.com>
---
 .../docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md b/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md
index 764a0244279..18863daba6f 100644
--- a/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md
+++ b/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md
@@ -32,7 +32,7 @@ This is a relatively small behavior change, but worth calling out in case you no
 - Don't add a `freshness:` block.
 - Explicitly set `freshness: null`

-From v1.7 on, when [`dbt deps`](/reference/commands/deps) is run, it will create or update the `package-lock.yml` file in the _project_root_ where `packages.yml` is recorded. The `package-lock.yml` file contains a record of all packages installed and, if subsequent `dbt deps` runs contain no updated packages in `depenedencies.yml` or `packages.yml`, dbt-core installs from `package-lock.yml`.
+Beginning with v1.7, running [`dbt deps`](/reference/commands/deps) creates or updates the `package-lock.yml` file in the _project_root_ where `packages.yml` is recorded. The `package-lock.yml` file contains a record of all packages installed and, if subsequent `dbt deps` runs contain no updated packages in `depenedencies.yml` or `packages.yml`, dbt-core installs from `package-lock.yml`.
## New and changed features and functionality

From 247dcc2238273c1a28e7d042e4c622405ce8537e Mon Sep 17 00:00:00 2001
From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
Date: Mon, 20 Nov 2023 16:57:39 -0500
Subject: [PATCH 25/25] Fixing guide typo

---
 website/docs/guides/dremio-lakehouse.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/guides/dremio-lakehouse.md b/website/docs/guides/dremio-lakehouse.md
index 1c59c04d175..59da64a5f88 100644
--- a/website/docs/guides/dremio-lakehouse.md
+++ b/website/docs/guides/dremio-lakehouse.md
@@ -14,7 +14,7 @@ recently_updated: true
---

## Introduction

-This guide will demonstrate how to build a data lakehouse with dbt Core 1.5 or new and Dremio Cloud. You can simplify and optimize your data infrastructure with dbt's robust transformation framework and Dremio’s open and easy data lakehouse. The integrated solution empowers companies to establish a strong data and analytics foundation, fostering self-service analytics and enhancing business insights while simplifying operations by eliminating the necessity to write complex Extract, Transform, and Load (ETL) pipelines.
+This guide will demonstrate how to build a data lakehouse with dbt Core 1.5 or newer and Dremio Cloud. You can simplify and optimize your data infrastructure with dbt's robust transformation framework and Dremio’s open and easy data lakehouse. The integrated solution empowers companies to establish a strong data and analytics foundation, fostering self-service analytics and enhancing business insights while simplifying operations by eliminating the necessity to write complex Extract, Transform, and Load (ETL) pipelines.

### Prerequisites