:root` (contact your dbt Labs account representative for appropriate account ID).
+
+### Completing the connection
+
+To complete the connection, dbt Labs must now provision a VPC Endpoint to connect to your VPC Endpoint Service. This requires that you send dbt Labs the following information:
+
+ - VPC Endpoint Service name:
+
+
+
+ - **DNS configuration:** If the connection to the VCS service requires a custom domain and/or URL for TLS, a private hosted zone can be configured by the dbt Labs Infrastructure team in the dbt Cloud private network. For example:
+ - **Private hosted zone:** `examplecorp.com`
+ - **DNS record:** `github.examplecorp.com`
+
+### Accepting the connection request
+
+When you have been notified that the resources are provisioned within the dbt Cloud environment, you must accept the endpoint connection (unless the VPC Endpoint Service is set to auto-accept connection requests). Requests can be accepted through the AWS console, as seen below, or through the AWS CLI.
+
+
+
+Once you accept the endpoint connection request, you can use the PrivateLink endpoint in dbt Cloud.
+
+## Configure in dbt Cloud
+
+Once dbt Labs confirms that the PrivateLink integration is complete, you can use it in a new or existing git configuration.
+1. Select **PrivateLink Endpoint** as the connection type, and your configured integrations will appear in the dropdown menu.
+2. Select the configured endpoint from the dropdown list.
+3. Click **Save**.
+
+
+
+
\ No newline at end of file
diff --git a/website/docs/docs/collaborate/cloud-build-and-view-your-docs.md b/website/docs/docs/collaborate/cloud-build-and-view-your-docs.md
index b387c64788f..e104ea8640c 100644
--- a/website/docs/docs/collaborate/cloud-build-and-view-your-docs.md
+++ b/website/docs/docs/collaborate/cloud-build-and-view-your-docs.md
@@ -5,7 +5,7 @@ description: "Automatically generate project documentation as you run jobs."
pagination_next: null
---
-dbt enables you to generate documentation for your project and data warehouse, and renders the documentation in a website. For more information, see [Documentation](/docs/collaborate/documentation).
+dbt Cloud enables you to generate documentation for your project and data platform, rendering it as a website. The documentation is only updated with new information after a fully successful job run, ensuring accuracy and relevance. Refer to [Documentation](/docs/collaborate/documentation) for more details.
## Set up a documentation job
@@ -52,13 +52,15 @@ You configure project documentation to generate documentation when the job you s
To generate documentation in the dbt Cloud IDE, run the `dbt docs generate` command in the
Command Bar in the dbt Cloud IDE. This command will generate the Docs for your dbt project as it exists in development in your IDE session.
-
+
After generating your documentation, you can click the **Book** icon above the file tree, to see the latest version of your documentation rendered in a new browser window.
## Viewing documentation
-Once you set up a job to generate documentation for your project, you can click **Documentation** in the top left. Your project's documentation should open. This link will always navigate you to the most recent version of your project's documentation in dbt Cloud.
+Once you set up a job to generate documentation for your project, you can click **Documentation** in the top left. Your project's documentation should open. This link will always help you find the most recent version of your project's documentation in dbt Cloud.
+
+These generated docs always reflect the last fully successful run of the job. If any task in a run fails, including tests, that run's changes won't appear in the docs.
The dbt Cloud IDE makes it possible to view [documentation](/docs/collaborate/documentation)
for your dbt project while your code is still in development. With this workflow, you can inspect and verify what your project's generated documentation will look like before your changes are released to production.
diff --git a/website/docs/docs/collaborate/explore-multiple-projects.md b/website/docs/docs/collaborate/explore-multiple-projects.md
new file mode 100644
index 00000000000..3be35110a37
--- /dev/null
+++ b/website/docs/docs/collaborate/explore-multiple-projects.md
@@ -0,0 +1,46 @@
+---
+title: "Explore multiple projects"
+sidebar_label: "Explore multiple projects"
+description: "Learn about project-level lineage in dbt Explorer and its uses."
+pagination_next: null
+---
+
+You can view all the projects and public models in the account, see where each public model is defined, and trace how they're used, giving you a better understanding of your cross-project resources.
+
+The resource-level lineage graph for a given project displays the cross-project relationships in the DAG. The different icons indicate whether you’re looking at an upstream producer project (parent) or a downstream consumer project (child).
+
+When you view an upstream (parent) project, its public models display a counter icon in the upper right corner indicating how many downstream (child) projects depend on them. Selecting a model reveals the lineage indicating the projects dependent on that model. These counts include all projects listing the upstream one as a dependency in their `dependencies.yml`, even without a direct `{{ ref() }}`. Selecting a project node from a public model opens its detailed lineage graph, which is subject to your [permissions](/docs/cloud/manage-access/enterprise-permissions).
+
+
+
+When viewing a downstream (child) project that imports and refs public models from upstream (parent) projects, public models will show up in the lineage graph and display an icon on the graph edge that indicates what the relationship is to a model from another project. Hovering over this icon indicates the specific dbt Cloud project that produces that model. Double-clicking on a model from another project opens the resource-level lineage graph of the parent project, which is subject to your permissions.
+
+
+
+
+## Explore the project-level lineage graph
+
+For cross-project collaboration, you can interact with the DAG in all the same ways as described in [Explore your project's lineage](/docs/collaborate/explore-projects#project-lineage) but you can also interact with it at the project level and view the details.
+
+To get a list view of all the projects, select the account name at the top of the **Explore** page near the navigation bar. This view includes a public model list, project list, and a search bar for project searches. You can also view the project-level lineage graph by clicking the Lineage view icon in the page's upper right corner.
+
+If you have permissions for a project in the account, you can view all public models used across the entire account. However, you can only view full public model details and private models if you have permissions for a project where the models are defined.
+
+From the project-level lineage graph, you can:
+
+- Click the Lineage view icon (in the graph’s upper right corner) to view the cross-project lineage graph.
+- Click the List view icon (in the graph’s upper right corner) to view the project list.
+ - Select a project from the **Projects** tab to switch to that project’s main **Explore** page.
+ - Select a model from the **Public Models** tab to view the [model’s details page](/docs/collaborate/explore-projects#view-resource-details).
+ - Perform searches on your projects with the search bar.
+- Select a project node in the graph (double-clicking) to switch to that particular project’s lineage graph.
+
+When you select a project node in the graph, a project details panel opens on the graph’s right-hand side where you can:
+
+- View counts of the resources defined in the project.
+- View a list of its public models, if any.
+- View a list of other projects that use the project, if any.
+- Click **Open Project Lineage** to switch to the project’s lineage graph.
+- Click the Share icon to copy the project panel link to your clipboard so you can share the graph with someone.
+
+
\ No newline at end of file
diff --git a/website/docs/docs/collaborate/explore-projects.md b/website/docs/docs/collaborate/explore-projects.md
index 282ef566356..78fe6f45cc7 100644
--- a/website/docs/docs/collaborate/explore-projects.md
+++ b/website/docs/docs/collaborate/explore-projects.md
@@ -2,7 +2,7 @@
title: "Explore your dbt projects"
sidebar_label: "Explore dbt projects"
description: "Learn about dbt Explorer and how to interact with it to understand, improve, and leverage your data pipelines."
-pagination_next: null
+pagination_next: "docs/collaborate/model-performance"
pagination_prev: null
---
@@ -36,7 +36,7 @@ For a richer experience with dbt Explorer, you must:
- Run [dbt source freshness](/reference/commands/source#dbt-source-freshness) within a job in the environment to view source freshness data.
- Run [dbt snapshot](/reference/commands/snapshot) or [dbt build](/reference/commands/build) within a job in the environment to view snapshot details.
-Richer and more timely metadata will become available as dbt, the Discovery API, and the underlying dbt Cloud platform evolves.
+Richer and more timely metadata will become available as dbt Core, the Discovery API, and the underlying dbt Cloud platform evolve.
## Explore your project's lineage graph {#project-lineage}
@@ -46,6 +46,8 @@ If you don't see the project lineage graph immediately, click **Render Lineage**
The nodes in the lineage graph represent the project’s resources and the edges represent the relationships between the nodes. Nodes are color-coded and include iconography according to their resource type.
+By default, dbt Explorer shows the project's [applied state](/docs/dbt-cloud-apis/project-state#definition-logical-vs-applied-state-of-dbt-nodes) lineage. That is, it shows models that have been successfully built and are available to query, not just the models defined in the project.
+
To explore the lineage graphs of tests and macros, view [their resource details pages](#view-resource-details). By default, dbt Explorer excludes these resources from the full lineage graph unless a search query returns them as results.
To interact with the full lineage graph, you can:
@@ -53,17 +55,23 @@ To interact with the full lineage graph, you can:
- Hover over any item in the graph to display the resource’s name and type.
- Zoom in and out on the graph by mouse-scrolling.
- Grab and move the graph and the nodes.
+- Right-click on a node (context menu) to:
+ - Refocus on the node, including its parent and child nodes
+ - Refocus on the node and its children only
+ - Refocus on the node and its parents only
+ - View the node's [resource details](#view-resource-details) page
+
- Select a resource to highlight its relationship with other resources in your project. A panel opens on the graph’s right-hand side that displays a high-level summary of the resource’s details. The side panel includes a **General** tab for information like description, materialized type, and other details.
- Click the Share icon in the side panel to copy the graph’s link to your clipboard.
- Click the View Resource icon in the side panel to [view the resource details](#view-resource-details).
-- [Search and select specific resources](#search-resources) or a subset of the DAG using selectors and graph operators. For example:
+- [Search and select specific resources](#search-resources) or a subset of the DAG using [selectors](/reference/node-selection/methods) and [graph operators](/reference/node-selection/graph-operators). This can help you narrow the focus on the resources that interest you. For example:
- `+[RESOURCE_NAME]` — Displays all parent nodes of the resource
- `resource_type:model [RESOURCE_NAME]` — Displays all models matching the name search
- [View resource details](#view-resource-details) by selecting a node (double-clicking) in the graph.
- Click the List view icon in the graph's upper right corner to return to the main **Explore** page.
-
+
## Search for resources {#search-resources}
@@ -74,9 +82,15 @@ Select a node (single-click) in the lineage graph to highlight its relationship
### Search with keywords
When searching with keywords, dbt Explorer searches through your resource metadata (such as resource type, resource name, column name, source name, tags, schema, database, version, alias/identifier, and package name) and returns any matches.
-### Search with selector methods
+- Keyword search features a side panel (to the right of the main section) to filter search results by resource type.
+- Use this panel to select specific resource tags or model access levels under the **Models** option.
+ - For example, a search for "sale" returns results that include all resources with the keyword "sale" in their metadata. Filtering by **Models** and **Sources** refines these results to only include models or sources.
+
+- When searching for an exact column name, the results show all relational nodes containing that column in their schemas. If there's a match, a notice in the search result indicates the resource contains the specified column.
-You can search with [selector methods](/reference/node-selection/methods). Below are the selectors currently available in dbt Explorer:
+### Search with selectors
+
+You can search with [selectors](/reference/node-selection/methods). Below are the selectors currently available in dbt Explorer:
- `fqn:` — Find resources by [file or fully qualified name](/reference/node-selection/methods#the-fqn-method). This selector is the search bar's default. If you want to use the default, it's unnecessary to add `fqn:` before the search term.
- `source:` — Find resources by a specified [source](/reference/node-selection/methods#the-source-method).
@@ -91,23 +105,15 @@ You can search with [selector methods](/reference/node-selection/methods). Below
-### Search with graph operators
-
-You can use [graph operators](/reference/node-selection/graph-operators) on keywords or selector methods. For example, `+orders` returns all the parents of `orders`.
+Because selector searches return a fixed set of results, the filter side panel is not available with this search method.
-### Search with set operators
+When searching with selector methods, you can also use [graph operators](/reference/node-selection/graph-operators). For example, `+orders` returns all the parents of `orders`. This functionality is not available for keyword search.
You can use multiple selector methods in your search query with [set operators](/reference/node-selection/set-operators). A space implies a union set operator and a comma for an intersection. For example:
- `resource_type:metric,tag:nightly` — Returns metrics with the tag `nightly`
- `+snowplow_sessions +fct_orders` — Returns resources that are parent nodes of either `snowplow_sessions` or `fct_orders`
-### Search with both keywords and selector methods
-
-You can use keyword search to highlight results that are filtered by the selector search. For example, if you don't have a resource called `customers`, then `resource_type:metric customers` returns all the metrics in your project and highlights those that are related to the term `customers` in the name, in a column, tagged as customers, and so on.
-
-When searching in this way, the selectors behave as filters that you can use to narrow the search and keywords as a way to find matches within those filtered results.
-
-
+
## Browse with the sidebar
@@ -120,7 +126,7 @@ To browse using a different view, you can choose one of these options from the *
- **File Tree** — All resources in the project organized by the file in which they are defined. This mirrors the file tree in your dbt project repository.
- **Database** — All resources in the project organized by the database and schema in which they are built. This mirrors your data platform's structure that represents the [applied state](/docs/dbt-cloud-apis/project-state) of your project.
-
+
## View model versions
@@ -132,7 +138,7 @@ You can view the definition and latest run results of any resource in your proje
The details (metadata) available to you depends on the resource’s type, its definition, and the [commands](/docs/deploy/job-commands) that run within jobs in the production environment.
-
+
### Example of model details
@@ -143,11 +149,11 @@ An example of the details you might get for a model:
- **Lineage** graph — The model’s lineage graph that you can interact with. The graph includes one parent node and one child node from the model. Click the Expand icon in the graph's upper right corner to view the model in full lineage graph mode.
- **Description** section — A [description of the model](/docs/collaborate/documentation#adding-descriptions-to-your-project).
- **Recent** section — Information on the last time the model ran, how long it ran for, whether the run was successful, the job ID, and the run ID.
- - **Tests** section — [Tests](/docs/build/tests) for the model.
+ - **Tests** section — [Tests](/docs/build/tests) for the model, including an indicator for the latest test status. A :white_check_mark: denotes a passing test.
- **Details** section — Key properties like the model’s relation name (for example, how it’s represented and how you can query it in the data platform: `database.schema.identifier`); model governance attributes like access, group, and if contracted; and more.
- **Relationships** section — The nodes the model **Depends On**, is **Referenced by**, and (if applicable) is **Used by** for projects that have declared the models' project as a dependency.
- **Code** tab — The source code and compiled code for the model.
-- **Columns** tab — The available columns in the model. This tab also shows tests results (if any) that you can select to view the test's details page. A :white_check_mark: denotes a passing test.
+- **Columns** tab — The available columns in the model. This tab also shows test results (if any) that you can select to view the test's details page. A :white_check_mark: denotes a passing test. To filter the columns in the resource, use the search bar at the top of the columns view.
### Example of exposure details
@@ -189,47 +195,6 @@ An example of the details you might get for each source table within a source co
- **Relationships** section — A table that lists all the sources used with their freshness status, the timestamp of when freshness was last checked, and the timestamp of when the source was last loaded.
- **Columns** tab — The available columns in the source. This tab also shows tests results (if any) that you can select to view the test's details page. A :white_check_mark: denotes a passing test.
-## About project-level lineage
-You can also view all the different projects and public models in the account, where the public models are defined, and how they are used to gain a better understanding about your cross-project resources.
-
-When viewing the resource-level lineage graph for a given project that uses cross-project references, you can see cross-project relationships represented in the DAG. The iconography is slightly different depending on whether you're viewing the lineage of an upstream producer project or a downstream consumer project.
-
-When viewing an upstream (parent) project that produces public models that are imported by downstream (child) projects, public models will have a counter icon in their upper right corner that indicates the number of projects that declare the current project as a dependency. Selecting that model reveals the lineage to show the specific projects that are dependent on this model. Projects show up in this counter if they declare the parent project as a dependency in its `dependencies.yml` regardless of whether or not there's a direct `{{ ref() }}` against the public model. Selecting a project node from a public model opens the resource-level lineage graph for that project, which is subject to your permissions.
-
-
-
-When viewing a downstream (child) project that imports and refs public models from upstream (parent) projects, public models will show up in the lineage graph and display an icon on the graph edge that indicates what the relationship is to a model from another project. Hovering over this icon indicates the specific dbt Cloud project that produces that model. Double-clicking on a model from another project opens the resource-level lineage graph of the parent project, which is subject to your permissions.
-
-
-
-
-### Explore the project-level lineage graph
-
-For cross-project collaboration, you can interact with the DAG in all the same ways as described in [Explore your project's lineage](#project-lineage) but you can also interact with it at the project level and view the details.
-
-To get a list view of all the projects, select the account name at the top of the **Explore** page near the navigation bar. This view includes a public model list, project list, and a search bar for project searches. You can also view the project-level lineage graph by clicking the Lineage view icon in the page's upper right corner.
-
-If you have permissions for a project in the account, you can view all public models used across the entire account. However, you can only view full public model details and private models if you have permissions for a project where the models are defined.
-
-From the project-level lineage graph, you can:
-
-- Click the Lineage view icon (in the graph’s upper right corner) to view the cross-project lineage graph.
-- Click the List view icon (in the graph’s upper right corner) to view the project list.
- - Select a project from the **Projects** tab to switch to that project’s main **Explore** page.
- - Select a model from the **Public Models** tab to view the [model’s details page](#view-resource-details).
- - Perform searches on your projects with the search bar.
-- Select a project node in the graph (double-clicking) to switch to that particular project’s lineage graph.
-
-When you select a project node in the graph, a project details panel opens on the graph’s right-hand side where you can:
-
-- View counts of the resources defined in the project.
-- View a list of its public models, if any.
-- View a list of other projects that uses the project, if any.
-- Click **Open Project Lineage** to switch to the project’s lineage graph.
-- Click the Share icon to copy the project panel link to your clipboard so you can share the graph with someone.
-
-
-
## Related content
- [Enterprise permissions](/docs/cloud/manage-access/enterprise-permissions)
- [About model governance](/docs/collaborate/govern/about-model-governance)
diff --git a/website/docs/docs/collaborate/govern/model-contracts.md b/website/docs/docs/collaborate/govern/model-contracts.md
index 442a20df1b6..342d86c1a77 100644
--- a/website/docs/docs/collaborate/govern/model-contracts.md
+++ b/website/docs/docs/collaborate/govern/model-contracts.md
@@ -125,8 +125,8 @@ Select the adapter-specific tab for more information on [constraint](/reference/
| Constraint type | Support | Platform enforcement |
|:-----------------|:-------------|:---------------------|
| not_null | ✅ Supported | ✅ Enforced |
-| primary_key | ✅ Supported | ✅ Enforced |
-| foreign_key | ✅ Supported | ✅ Enforced |
+| primary_key | ✅ Supported | ❌ Not enforced |
+| foreign_key | ✅ Supported | ❌ Not enforced |
| unique | ❌ Not supported | ❌ Not enforced |
| check | ❌ Not supported | ❌ Not enforced |
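+
+For illustration, these constraints might be declared on a contracted model like this (a minimal sketch; the model and column names are hypothetical):
+
+```yaml
+models:
+  - name: dim_customers
+    config:
+      contract:
+        enforced: true
+    columns:
+      - name: customer_id
+        data_type: int
+        constraints:
+          - type: not_null    # supported and enforced
+          - type: primary_key # supported, but not enforced by the platform
+```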
diff --git a/website/docs/docs/collaborate/govern/model-versions.md b/website/docs/docs/collaborate/govern/model-versions.md
index 49ed65f9a36..2a79e2f46e7 100644
--- a/website/docs/docs/collaborate/govern/model-versions.md
+++ b/website/docs/docs/collaborate/govern/model-versions.md
@@ -393,6 +393,32 @@ dbt.exceptions.AmbiguousAliasError: Compilation Error
We opted to use `generate_alias_name` for this functionality so that the logic remains accessible to end users, and could be reimplemented with custom logic.
:::
+### Run a model with multiple versions
+
+To run a model with multiple versions, you can use the [`--select` flag](/reference/node-selection/syntax). For example:
+
+- Run all versions of `dim_customers`:
+
+ ```bash
+ dbt run --select dim_customers # Run all versions of the model
+ ```
+- Run only version 2 of `dim_customers`:
+
+ You can use either of the following commands (both achieve the same result):
+
+ ```bash
+ dbt run --select dim_customers.v2 # Run a specific version of the model
+ dbt run --select dim_customers_v2 # Alternative syntax for the specific version
+ ```
+
+- Run the latest version of `dim_customers` using the `--select` flag shorthand:
+
+ ```bash
+    dbt run -s dim_customers,version:latest # Run the latest version of the model
+ ```
+
+These commands provide flexibility in managing and executing different versions of a dbt model.
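+
+These selection patterns assume the versions are declared in the model's YAML properties, for example (a minimal sketch; names and values are illustrative):
+
+```yaml
+models:
+  - name: dim_customers
+    latest_version: 2
+    versions:
+      - v: 1
+      - v: 2
+```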
+
### Optimizing model versions
How you define each model version is completely up to you. While it's easy to start by copy-pasting from one model's SQL definition into another, you should think about _what actually is changing_ from one version to another.
diff --git a/website/docs/docs/collaborate/govern/project-dependencies.md b/website/docs/docs/collaborate/govern/project-dependencies.md
index 174e4572890..569d69a87e6 100644
--- a/website/docs/docs/collaborate/govern/project-dependencies.md
+++ b/website/docs/docs/collaborate/govern/project-dependencies.md
@@ -22,8 +22,12 @@ This year, dbt Labs is introducing an expanded notion of `dependencies` across m
- **Packages** — Familiar and pre-existing type of dependency. You take this dependency by installing the package's full source code (like a software library).
- **Projects** — A _new_ way to take a dependency on another project. Using a metadata service that runs behind the scenes, dbt Cloud resolves references on-the-fly to public models defined in other projects. You don't need to parse or run those upstream models yourself. Instead, you treat your dependency on those models as an API that returns a dataset. The maintainer of the public model is responsible for guaranteeing its quality and stability.
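+
+For illustration, both dependency types can be declared together in `dependencies.yml` (a sketch; the package version and upstream project name are hypothetical):
+
+```yaml
+packages:
+  - package: dbt-labs/dbt_utils
+    version: 1.1.1
+
+projects:
+  - name: jaffle_finance
+```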
+import UseCaseInfo from '/snippets/_packages_or_dependencies.md';
+
+
+
+Refer to the [FAQs](#faqs) for more info.
-Starting in dbt v1.6 or higher, `packages.yml` has been renamed to `dependencies.yml`. However, if you need use Jinja within your packages config, such as an environment variable for your private package, you need to keep using `packages.yml` for your packages for now. Refer to the [FAQs](#faqs) for more info.
## Prerequisites
@@ -33,22 +37,6 @@ In order to add project dependencies and resolve cross-project `ref`, you must:
- Have a successful run of the upstream ("producer") project
- Have a multi-tenant or single-tenant [dbt Cloud Enterprise](https://www.getdbt.com/pricing) account (Azure ST is not supported but coming soon)
-
## Example
As an example, let's say you work on the Marketing team at the Jaffle Shop. The name of your team's project is `jaffle_marketing`:
diff --git a/website/docs/docs/collaborate/model-performance.md b/website/docs/docs/collaborate/model-performance.md
new file mode 100644
index 00000000000..7ef675b4e1e
--- /dev/null
+++ b/website/docs/docs/collaborate/model-performance.md
@@ -0,0 +1,41 @@
+---
+title: "Model performance"
+sidebar_label: "Model performance"
+description: "Learn about model performance in dbt Explorer and its uses."
+---
+
+dbt Explorer provides metadata on dbt Cloud runs for in-depth model performance and quality analysis. This feature assists in reducing infrastructure costs and saving time for data teams by highlighting where to fine-tune projects and deployments — such as model refactoring or job configuration adjustments.
+
+
+
+:::tip Beta
+
+The model performance beta feature is now available in dbt Explorer! Check it out!
+:::
+
+## The Performance overview page
+
+You can pinpoint areas for performance enhancement by using the Performance overview page. This page presents a comprehensive analysis across all project models and displays the longest-running models, those most frequently executed, and the ones with the highest failure rates during runs/tests. Data can be segmented by environment and job type, which can offer insights into:
+
+- Most executed models (total count).
+- Models with the longest execution time (average duration).
+- Models with the most failures, detailing run failures (percentage and count) and test failures (percentage and count).
+
+Each data point links to individual models in Explorer.
+
+
+
+You can view historical metadata for up to the past three months. Select the time horizon using the filter, which defaults to a two-week lookback.
+
+
+
+## The Model performance tab
+
+You can view trends in execution times, counts, and failures by using the Model performance tab for historical performance analysis. Daily execution data includes:
+
+- Average model execution time.
+- Model execution counts, including failures/errors (total sum).
+
+Clicking on a data point reveals a table listing all job runs for that day, with each row providing a direct link to the details of a specific run.
+
+
\ No newline at end of file
diff --git a/website/docs/docs/collaborate/project-recommendations.md b/website/docs/docs/collaborate/project-recommendations.md
new file mode 100644
index 00000000000..e6263a875fc
--- /dev/null
+++ b/website/docs/docs/collaborate/project-recommendations.md
@@ -0,0 +1,50 @@
+---
+title: "Project recommendations"
+sidebar_label: "Project recommendations"
+description: "dbt Explorer provides recommendations that you can take to improve the quality of your dbt project."
+---
+
+:::tip Beta
+
+The project recommendations beta feature is now available in dbt Explorer! Check it out!
+
+:::
+
+dbt Explorer provides recommendations about your project from the `dbt_project_evaluator` [package](https://hub.getdbt.com/dbt-labs/dbt_project_evaluator/latest/) using metadata from the Discovery API.
+
+Explorer also offers a global view, showing all the recommendations across the project for easy sorting and summarizing.
+
+These recommendations provide insight into how you can build a more well-documented, well-tested, and well-built project, leading to less confusion and more trust.
+
+The Recommendations overview page includes two top-level metrics measuring the test and documentation coverage of the models in your project.
+
+- **Model test coverage** — The percent of models in your project (models not from a package or imported via dbt Mesh) with at least one dbt test configured on them.
+- **Model documentation coverage** — The percent of models in your project (models not from a package or imported via dbt Mesh) with a description.
+
+
+
+## List of rules
+
+| Category | Name | Description | Package Docs Link |
+| --- | --- | --- | --- |
+| Modeling | Direct Join to Source | Model that joins both a model and source, indicating a missing staging model | [GitHub](https://dbt-labs.github.io/dbt-project-evaluator/0.8/rules/modeling/#direct-join-to-source) |
+| Modeling | Duplicate Sources | More than one source node corresponds to the same data warehouse relation | [GitHub](https://dbt-labs.github.io/dbt-project-evaluator/0.8/rules/modeling/#duplicate-sources) |
+| Modeling | Multiple Sources Joined | Models with more than one source parent, indicating lack of staging models | [GitHub](https://dbt-labs.github.io/dbt-project-evaluator/0.8/rules/modeling/#multiple-sources-joined) |
+| Modeling | Root Model | Models with no parents, indicating potential hardcoded references and need for sources | [GitHub](https://dbt-labs.github.io/dbt-project-evaluator/0.8/rules/modeling/#root-models) |
+| Modeling | Source Fanout | Sources with more than one model child, indicating a need for staging models | [GitHub](https://dbt-labs.github.io/dbt-project-evaluator/0.8/rules/modeling/#source-fanout) |
+| Modeling | Unused Source | Sources that are not referenced by any resource | [GitHub](https://dbt-labs.github.io/dbt-project-evaluator/0.8/rules/modeling/#unused-sources) |
+| Performance | Exposure Dependent on View | Exposures with at least one model parent materialized as a view, indicating potential query performance issues | [GitHub](https://dbt-labs.github.io/dbt-project-evaluator/0.8/rules/performance/#exposure-parents-materializations) |
+| Testing | Missing Primary Key Test | Models with insufficient testing on the grain of the model | [GitHub](https://dbt-labs.github.io/dbt-project-evaluator/0.8/rules/testing/#missing-primary-key-tests) |
+| Documentation | Undocumented Models | Models without a model-level description | [GitHub](https://dbt-labs.github.io/dbt-project-evaluator/0.8/rules/documentation/#undocumented-models) |
+| Documentation | Undocumented Source | Sources (collections of source tables) without descriptions | [GitHub](https://dbt-labs.github.io/dbt-project-evaluator/0.8/rules/documentation/#undocumented-sources) |
+| Documentation | Undocumented Source Tables | Source tables without descriptions | [GitHub](https://dbt-labs.github.io/dbt-project-evaluator/0.8/rules/documentation/#undocumented-source-tables) |
+| Governance | Public Model Missing Contract | Models with public access that do not have a model contract to ensure the data types | [GitHub](https://dbt-labs.github.io/dbt-project-evaluator/0.8/rules/governance/#public-models-without-contracts) |
+
+
+## The Recommendations tab
+
+Models, sources, and exposures each have a **Recommendations** tab on their resource details page, showing the specific recommendations that correspond to that resource:
+
+
+
+
diff --git a/website/docs/docs/community-adapters.md b/website/docs/docs/community-adapters.md
index 444ea0e04b4..d1e63f03128 100644
--- a/website/docs/docs/community-adapters.md
+++ b/website/docs/docs/community-adapters.md
@@ -17,4 +17,4 @@ Community adapters are adapter plugins contributed and maintained by members of
| [TiDB](/docs/core/connect-data-platform/tidb-setup) | [Firebolt](/docs/core/connect-data-platform/firebolt-setup) | [MindsDB](/docs/core/connect-data-platform/mindsdb-setup)
| [Vertica](/docs/core/connect-data-platform/vertica-setup) | [AWS Glue](/docs/core/connect-data-platform/glue-setup) | [MySQL](/docs/core/connect-data-platform/mysql-setup) |
| [Upsolver](/docs/core/connect-data-platform/upsolver-setup) | [Databend Cloud](/docs/core/connect-data-platform/databend-setup) | [fal - Python models](/docs/core/connect-data-platform/fal-setup) |
-
+| [TimescaleDB](https://dbt-timescaledb.debruyn.dev/) | | |
diff --git a/website/docs/docs/connect-adapters.md b/website/docs/docs/connect-adapters.md
index e301cfc237e..56ff538dc9b 100644
--- a/website/docs/docs/connect-adapters.md
+++ b/website/docs/docs/connect-adapters.md
@@ -15,7 +15,7 @@ Explore the fastest and most reliable way to deploy dbt using dbt Cloud, a hoste
Install dbt Core, an open-source tool, locally using the command line. dbt communicates with a number of different data platforms by using a dedicated adapter plugin for each. When you install dbt Core, you'll also need to install the specific adapter for your database, [connect to dbt Core](/docs/core/about-core-setup), and set up a `profiles.yml` file.
-With a few exceptions [^1], you can install all [Verified adapters](/docs/supported-data-platforms) from PyPI using `pip install adapter-name`. For example to install Snowflake, use the command `pip install dbt-snowflake`. The installation will include `dbt-core` and any other required dependencies, which may include both other dependencies and even other adapter plugins. Read more about [installing dbt](/docs/core/installation).
+With a few exceptions [^1], you can install all [Verified adapters](/docs/supported-data-platforms) from PyPI using `python -m pip install adapter-name`. For example, to install the Snowflake adapter, use the command `python -m pip install dbt-snowflake`. The installation includes `dbt-core` and any other required dependencies, which may include other adapter plugins. Read more about [installing dbt](/docs/core/installation-overview).
[^1]: Here are the two different adapters. Use the PyPI package name when installing with `pip`
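Because each adapter ships as a separate PyPI distribution alongside `dbt-core`, you can check what is present in the active environment by scanning installed distributions. A minimal sketch using only the standard library (the `dbt-` name-prefix convention is an assumption that holds for the adapters listed here, but is not enforced by PyPI):

```python
from importlib.metadata import distributions

def installed_dbt_packages():
    """Return a sorted list of installed dbt-related distributions,
    e.g. dbt-core plus adapter plugins such as dbt-snowflake.
    Returns an empty list if none are installed."""
    names = {
        dist.metadata["Name"]
        for dist in distributions()
        if dist.metadata["Name"] and dist.metadata["Name"].lower().startswith("dbt-")
    }
    return sorted(names)

print(installed_dbt_packages())
```

This is handy for debugging environment mix-ups, such as an adapter installed into a different virtual environment than the one `dbt` runs from.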
diff --git a/website/docs/docs/core/about-core-setup.md b/website/docs/docs/core/about-core-setup.md
index 64e7694b793..8b170ba70d4 100644
--- a/website/docs/docs/core/about-core-setup.md
+++ b/website/docs/docs/core/about-core-setup.md
@@ -3,7 +3,7 @@ title: About dbt Core setup
id: about-core-setup
description: "Configuration settings for dbt Core."
sidebar_label: "About dbt Core setup"
-pagination_next: "docs/core/about-dbt-core"
+pagination_next: "docs/core/dbt-core-environments"
pagination_prev: null
---
@@ -11,9 +11,10 @@ dbt Core is an [open-source](https://github.com/dbt-labs/dbt-core) tool that ena
This section of our docs will guide you through various settings to get started:
-- [About dbt Core](/docs/core/about-dbt-core)
-- [Installing dbt](/docs/core/installation)
- [Connecting to a data platform](/docs/core/connect-data-platform/profiles.yml)
- [How to run your dbt projects](/docs/running-a-dbt-project/run-your-dbt-projects)
+To learn about developing dbt projects in dbt Cloud, refer to [Develop with dbt Cloud](/docs/cloud/about-develop-dbt).
+ - dbt Cloud provides a command line interface with the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation). Both dbt Core and the dbt Cloud CLI are command line tools that let you run dbt commands. The key distinction is that the dbt Cloud CLI is tailored for dbt Cloud's infrastructure and integrates with all its [features](/docs/cloud/about-cloud/dbt-cloud-features).
+
If you need a more detailed first-time setup guide for specific data platforms, read our [quickstart guides](https://docs.getdbt.com/guides).
diff --git a/website/docs/docs/core/about-dbt-core.md b/website/docs/docs/core/about-dbt-core.md
deleted file mode 100644
index a35d92420f3..00000000000
--- a/website/docs/docs/core/about-dbt-core.md
+++ /dev/null
@@ -1,25 +0,0 @@
----
-title: "About dbt Core"
-id: "about-dbt-core"
-sidebar_label: "About dbt Core"
----
-
-[dbt Core](https://github.com/dbt-labs/dbt-core) is an open sourced project where you can develop from the command line and run your dbt project.
-
-To use dbt Core, your workflow generally looks like:
-
-1. **Build your dbt project in a code editor —** popular choices include VSCode and Atom.
-
-2. **Run your project from the command line —** macOS ships with a default Terminal program, however you can also use iTerm or the command line prompt within a code editor to execute dbt commands.
-
-:::info How we set up our computers for working on dbt projects
-
-We've written a [guide](https://discourse.getdbt.com/t/how-we-set-up-our-computers-for-working-on-dbt-projects/243) for our recommended setup when running dbt projects using dbt Core.
-
-:::
-
-If you're using the command line, we recommend learning some basics of your terminal to help you work more effectively. In particular, it's important to understand `cd`, `ls` and `pwd` to be able to navigate through the directory structure of your computer easily.
-
-You can find more information on installing and setting up the dbt Core [here](/docs/core/installation).
-
-**Note** — dbt supports a dbt Cloud CLI and dbt Core, both command line interface tools that enable you to run dbt commands. The key distinction is the dbt Cloud CLI is tailored for dbt Cloud's infrastructure and integrates with all its [features](/docs/cloud/about-cloud/dbt-cloud-features).
diff --git a/website/docs/docs/core/connect-data-platform/about-core-connections.md b/website/docs/docs/core/connect-data-platform/about-core-connections.md
index 492e5ae878a..61a7805d232 100644
--- a/website/docs/docs/core/connect-data-platform/about-core-connections.md
+++ b/website/docs/docs/core/connect-data-platform/about-core-connections.md
@@ -14,6 +14,7 @@ dbt Core can connect with a variety of data platform providers including:
- [Apache Spark](/docs/core/connect-data-platform/spark-setup)
- [Databricks](/docs/core/connect-data-platform/databricks-setup)
- [Google BigQuery](/docs/core/connect-data-platform/bigquery-setup)
+- [Microsoft Fabric](/docs/core/connect-data-platform/fabric-setup)
- [PostgreSQL](/docs/core/connect-data-platform/postgres-setup)
- [Snowflake](/docs/core/connect-data-platform/snowflake-setup)
- [Starburst or Trino](/docs/core/connect-data-platform/trino-setup)
diff --git a/website/docs/docs/core/connect-data-platform/alloydb-setup.md b/website/docs/docs/core/connect-data-platform/alloydb-setup.md
index c01ba06d887..cbfecb48169 100644
--- a/website/docs/docs/core/connect-data-platform/alloydb-setup.md
+++ b/website/docs/docs/core/connect-data-platform/alloydb-setup.md
@@ -14,18 +14,10 @@ meta:
config_page: '/reference/resource-configs/postgres-configs'
---
-## Overview of AlloyDB support
+import SetUpPages from '/snippets/_setup-pages-intro.md';
+
+
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package:
{frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
## Profile Configuration
diff --git a/website/docs/docs/core/connect-data-platform/athena-setup.md b/website/docs/docs/core/connect-data-platform/athena-setup.md
index db218110dc1..468ba7a7847 100644
--- a/website/docs/docs/core/connect-data-platform/athena-setup.md
+++ b/website/docs/docs/core/connect-data-platform/athena-setup.md
@@ -15,32 +15,11 @@ meta:
config_page: '/reference/resource-configs/no-configs'
---
- Overview of {frontMatter.meta.pypi_package}
+
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package:
{frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
+import SetUpPages from '/snippets/_setup-pages-intro.md';
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package}
will also install dbt-core
and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+
## Connecting to Athena with dbt-athena
diff --git a/website/docs/docs/core/connect-data-platform/azuresynapse-setup.md b/website/docs/docs/core/connect-data-platform/azuresynapse-setup.md
index 073e95530c1..8a4d6b61004 100644
--- a/website/docs/docs/core/connect-data-platform/azuresynapse-setup.md
+++ b/website/docs/docs/core/connect-data-platform/azuresynapse-setup.md
@@ -24,32 +24,11 @@ Refer to [Microsoft Fabric Synapse Data Warehouse](/docs/core/connect-data-platf
:::
- Overview of {frontMatter.meta.pypi_package}
+
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package:
{frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
+import SetUpPages from '/snippets/_setup-pages-intro.md';
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package}
will also install dbt-core
and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+
:::info Dedicated SQL only
diff --git a/website/docs/docs/core/connect-data-platform/bigquery-setup.md b/website/docs/docs/core/connect-data-platform/bigquery-setup.md
index 96eafadea3b..8238bc043c4 100644
--- a/website/docs/docs/core/connect-data-platform/bigquery-setup.md
+++ b/website/docs/docs/core/connect-data-platform/bigquery-setup.md
@@ -18,33 +18,9 @@ meta:
- Overview of {frontMatter.meta.pypi_package}
-
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package:
{frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package}
will also install dbt-core
and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
+
## Authentication Methods
diff --git a/website/docs/docs/core/connect-data-platform/clickhouse-setup.md b/website/docs/docs/core/connect-data-platform/clickhouse-setup.md
index fb0965398a2..fce367be812 100644
--- a/website/docs/docs/core/connect-data-platform/clickhouse-setup.md
+++ b/website/docs/docs/core/connect-data-platform/clickhouse-setup.md
@@ -17,34 +17,9 @@ meta:
Some core functionality may be limited. If you're interested in contributing, check out the source code for each repository listed below.
+import SetUpPages from '/snippets/_setup-pages-intro.md';
- Overview of {frontMatter.meta.pypi_package}
-
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package:
{frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package}
will also install dbt-core
and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+
## Connecting to ClickHouse with **dbt-clickhouse**
diff --git a/website/docs/docs/core/connect-data-platform/databend-setup.md b/website/docs/docs/core/connect-data-platform/databend-setup.md
index daccd14f6c3..5442327fb27 100644
--- a/website/docs/docs/core/connect-data-platform/databend-setup.md
+++ b/website/docs/docs/core/connect-data-platform/databend-setup.md
@@ -22,34 +22,9 @@ If you're interested in contributing, check out the source code repository liste
:::
- Overview of {frontMatter.meta.pypi_package}
-
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package:
{frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package}
will also install dbt-core
and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
+
## Connecting to Databend Cloud with **dbt-databend-cloud**
diff --git a/website/docs/docs/core/connect-data-platform/databricks-setup.md b/website/docs/docs/core/connect-data-platform/databricks-setup.md
index caf52d09de3..1ea6afda370 100644
--- a/website/docs/docs/core/connect-data-platform/databricks-setup.md
+++ b/website/docs/docs/core/connect-data-platform/databricks-setup.md
@@ -18,34 +18,11 @@ meta:
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package:
{frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
+
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package}
will also install dbt-core
and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
-
`dbt-databricks` is the recommended adapter for Databricks. It includes features not available in `dbt-spark`, such as:
- Unity Catalog support
- No need to install additional drivers or dependencies for use on the CLI
diff --git a/website/docs/docs/core/connect-data-platform/decodable-setup.md b/website/docs/docs/core/connect-data-platform/decodable-setup.md
index b43521732d4..6c3cb487885 100644
--- a/website/docs/docs/core/connect-data-platform/decodable-setup.md
+++ b/website/docs/docs/core/connect-data-platform/decodable-setup.md
@@ -21,35 +21,9 @@ meta:
Some core functionality may be limited. If you're interested in contributing, see the source code for the repository listed below.
:::
- Overview of {frontMatter.meta.pypi_package}
-
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package:
{frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version}
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
-
- Installing {frontMatter.meta.pypi_package}
-
-dbt-decodable is also available on PyPI. pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-
-Installing {frontMatter.meta.pypi_package}
will also install dbt-core
and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration.
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
+
## Connecting to Decodable with **dbt-decodable**
Do the following steps to connect to Decodable with dbt.
diff --git a/website/docs/docs/core/connect-data-platform/doris-setup.md b/website/docs/docs/core/connect-data-platform/doris-setup.md
index a7e2ba1ba3e..a3e5364d907 100644
--- a/website/docs/docs/core/connect-data-platform/doris-setup.md
+++ b/website/docs/docs/core/connect-data-platform/doris-setup.md
@@ -4,8 +4,8 @@ description: "Read this guide to learn about the Doris warehouse setup in dbt."
id: "doris-setup"
meta:
maintained_by: SelectDB
- authors: long2ice,catpineapple
- github_repo: 'selectdb/dbt-selectdb'
+ authors: catpineapple,JNSimba
+ github_repo: 'selectdb/dbt-doris'
pypi_package: 'dbt-doris'
min_core_version: 'v1.3.0'
cloud_support: Not Supported
@@ -15,33 +15,9 @@ meta:
config_page: '/reference/resource-configs/doris-configs'
---
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package:
{frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package}
will also install dbt-core
and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+
## Connecting to Doris/SelectDB with **dbt-doris**
diff --git a/website/docs/docs/core/connect-data-platform/dremio-setup.md b/website/docs/docs/core/connect-data-platform/dremio-setup.md
index fa6ca154fcd..839dd8cffa8 100644
--- a/website/docs/docs/core/connect-data-platform/dremio-setup.md
+++ b/website/docs/docs/core/connect-data-platform/dremio-setup.md
@@ -21,33 +21,9 @@ Some core functionality may be limited. If you're interested in contributing, ch
:::
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package:
{frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package}
will also install dbt-core
and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+
Follow the repository's link for OS dependencies.
@@ -62,7 +38,6 @@ Before connecting from project to Dremio Cloud, follow these prerequisite steps:
* Ensure that Python 3.9.x or later is installed on the system that you are running dbt on.
-
## Prerequisites for Dremio Software
* Ensure that you are using version 22.0 or later.
diff --git a/website/docs/docs/core/connect-data-platform/duckdb-setup.md b/website/docs/docs/core/connect-data-platform/duckdb-setup.md
index a3fee5a5164..6e118e54061 100644
--- a/website/docs/docs/core/connect-data-platform/duckdb-setup.md
+++ b/website/docs/docs/core/connect-data-platform/duckdb-setup.md
@@ -21,33 +21,9 @@ Some core functionality may be limited. If you're interested in contributing, ch
:::
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package:
{frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package}
will also install dbt-core
and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+
## Connecting to DuckDB with dbt-duckdb
diff --git a/website/docs/docs/core/connect-data-platform/exasol-setup.md b/website/docs/docs/core/connect-data-platform/exasol-setup.md
index 2bf4cd7ffac..509ccd67e84 100644
--- a/website/docs/docs/core/connect-data-platform/exasol-setup.md
+++ b/website/docs/docs/core/connect-data-platform/exasol-setup.md
@@ -21,34 +21,9 @@ Some core functionality may be limited. If you're interested in contributing, ch
:::
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package:
{frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package}
will also install dbt-core
and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
- dbt-exasol
+
### Connecting to Exasol with **dbt-exasol**
diff --git a/website/docs/docs/core/connect-data-platform/fabric-setup.md b/website/docs/docs/core/connect-data-platform/fabric-setup.md
index ef5a748552d..deef1e04b22 100644
--- a/website/docs/docs/core/connect-data-platform/fabric-setup.md
+++ b/website/docs/docs/core/connect-data-platform/fabric-setup.md
@@ -4,11 +4,11 @@ description: "Read this guide to learn about the Microsoft Fabric Synapse Data W
id: fabric-setup
meta:
maintained_by: Microsoft
- authors: '[Microsoft](https://github.com/Microsoft)'
+ authors: 'Microsoft'
github_repo: 'Microsoft/dbt-fabric'
pypi_package: 'dbt-fabric'
min_core_version: '1.4.0'
- cloud_support: Not Supported
+ cloud_support: Supported
platform_name: 'Microsoft Fabric'
config_page: '/reference/resource-configs/fabric-configs'
---
@@ -21,31 +21,10 @@ To learn how to set up dbt with Azure Synapse Dedicated Pools, refer to [Microso
:::
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package:
{frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
-
+
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package}
will also install dbt-core
and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
### Prerequisites
diff --git a/website/docs/docs/core/connect-data-platform/fal-setup.md b/website/docs/docs/core/connect-data-platform/fal-setup.md
index ef4998e8c1b..76539d67c54 100644
--- a/website/docs/docs/core/connect-data-platform/fal-setup.md
+++ b/website/docs/docs/core/connect-data-platform/fal-setup.md
@@ -21,36 +21,11 @@ Some core functionality may be limited. If you're interested in contributing, ch
:::
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package:
{frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
+
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}[<sql-adapter>]
-
-Installing {frontMatter.meta.pypi_package}
will also install dbt-core
and any other dependencies.
-
-You must install the adapter for SQL transformations and data storage independently from dbt-fal.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
-
## Setting up fal with other adapter
diff --git a/website/docs/docs/core/connect-data-platform/firebolt-setup.md b/website/docs/docs/core/connect-data-platform/firebolt-setup.md
index c7a5a543512..8fb91dea299 100644
--- a/website/docs/docs/core/connect-data-platform/firebolt-setup.md
+++ b/website/docs/docs/core/connect-data-platform/firebolt-setup.md
@@ -19,34 +19,11 @@ meta:
Some core functionality may be limited. If you're interested in contributing, check out the source code for the repository listed below.
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package:
{frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
+
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
-
For other information including Firebolt feature support, see the [GitHub README](https://github.com/firebolt-db/dbt-firebolt/blob/main/README.md) and the [changelog](https://github.com/firebolt-db/dbt-firebolt/blob/main/CHANGELOG.md).
diff --git a/website/docs/docs/core/connect-data-platform/glue-setup.md b/website/docs/docs/core/connect-data-platform/glue-setup.md
index e56e5bcd902..afb95fe6af5 100644
--- a/website/docs/docs/core/connect-data-platform/glue-setup.md
+++ b/website/docs/docs/core/connect-data-platform/glue-setup.md
@@ -22,34 +22,11 @@ Some core functionality may be limited. If you're interested in contributing, ch
:::
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package: {frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
+
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
-
For further (and more likely up-to-date) info, see the [README](https://github.com/aws-samples/dbt-glue#readme)
diff --git a/website/docs/docs/core/connect-data-platform/greenplum-setup.md b/website/docs/docs/core/connect-data-platform/greenplum-setup.md
index 06ada19a1e9..523a503b128 100644
--- a/website/docs/docs/core/connect-data-platform/greenplum-setup.md
+++ b/website/docs/docs/core/connect-data-platform/greenplum-setup.md
@@ -16,34 +16,11 @@ meta:
config_page: '/reference/resource-configs/greenplum-configs'
---
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package: {frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
+
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
-
For further (and more likely up-to-date) info, see the [README](https://github.com/markporoshin/dbt-greenplum#README.md)
diff --git a/website/docs/docs/core/connect-data-platform/hive-setup.md b/website/docs/docs/core/connect-data-platform/hive-setup.md
index 61a929c58da..33e45e28a0d 100644
--- a/website/docs/docs/core/connect-data-platform/hive-setup.md
+++ b/website/docs/docs/core/connect-data-platform/hive-setup.md
@@ -16,34 +16,11 @@ meta:
config_page: '/reference/resource-configs/hive-configs'
---
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package: {frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
+
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
-
## Connection Methods
@@ -154,7 +131,7 @@ you must install the `dbt-hive` plugin.
The following commands will install the latest version of `dbt-hive` as well as the requisite version of `dbt-core` and `impyla` driver used for connections.
```
-pip install dbt-hive
+python -m pip install dbt-hive
```
### Supported Functionality
diff --git a/website/docs/docs/core/connect-data-platform/ibmdb2-setup.md b/website/docs/docs/core/connect-data-platform/ibmdb2-setup.md
index cb6c7459418..692342466b0 100644
--- a/website/docs/docs/core/connect-data-platform/ibmdb2-setup.md
+++ b/website/docs/docs/core/connect-data-platform/ibmdb2-setup.md
@@ -22,34 +22,11 @@ Some core functionality may be limited. If you're interested in contributing, ch
:::
-## Overview of dbt-ibmdb2
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package: {frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
+
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
-
This is an experimental plugin:
- We have not tested it extensively
diff --git a/website/docs/docs/core/connect-data-platform/impala-setup.md b/website/docs/docs/core/connect-data-platform/impala-setup.md
index 0a0f1b955a1..df82cab6563 100644
--- a/website/docs/docs/core/connect-data-platform/impala-setup.md
+++ b/website/docs/docs/core/connect-data-platform/impala-setup.md
@@ -16,33 +16,9 @@ meta:
config_page: '/reference/resource-configs/impala-configs'
---
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package: {frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+
## Connection Methods
diff --git a/website/docs/docs/core/connect-data-platform/infer-setup.md b/website/docs/docs/core/connect-data-platform/infer-setup.md
index 430c5e47f85..7642c553cc4 100644
--- a/website/docs/docs/core/connect-data-platform/infer-setup.md
+++ b/website/docs/docs/core/connect-data-platform/infer-setup.md
@@ -16,32 +16,11 @@ meta:
min_supported_version: n/a
---
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package: {frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
+
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
-
## Connecting to Infer with **dbt-infer**
diff --git a/website/docs/docs/core/connect-data-platform/iomete-setup.md b/website/docs/docs/core/connect-data-platform/iomete-setup.md
index bc015141c85..2f2d18b1e47 100644
--- a/website/docs/docs/core/connect-data-platform/iomete-setup.md
+++ b/website/docs/docs/core/connect-data-platform/iomete-setup.md
@@ -16,35 +16,10 @@ meta:
config_page: '/reference/resource-configs/no-configs'
---
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package: {frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
+
-## Installation and Distribution
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
Set up a iomete Target
diff --git a/website/docs/docs/core/connect-data-platform/layer-setup.md b/website/docs/docs/core/connect-data-platform/layer-setup.md
index f065c0c7313..051094297a2 100644
--- a/website/docs/docs/core/connect-data-platform/layer-setup.md
+++ b/website/docs/docs/core/connect-data-platform/layer-setup.md
@@ -17,34 +17,9 @@ meta:
---
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package: {frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+
### Profile Configuration
diff --git a/website/docs/docs/core/connect-data-platform/materialize-setup.md b/website/docs/docs/core/connect-data-platform/materialize-setup.md
index c8777c29490..70505fe1d65 100644
--- a/website/docs/docs/core/connect-data-platform/materialize-setup.md
+++ b/website/docs/docs/core/connect-data-platform/materialize-setup.md
@@ -6,7 +6,7 @@ meta:
maintained_by: Materialize Inc.
pypi_package: 'dbt-materialize'
authors: 'Materialize team'
- github_repo: 'MaterializeInc/materialize/blob/main/misc/dbt-materialize'
+ github_repo: 'MaterializeInc/materialize'
min_core_version: 'v0.18.1'
min_supported_version: 'v0.28.0'
cloud_support: Not Supported
@@ -22,32 +22,9 @@ Certain core functionality may vary. If you would like to report a bug, request
:::
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package: {frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration, please refer to {frontMatter.meta.platform_name} Configuration.
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+
## Connecting to Materialize
diff --git a/website/docs/docs/core/connect-data-platform/mindsdb-setup.md b/website/docs/docs/core/connect-data-platform/mindsdb-setup.md
index e6b8c5decaa..47d9d311ff9 100644
--- a/website/docs/docs/core/connect-data-platform/mindsdb-setup.md
+++ b/website/docs/docs/core/connect-data-platform/mindsdb-setup.md
@@ -19,35 +19,9 @@ meta:
The dbt-mindsdb package allows dbt to connect to [MindsDB](https://github.com/mindsdb/mindsdb).
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package: {frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
-## Installation
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+
## Configurations
diff --git a/website/docs/docs/core/connect-data-platform/mssql-setup.md b/website/docs/docs/core/connect-data-platform/mssql-setup.md
index 5efcc454823..f58827c3554 100644
--- a/website/docs/docs/core/connect-data-platform/mssql-setup.md
+++ b/website/docs/docs/core/connect-data-platform/mssql-setup.md
@@ -22,33 +22,9 @@ Some core functionality may be limited. If you're interested in contributing, ch
:::
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package: {frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+
:::tip Default settings change in dbt-sqlserver v1.2 / ODBC Driver 18
diff --git a/website/docs/docs/core/connect-data-platform/mysql-setup.md b/website/docs/docs/core/connect-data-platform/mysql-setup.md
index 1df6e205272..4b9224e0a0d 100644
--- a/website/docs/docs/core/connect-data-platform/mysql-setup.md
+++ b/website/docs/docs/core/connect-data-platform/mysql-setup.md
@@ -22,32 +22,9 @@ Some core functionality may be limited. If you're interested in contributing, ch
:::
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package: {frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+
This is an experimental plugin:
- It has not been tested extensively.
diff --git a/website/docs/docs/core/connect-data-platform/oracle-setup.md b/website/docs/docs/core/connect-data-platform/oracle-setup.md
index b1195fbd0a0..31e41f1a9a7 100644
--- a/website/docs/docs/core/connect-data-platform/oracle-setup.md
+++ b/website/docs/docs/core/connect-data-platform/oracle-setup.md
@@ -16,35 +16,10 @@ meta:
config_page: '/reference/resource-configs/oracle-configs'
---
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package: {frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
+
-## Installation
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
### Configure the Python driver mode
diff --git a/website/docs/docs/core/connect-data-platform/postgres-setup.md b/website/docs/docs/core/connect-data-platform/postgres-setup.md
index f56d3f22576..ec03a205568 100644
--- a/website/docs/docs/core/connect-data-platform/postgres-setup.md
+++ b/website/docs/docs/core/connect-data-platform/postgres-setup.md
@@ -18,33 +18,9 @@ meta:
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package: {frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+
## Profile Configuration
diff --git a/website/docs/docs/core/connect-data-platform/profiles.yml.md b/website/docs/docs/core/connect-data-platform/profiles.yml.md
index 97254dda1c4..f8acb65f3d2 100644
--- a/website/docs/docs/core/connect-data-platform/profiles.yml.md
+++ b/website/docs/docs/core/connect-data-platform/profiles.yml.md
@@ -3,7 +3,7 @@ title: "About profiles.yml"
id: profiles.yml
---
-If you're using [dbt Core](/docs/core/about-dbt-core), you'll need a `profiles.yml` file that contains the connection details for your data platform. When you run dbt Core from the command line, it reads your `dbt_project.yml` file to find the `profile` name, and then looks for a profile with the same name in your `profiles.yml` file. This profile contains all the information dbt needs to connect to your data platform.
+If you're using [dbt Core](/docs/core/installation-overview), you'll need a `profiles.yml` file that contains the connection details for your data platform. When you run dbt Core from the command line, it reads your `dbt_project.yml` file to find the `profile` name, and then looks for a profile with the same name in your `profiles.yml` file. This profile contains all the information dbt needs to connect to your data platform.
For detailed info, you can refer to the [Connection profiles](/docs/core/connect-data-platform/connection-profiles).
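
The profile lookup described above can be sketched with a minimal `profiles.yml`. This is an illustrative fragment, not part of the patch: the profile name `jaffle_shop` must match the `profile:` key in `dbt_project.yml`, and the adapter type, credentials, and schema names are placeholder assumptions.

```yaml
# Hypothetical minimal profiles.yml; all values below are placeholders.
jaffle_shop:          # must match `profile:` in dbt_project.yml
  target: dev         # default target dbt uses when none is passed via --target
  outputs:
    dev:
      type: postgres  # adapter type; swap for your data platform
      host: localhost
      port: 5432
      user: dbt_user
      password: dbt_password
      dbname: analytics
      schema: dbt_dev
```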
diff --git a/website/docs/docs/core/connect-data-platform/redshift-setup.md b/website/docs/docs/core/connect-data-platform/redshift-setup.md
index 175d5f6a715..464d3b084d8 100644
--- a/website/docs/docs/core/connect-data-platform/redshift-setup.md
+++ b/website/docs/docs/core/connect-data-platform/redshift-setup.md
@@ -18,33 +18,9 @@ meta:
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package: {frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specific configuration, refer to {frontMatter.meta.platform_name} Configuration.
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}.
+
## Configurations
@@ -70,8 +46,9 @@ pip is the easiest way to install the adapter:
The authentication methods that dbt Core supports are:
- `database` — Password-based authentication (default, will be used if `method` is not provided)
-- `IAM` — IAM
+- `IAM` — IAM
+For dbt Cloud users, log in using the default **Database username** and **password**. This is necessary because dbt Cloud does not support `IAM` authentication.
Click on one of these authentication methods for further details on how to configure your connection profile. Each tab also includes an example `profiles.yml` configuration file for you to review.
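
As a sketch of the password-based method the list above calls `database`, the following `profiles.yml` target is illustrative only; the host, user, password, and database names are placeholder assumptions.

```yaml
# Illustrative dbt-redshift target using password-based ("database") auth.
my_redshift_profile:
  target: dev
  outputs:
    dev:
      type: redshift
      method: database   # the default; omitting `method` has the same effect
      host: example-cluster.abc123.us-east-1.redshift.amazonaws.com  # placeholder
      port: 5439
      user: dbt_user
      password: dbt_password
      dbname: analytics
      schema: dbt_dev
```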
diff --git a/website/docs/docs/core/connect-data-platform/rockset-setup.md b/website/docs/docs/core/connect-data-platform/rockset-setup.md
index 4a146829a03..372a6c0c538 100644
--- a/website/docs/docs/core/connect-data-platform/rockset-setup.md
+++ b/website/docs/docs/core/connect-data-platform/rockset-setup.md
@@ -22,33 +22,9 @@ Certain core functionality may vary. If you would like to report a bug, request
:::
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package: {frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+
## Connecting to Rockset with **dbt-rockset**
diff --git a/website/docs/docs/core/connect-data-platform/singlestore-setup.md b/website/docs/docs/core/connect-data-platform/singlestore-setup.md
index a63466542a9..285c41bafc9 100644
--- a/website/docs/docs/core/connect-data-platform/singlestore-setup.md
+++ b/website/docs/docs/core/connect-data-platform/singlestore-setup.md
@@ -22,35 +22,9 @@ Certain core functionality may vary. If you would like to report a bug, request
:::
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package: {frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
-## Installation and Distribution
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+
### Set up a SingleStore Target
diff --git a/website/docs/docs/core/connect-data-platform/snowflake-setup.md b/website/docs/docs/core/connect-data-platform/snowflake-setup.md
index 98bcf447fed..2b426ef667b 100644
--- a/website/docs/docs/core/connect-data-platform/snowflake-setup.md
+++ b/website/docs/docs/core/connect-data-platform/snowflake-setup.md
@@ -18,33 +18,9 @@ meta:
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package: {frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+
## Authentication Methods
diff --git a/website/docs/docs/core/connect-data-platform/spark-setup.md b/website/docs/docs/core/connect-data-platform/spark-setup.md
index 895f0559953..93595cea3f6 100644
--- a/website/docs/docs/core/connect-data-platform/spark-setup.md
+++ b/website/docs/docs/core/connect-data-platform/spark-setup.md
@@ -24,26 +24,10 @@ meta:
See [Databricks setup](#databricks-setup) for the Databricks version of this page.
:::
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package: {frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
+
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package} will also install dbt-core and any other dependencies.
If connecting to Databricks via ODBC driver, it requires `pyodbc`. Depending on your system, you can install it separately or via pip. See the [`pyodbc` wiki](https://github.com/mkleehammer/pyodbc/wiki/Install) for OS-specific installation details.
@@ -51,15 +35,15 @@ If connecting to a Spark cluster via the generic thrift or http methods, it requ
```zsh
# odbc connections
-$ pip install "dbt-spark[ODBC]"
+$ python -m pip install "dbt-spark[ODBC]"
# thrift or http connections
-$ pip install "dbt-spark[PyHive]"
+$ python -m pip install "dbt-spark[PyHive]"
```
```zsh
# session connections
-$ pip install "dbt-spark[session]"
+$ python -m pip install "dbt-spark[session]"
```
Configuring {frontMatter.meta.pypi_package}
@@ -70,7 +54,7 @@ $ pip install "dbt-spark[session]"
## Connection Methods
-dbt-spark can connect to Spark clusters by three different methods:
+dbt-spark can connect to Spark clusters by four different methods:
- [`odbc`](#odbc) is the preferred method when connecting to Databricks. It supports connecting to a SQL Endpoint or an all-purpose interactive cluster.
- [`thrift`](#thrift) connects directly to the lead node of a cluster, either locally hosted / on premise or in the cloud (e.g. Amazon EMR).
diff --git a/website/docs/docs/core/connect-data-platform/sqlite-setup.md b/website/docs/docs/core/connect-data-platform/sqlite-setup.md
index 3da902a6f80..20897ea90d7 100644
--- a/website/docs/docs/core/connect-data-platform/sqlite-setup.md
+++ b/website/docs/docs/core/connect-data-platform/sqlite-setup.md
@@ -22,34 +22,9 @@ Some core functionality may be limited. If you're interested in contributing, ch
:::
- Overview of {frontMatter.meta.pypi_package}
-
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package:
{frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package}
will also install dbt-core
and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
+
Starting with the release of dbt-core 1.0.0, versions of dbt-sqlite are aligned to the same major+minor [version](https://semver.org/) of dbt-core.
- versions 1.1.x of this adapter work with dbt-core 1.1.x
diff --git a/website/docs/docs/core/connect-data-platform/starrocks-setup.md b/website/docs/docs/core/connect-data-platform/starrocks-setup.md
index e5c1abac037..485e1d18fb7 100644
--- a/website/docs/docs/core/connect-data-platform/starrocks-setup.md
+++ b/website/docs/docs/core/connect-data-platform/starrocks-setup.md
@@ -34,7 +34,7 @@ meta:
pip is the easiest way to install the adapter:
-pip install {frontMatter.meta.pypi_package}
+python -m pip install {frontMatter.meta.pypi_package}
Installing {frontMatter.meta.pypi_package}
will also install dbt-core
and any other dependencies.
diff --git a/website/docs/docs/core/connect-data-platform/teradata-setup.md b/website/docs/docs/core/connect-data-platform/teradata-setup.md
index 1ba8e506b88..1a30a1a4a54 100644
--- a/website/docs/docs/core/connect-data-platform/teradata-setup.md
+++ b/website/docs/docs/core/connect-data-platform/teradata-setup.md
@@ -19,29 +19,12 @@ meta:
Some core functionality may be limited. If you're interested in contributing, check out the source code for the repository listed below.
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package:
{frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
+
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package}
will also install dbt-core
and any other dependencies.
-
- Python compatibility
+## Python compatibility
| Plugin version | Python 3.6 | Python 3.7 | Python 3.8 | Python 3.9 | Python 3.10 | Python 3.11 |
| -------------- | ----------- | ----------- | ----------- | ----------- | ----------- | ------------ |
@@ -56,18 +39,12 @@ pip is the easiest way to install the adapter:
|1.5.x | ❌ | ✅ | ✅ | ✅ | ✅ | ✅
|1.6.x | ❌ | ❌ | ✅ | ✅ | ✅ | ✅
- dbt dependent packages version compatibility
+## dbt dependent packages version compatibility
| dbt-teradata | dbt-core | dbt-teradata-util | dbt-util |
|--------------|------------|-------------------|----------------|
| 1.2.x | 1.2.x | 0.1.0 | 0.9.x or below |
-
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+| 1.6.7 | 1.6.7 | 1.1.1 | 1.1.1 |
### Connecting to Teradata
diff --git a/website/docs/docs/core/connect-data-platform/tidb-setup.md b/website/docs/docs/core/connect-data-platform/tidb-setup.md
index e2205c4665e..253497b37ba 100644
--- a/website/docs/docs/core/connect-data-platform/tidb-setup.md
+++ b/website/docs/docs/core/connect-data-platform/tidb-setup.md
@@ -24,34 +24,9 @@ If you're interested in contributing, check out the source code repository liste
:::
- Overview of {frontMatter.meta.pypi_package}
-
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package:
{frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package}
will also install dbt-core
and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
+
## Connecting to TiDB with **dbt-tidb**
diff --git a/website/docs/docs/core/connect-data-platform/trino-setup.md b/website/docs/docs/core/connect-data-platform/trino-setup.md
index 39d8ed8ab3f..a7dc658358f 100644
--- a/website/docs/docs/core/connect-data-platform/trino-setup.md
+++ b/website/docs/docs/core/connect-data-platform/trino-setup.md
@@ -18,38 +18,9 @@ meta:
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package:
{frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
-:::info Vendor-supported plugin
-
-Certain core functionality may vary. If you would like to report a bug, request a feature, or contribute, you can check out the linked repository and open an issue.
-
-:::
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter:
-
-pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package}
will also install dbt-core
and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.platform_name}-specifc configuration please refer to {frontMatter.meta.platform_name} Configuration
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}
+
## Connecting to Starburst/Trino
@@ -284,7 +255,7 @@ The only authentication parameter to set for OAuth 2.0 is `method: oauth`. If yo
For more information, refer to both [OAuth 2.0 authentication](https://trino.io/docs/current/security/oauth2.html) in the Trino docs and the [README](https://github.com/trinodb/trino-python-client#oauth2-authentication) for the Trino Python client.
-It's recommended that you install `keyring` to cache the OAuth 2.0 token over multiple dbt invocations by running `pip install 'trino[external-authentication-token-cache]'`. The `keyring` package is not installed by default.
+It's recommended that you install `keyring` to cache the OAuth 2.0 token over multiple dbt invocations by running `python -m pip install 'trino[external-authentication-token-cache]'`. The `keyring` package is not installed by default.
#### Example profiles.yml for OAuth
diff --git a/website/docs/docs/core/connect-data-platform/upsolver-setup.md b/website/docs/docs/core/connect-data-platform/upsolver-setup.md
index 6b2f410fc07..8e4203e0b0c 100644
--- a/website/docs/docs/core/connect-data-platform/upsolver-setup.md
+++ b/website/docs/docs/core/connect-data-platform/upsolver-setup.md
@@ -33,7 +33,7 @@ pagination_next: null
pip is the easiest way to install the adapter:
-pip install {frontMatter.meta.pypi_package}
+python -m pip install {frontMatter.meta.pypi_package}
Installing {frontMatter.meta.pypi_package}
will also install dbt-core
and any other dependencies.
diff --git a/website/docs/docs/core/connect-data-platform/vertica-setup.md b/website/docs/docs/core/connect-data-platform/vertica-setup.md
index fbb8de6b301..b1424289137 100644
--- a/website/docs/docs/core/connect-data-platform/vertica-setup.md
+++ b/website/docs/docs/core/connect-data-platform/vertica-setup.md
@@ -6,9 +6,9 @@ meta:
authors: 'Vertica (Former authors: Matthew Carter, Andy Regan, Andrew Hedengren)'
github_repo: 'vertica/dbt-vertica'
pypi_package: 'dbt-vertica'
- min_core_version: 'v1.4.0 and newer'
+ min_core_version: 'v1.6.0 and newer'
cloud_support: 'Not Supported'
- min_supported_version: 'Vertica 12.0.0'
+ min_supported_version: 'Vertica 23.4.0'
slack_channel_name: 'n/a'
slack_channel_link: 'https://www.getdbt.com/community/'
platform_name: 'Vertica'
@@ -21,31 +21,9 @@ If you're interested in contributing, check out the source code for each reposit
:::
- Overview of {frontMatter.meta.pypi_package}
+import SetUpPages from '/snippets/_setup-pages-intro.md';
-
- - Maintained by: {frontMatter.meta.maintained_by}
- - Authors: {frontMatter.meta.authors}
- - GitHub repo: {frontMatter.meta.github_repo}
- - PyPI package:
{frontMatter.meta.pypi_package}
- - Slack channel: {frontMatter.meta.slack_channel_name}
- - Supported dbt Core version: {frontMatter.meta.min_core_version} and newer
- - dbt Cloud support: {frontMatter.meta.cloud_support}
- - Minimum data platform version: {frontMatter.meta.min_supported_version}
-
-
-
- Installing {frontMatter.meta.pypi_package}
-
-pip is the easiest way to install the adapter: pip install {frontMatter.meta.pypi_package}
-
-Installing {frontMatter.meta.pypi_package}
will also install dbt-core
and any other dependencies.
-
- Configuring {frontMatter.meta.pypi_package}
-
-For {frontMatter.meta.pypi_package} specific configuration please refer to {frontMatter.meta.platform_name} Configuration.
-
-For further info, refer to the GitHub repository: {frontMatter.meta.github_repo}.
+
Connecting to {frontMatter.meta.platform_name} with {frontMatter.meta.pypi_package}
diff --git a/website/docs/docs/core/docker-install.md b/website/docs/docs/core/docker-install.md
index dfb2a669e34..6c1ec9da9e1 100644
--- a/website/docs/docs/core/docker-install.md
+++ b/website/docs/docs/core/docker-install.md
@@ -5,13 +5,13 @@ description: "You can use Docker to install dbt and adapter plugins from the com
dbt Core and all adapter plugins maintained by dbt Labs are available as [Docker](https://docs.docker.com/) images, and distributed via [GitHub Packages](https://docs.github.com/en/packages/learn-github-packages/introduction-to-github-packages) in a [public registry](https://github.com/dbt-labs/dbt-core/pkgs/container/dbt-core).
-Using a prebuilt Docker image to install dbt Core in production has a few benefits: it already includes dbt-core, one or more database adapters, and pinned versions of all their dependencies. By contrast, `pip install dbt-core dbt-` takes longer to run, and will always install the latest compatible versions of every dependency.
+Using a prebuilt Docker image to install dbt Core in production has a few benefits: it already includes dbt-core, one or more database adapters, and pinned versions of all their dependencies. By contrast, `python -m pip install dbt-core dbt-` takes longer to run, and will always install the latest compatible versions of every dependency.
You might also be able to use Docker to install and develop locally if you don't have a Python environment set up. Note that running dbt in this manner can be significantly slower if your operating system differs from the system that built the Docker image. If you're a frequent local developer, we recommend that you install dbt Core via [Homebrew](/docs/core/homebrew-install) or [pip](/docs/core/pip-install) instead.
### Prerequisites
* You've installed Docker. For more information, see the [Docker](https://docs.docker.com/) site.
-* You understand which database adapter(s) you need. For more information, see [About dbt adapters](/docs/core/installation#about-dbt-adapters).
+* You understand which database adapter(s) you need. For more information, see [About dbt adapters](/docs/core/installation-overview#about-dbt-data-platforms-and-adapters).
* You understand how dbt Core is versioned. For more information, see [About dbt Core versions](/docs/dbt-versions/core).
* You have a general understanding of dbt, the dbt workflow, and developing locally in the command line interface (CLI). For more information, see [About dbt](/docs/introduction#how-do-i-use-dbt).
diff --git a/website/docs/docs/core/installation-overview.md b/website/docs/docs/core/installation-overview.md
index cb1df26b0f8..8c139012667 100644
--- a/website/docs/docs/core/installation-overview.md
+++ b/website/docs/docs/core/installation-overview.md
@@ -1,25 +1,35 @@
---
-title: "About installing dbt"
-id: "installation"
+title: "About dbt Core and installation"
description: "You can install dbt Core using a few different tested methods."
pagination_next: "docs/core/homebrew-install"
pagination_prev: null
---
+[dbt Core](https://github.com/dbt-labs/dbt-core) is an open-source project that lets you develop from the command line and run your dbt project.
+
+To use dbt Core, your workflow generally looks like:
+
+1. **Build your dbt project in a code editor —** popular choices include VSCode and Atom.
+
+2. **Run your project from the command line —** macOS ships with a default Terminal program; however, you can also use iTerm or the command line prompt within a code editor to execute dbt commands.
+
+:::info How we set up our computers for working on dbt projects
+
+We've written a [guide](https://discourse.getdbt.com/t/how-we-set-up-our-computers-for-working-on-dbt-projects/243) for our recommended setup when running dbt projects using dbt Core.
+
+:::
+
+If you're using the command line, we recommend learning some basics of your terminal to help you work more effectively. In particular, it's important to understand `cd`, `ls` and `pwd` to be able to navigate through the directory structure of your computer easily.
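+For example, a quick session with these three commands might look like the following sketch (the project folder name is only an illustration):
+
+```shell
+# Print the absolute path of the directory you're currently in
+pwd
+
+# List the files and folders in that directory
+ls
+
+# Create a project folder (name is illustrative), move into it,
+# and confirm your new location
+mkdir -p my_dbt_project
+cd my_dbt_project
+pwd
+```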
+
+## Install dbt Core
+
You can install dbt Core on the command line by using one of these methods:
- [Use pip to install dbt](/docs/core/pip-install) (recommended)
- [Use Homebrew to install dbt](/docs/core/homebrew-install)
- [Use a Docker image to install dbt](/docs/core/docker-install)
- [Install dbt from source](/docs/core/source-install)
-
-:::tip Pro tip: Using the --help flag
-
-Most command-line tools, including dbt, have a `--help` flag that you can use to show available commands and arguments. For example, you can use the `--help` flag with dbt in two ways:
-— `dbt --help`: Lists the commands available for dbt
-— `dbt run --help`: Lists the flags available for the `run` command
-
-:::
+- You can also develop locally using the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation). The dbt Cloud CLI and dbt Core are both command line tools that let you run dbt commands. The key distinction is the dbt Cloud CLI is tailored for dbt Cloud's infrastructure and integrates with all its [features](/docs/cloud/about-cloud/dbt-cloud-features).
## Upgrading dbt Core
@@ -32,3 +42,11 @@ dbt provides a number of resources for understanding [general best practices](/b
## About dbt data platforms and adapters
dbt works with a number of different data platforms (databases, query engines, and other SQL-speaking technologies). It does this by using a dedicated _adapter_ for each. When you install dbt Core, you'll also want to install the specific adapter for your database. For more details, see [Supported Data Platforms](/docs/supported-data-platforms).
+
+:::tip Pro tip: Using the --help flag
+
+Most command-line tools, including dbt, have a `--help` flag that you can use to show available commands and arguments. For example, you can use the `--help` flag with dbt in two ways:
+- `dbt --help`: Lists the commands available for dbt
+- `dbt run --help`: Lists the flags available for the `run` command
+
+:::
diff --git a/website/docs/docs/core/pip-install.md b/website/docs/docs/core/pip-install.md
index 44fac00e493..e1a0e65312c 100644
--- a/website/docs/docs/core/pip-install.md
+++ b/website/docs/docs/core/pip-install.md
@@ -39,7 +39,7 @@ alias env_dbt='source /bin/activate'
Once you know [which adapter](/docs/supported-data-platforms) you're using, you can install it as `dbt-`. For example, if using Postgres:
```shell
-pip install dbt-postgres
+python -m pip install dbt-postgres
```
This will install `dbt-core` and `dbt-postgres` _only_:
@@ -62,7 +62,7 @@ All adapters build on top of `dbt-core`. Some also depend on other adapters: for
To upgrade a specific adapter plugin:
```shell
-pip install --upgrade dbt-
+python -m pip install --upgrade dbt-
```
### Install dbt-core only
@@ -70,7 +70,7 @@ pip install --upgrade dbt-
If you're building a tool that integrates with dbt Core, you may want to install the core library alone, without a database adapter. Note that you won't be able to use dbt as a CLI tool.
```shell
-pip install dbt-core
+python -m pip install dbt-core
```
### Change dbt Core versions
@@ -79,13 +79,13 @@ You can upgrade or downgrade versions of dbt Core by using the `--upgrade` optio
To upgrade dbt to the latest version:
```
-pip install --upgrade dbt-core
+python -m pip install --upgrade dbt-core
```
To downgrade to an older version, specify the version you want to use. This command can be useful when you're resolving package dependencies. As an example:
```
-pip install --upgrade dbt-core==0.19.0
+python -m pip install --upgrade dbt-core==0.19.0
```
### `pip install dbt`
@@ -95,7 +95,7 @@ Note that, as of v1.0.0, `pip install dbt` is no longer supported and will raise
If you have workflows or integrations that relied on installing the package named `dbt`, you can achieve the same behavior going forward by installing the same five packages that it used:
```shell
-pip install \
+python -m pip install \
dbt-core \
dbt-postgres \
dbt-redshift \
diff --git a/website/docs/docs/core/source-install.md b/website/docs/docs/core/source-install.md
index 42086159c03..d17adc13c53 100644
--- a/website/docs/docs/core/source-install.md
+++ b/website/docs/docs/core/source-install.md
@@ -17,10 +17,10 @@ To install `dbt-core` from the GitHub code source:
```shell
git clone https://github.com/dbt-labs/dbt-core.git
cd dbt-core
-pip install -r requirements.txt
+python -m pip install -r requirements.txt
```
-This will install `dbt-core` and `dbt-postgres`. To install in editable mode (includes your local changes as you make them), use `pip install -e editable-requirements.txt` instead.
+This will install `dbt-core` and `dbt-postgres`. To install in editable mode (includes your local changes as you make them), use `python -m pip install -r editable-requirements.txt` instead.
### Installing adapter plugins
@@ -29,12 +29,12 @@ To install an adapter plugin from source, you will need to first locate its sour
```shell
git clone https://github.com/dbt-labs/dbt-redshift.git
cd dbt-redshift
-pip install .
+python -m pip install .
```
You do _not_ need to install `dbt-core` before installing an adapter plugin -- the plugin includes `dbt-core` among its dependencies, and it will install the latest compatible version automatically.
-To install in editable mode, such as while contributing, use `pip install -e .` instead.
+To install in editable mode, such as while contributing, use `python -m pip install -e .` instead.
diff --git a/website/docs/docs/dbt-cloud-apis/project-state.md b/website/docs/docs/dbt-cloud-apis/project-state.md
index a5ee71ebb1b..62136b35463 100644
--- a/website/docs/docs/dbt-cloud-apis/project-state.md
+++ b/website/docs/docs/dbt-cloud-apis/project-state.md
@@ -66,7 +66,7 @@ Most Discovery API use cases will favor the _applied state_ since it pertains to
| Seed | Yes | Yes | Yes | Downstream | Applied & definition |
| Snapshot | Yes | Yes | Yes | Upstream & downstream | Applied & definition |
| Test | Yes | Yes | No | Upstream | Applied & definition |
-| Exposure | No | No | No | Upstream | Applied & definition |
+| Exposure | No | No | No | Upstream | Definition |
| Metric | No | No | No | Upstream & downstream | Definition |
| Semantic model | No | No | No | Upstream & downstream | Definition |
| Group | No | No | No | Downstream | Definition |
diff --git a/website/docs/docs/dbt-cloud-apis/service-tokens.md b/website/docs/docs/dbt-cloud-apis/service-tokens.md
index 9553f48a013..f1369711d2b 100644
--- a/website/docs/docs/dbt-cloud-apis/service-tokens.md
+++ b/website/docs/docs/dbt-cloud-apis/service-tokens.md
@@ -115,3 +115,5 @@ To rotate your token:
4. Copy the new token and replace the old one in your systems. Store it in a safe place, as it will not be available again once the creation screen is closed.
5. Delete the old token in dbt Cloud by clicking the **trash can icon**. _Only take this action after the new token is in place to avoid service disruptions_.
+## FAQs
+
diff --git a/website/docs/docs/dbt-cloud-apis/sl-graphql.md b/website/docs/docs/dbt-cloud-apis/sl-graphql.md
index f73007c9a02..b7d13d0d453 100644
--- a/website/docs/docs/dbt-cloud-apis/sl-graphql.md
+++ b/website/docs/docs/dbt-cloud-apis/sl-graphql.md
@@ -48,7 +48,7 @@ Authentication uses a dbt Cloud [service account tokens](/docs/dbt-cloud-apis/se
{"Authorization": "Bearer "}
```
-Each GQL request also requires a dbt Cloud `environmentId`. The API uses both the service token in the header and environmentId for authentication.
+Each GQL request also requires a dbt Cloud `environmentId`. The API uses both the service token in the header and `environmentId` for authentication.
### Metadata calls
@@ -150,6 +150,60 @@ metricsForDimensions(
): [Metric!]!
```
+**Metric Types**
+
+```graphql
+Metric {
+ name: String!
+ description: String
+ type: MetricType!
+ typeParams: MetricTypeParams!
+ filter: WhereFilter
+ dimensions: [Dimension!]!
+ queryableGranularities: [TimeGranularity!]!
+}
+```
+
+```
+MetricType = [SIMPLE, RATIO, CUMULATIVE, DERIVED]
+```
+
+**Metric Type parameters**
+
+```graphql
+MetricTypeParams {
+ measure: MetricInputMeasure
+ inputMeasures: [MetricInputMeasure!]!
+ numerator: MetricInput
+ denominator: MetricInput
+ expr: String
+ window: MetricTimeWindow
+ grainToDate: TimeGranularity
+ metrics: [MetricInput!]
+}
+```
+
+
+**Dimension Types**
+
+```graphql
+Dimension {
+ name: String!
+ description: String
+ type: DimensionType!
+ typeParams: DimensionTypeParams
+ isPartition: Boolean!
+ expr: String
+ queryableGranularities: [TimeGranularity!]!
+}
+```
+
+```
+DimensionType = [CATEGORICAL, TIME]
+```
+
+### Querying
+
**Create Dimension Values query**
```graphql
@@ -205,59 +259,128 @@ query(
): QueryResult!
```
-**Metric Types**
+The GraphQL API uses a polling process for querying since queries can be long-running in some cases. It works by first creating a query with a mutation, `createQuery`, which returns a query ID. This ID is then used to continuously check (poll) for the results and status of your query. The typical flow looks like this:
+1. Kick off a query
```graphql
-Metric {
- name: String!
- description: String
- type: MetricType!
- typeParams: MetricTypeParams!
- filter: WhereFilter
- dimensions: [Dimension!]!
- queryableGranularities: [TimeGranularity!]!
+mutation {
+ createQuery(
+ environmentId: 123456
+ metrics: [{name: "order_total"}]
+ groupBy: [{name: "metric_time"}]
+ ) {
+ queryId # => Returns 'QueryID_12345678'
+ }
}
```
-
-```
-MetricType = [SIMPLE, RATIO, CUMULATIVE, DERIVED]
+2. Poll for results
+```graphql
+{
+ query(environmentId: 123456, queryId: "QueryID_12345678") {
+ sql
+ status
+ error
+ totalPages
+ jsonResult
+ arrowResult
+ }
+}
```
+3. Repeat step 2 at an appropriate interval until the status is `FAILED` or `SUCCESSFUL`
+
+### Output format and pagination
+
+**Output format**
+
+By default, the output is in Arrow format. You can switch to JSON format using the following parameter. However, due to performance limitations, we recommend using the JSON parameter for testing and validation. The JSON received is a base64 encoded string. To access it, you can decode it using a base64 decoder. The JSON is created from pandas, which means you can change it back to a dataframe using `pandas.read_json(json, orient="table")`. Or you can work with the data directly using `json["data"]`, and find the table schema using `json["schema"]["fields"]`. Alternatively, you can pass `encoded:false` to the jsonResult field to get a raw JSON string directly.
-**Metric Type parameters**
```graphql
-MetricTypeParams {
- measure: MetricInputMeasure
- inputMeasures: [MetricInputMeasure!]!
- numerator: MetricInput
- denominator: MetricInput
- expr: String
- window: MetricTimeWindow
- grainToDate: TimeGranularity
- metrics: [MetricInput!]
+{
+ query(environmentId: BigInt!, queryId: Int!, pageNum: Int! = 1) {
+ sql
+ status
+ error
+ totalPages
+ arrowResult
+ jsonResult(orient: PandasJsonOrient! = TABLE, encoded: Boolean! = true)
+ }
}
```
+The results default to the `table` orient, but you can change it to any [pandas](https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.to_json.html)-supported value.
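+As a sketch of that decoding flow, using only the standard library (the payload below is a locally built stand-in for a real `jsonResult` response, not actual API output):
+
+```python
+import base64
+import json
+
+# A stand-in for gql_response.json()["data"]["query"]["jsonResult"]:
+# a base64-encoded, pandas "table"-orient JSON string.
+payload = {
+    "schema": {"fields": [{"name": "order_total", "type": "integer"}]},
+    "data": [{"order_total": 3}, {"order_total": 112}, {"order_total": 12}],
+}
+encoded = base64.b64encode(json.dumps(payload).encode()).decode()
+
+# Decode: base64 string -> JSON object
+result = json.loads(base64.b64decode(encoded))
+
+# Work with the data directly, as described above
+fields = [f["name"] for f in result["schema"]["fields"]]
+rows = [r["order_total"] for r in result["data"]]
+print(fields, rows)  # ['order_total'] [3, 112, 12]
+```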
-**Dimension Types**
+**Pagination**
-```graphql
-Dimension {
- name: String!
- description: String
- type: DimensionType!
- typeParams: DimensionTypeParams
- isPartition: Boolean!
- expr: String
- queryableGranularities: [TimeGranularity!]!
+By default, we return 1024 rows per page. If your result set exceeds this, you need to increase the page number using the `pageNum` option.
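+A minimal paging loop might look like the following sketch. The `fetch_page` helper is hypothetical: in real code it would POST the `query` call with `pageNum: page_num` and return the `query` object from the response; here it's stubbed with two pages.
+
+```python
+def fetch_page(query_id: str, page_num: int) -> dict:
+    """Hypothetical wrapper around one paged GraphQL `query` call (stubbed)."""
+    pages = {
+        1: {"totalPages": 2, "jsonResult": "page-1-data"},
+        2: {"totalPages": 2, "jsonResult": "page-2-data"},
+    }
+    return pages[page_num]
+
+# Fetch page 1 to learn totalPages, then walk the remaining pages.
+results = []
+page = 1
+total_pages = None
+while total_pages is None or page <= total_pages:
+    body = fetch_page("QueryID_12345678", page)
+    total_pages = body["totalPages"]
+    results.append(body["jsonResult"])
+    page += 1
+
+print(results)  # ['page-1-data', 'page-2-data']
+```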
+
+### Run a Python query
+
+The `arrowResult` in the GraphQL query response is a byte dump, which isn't visually useful. You can convert this byte data into an Arrow table using any Arrow-supported language. Refer to the following Python example explaining how to query and decode the arrow result:
+
+
+```python
+import base64
+import pyarrow as pa
+import requests
+import time
+
+headers = {"Authorization":"Bearer "}
+query_result_request = """
+{
+ query(environmentId: 70, queryId: "12345678") {
+ sql
+ status
+ error
+ arrowResult
+ }
}
-```
+"""
-```
-DimensionType = [CATEGORICAL, TIME]
+while True:
+ gql_response = requests.post(
+ "https://semantic-layer.cloud.getdbt.com/api/graphql",
+ json={"query": query_result_request},
+ headers=headers,
+ )
+    if gql_response.json()["data"]["query"]["status"] in ["FAILED", "SUCCESSFUL"]:
+ break
+ # Set an appropriate interval between polling requests
+ time.sleep(1)
+
+"""
+gql_response.json() =>
+{
+ "data": {
+ "query": {
+ "sql": "SELECT\n ordered_at AS metric_time__day\n , SUM(order_total) AS order_total\nFROM semantic_layer.orders orders_src_1\nGROUP BY\n ordered_at",
+ "status": "SUCCESSFUL",
+ "error": null,
+ "arrowResult": "arrow-byte-data"
+ }
+ }
+}
+"""
+
+def to_arrow_table(byte_string: str) -> pa.Table:
+ """Get a raw base64 string and convert to an Arrow Table."""
+    with pa.ipc.open_stream(base64.b64decode(byte_string)) as reader:
+ return pa.Table.from_batches(reader, reader.schema)
+
+
+arrow_table = to_arrow_table(gql_response.json()["data"]["query"]["arrowResult"])
+
+# Perform whatever functionality is available, like convert to a pandas table.
+print(arrow_table.to_pandas())
+"""
+order_total ordered_at
+ 3 2023-08-07
+ 112 2023-08-08
+ 12 2023-08-09
+ 5123 2023-08-10
+"""
```
-### Create Query examples
+### Additional Create Query examples
The following section provides query examples for the GraphQL API, such as how to query metrics, dimensions, where filters, and more.
@@ -359,7 +482,7 @@ mutation {
}
```
-**Query with Explain**
+**Query that compiles SQL without running it**
This takes the same inputs as the `createQuery` mutation.
@@ -374,89 +497,3 @@ mutation {
}
}
```
-
-### Output format and pagination
-
-**Output format**
-
-By default, the output is in Arrow format. You can switch to JSON format using the following parameter. However, due to performance limitations, we recommend using the JSON parameter for testing and validation. The JSON received is a base64 encoded string. To access it, you can decode it using a base64 decoder. The JSON is created from pandas, which means you can change it back to a dataframe using `pandas.read_json(json, orient="table")`. Or you can work with the data directly using `json["data"]`, and find the table schema using `json["schema"]["fields"]`. Alternatively, you can pass `encoded:false` to the jsonResult field to get a raw JSON string directly.
-
-
-```graphql
-{
- query(environmentId: BigInt!, queryId: Int!, pageNum: Int! = 1) {
- sql
- status
- error
- totalPages
- arrowResult
- jsonResult(orient: PandasJsonOrient! = TABLE, encoded: Boolean! = true)
- }
-}
-```
-
-The results default to the table but you can change it to any [pandas](https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.to_json.html) supported value.
-
-**Pagination**
-
-By default, we return 1024 rows per page. If your result set exceeds this, you need to increase the page number using the `pageNum` option.
-
-### Run a Python query
-
-The `arrowResult` in the GraphQL query response is a byte dump, which isn't visually useful. You can convert this byte data into an Arrow table using any Arrow-supported language. Refer to the following Python example explaining how to query and decode the arrow result:
-
-
-```python
-import base64
-import pyarrow as pa
-
-headers = {"Authorization":"Bearer "}
-query_result_request = """
-{
- query(environmentId: 70, queryId: "12345678") {
- sql
- status
- error
- arrowResult
- }
-}
-"""
-
-gql_response = requests.post(
- "https://semantic-layer.cloud.getdbt.com/api/graphql",
- json={"query": query_result_request},
- headers=headers,
-)
-
-"""
-gql_response.json() =>
-{
- "data": {
- "query": {
- "sql": "SELECT\n ordered_at AS metric_time__day\n , SUM(order_total) AS order_total\nFROM semantic_layer.orders orders_src_1\nGROUP BY\n ordered_at",
- "status": "SUCCESSFUL",
- "error": null,
- "arrowResult": "arrow-byte-data"
- }
- }
-}
-"""
-
-def to_arrow_table(byte_string: str) -> pa.Table:
- """Get a raw base64 string and convert to an Arrow Table."""
-    with pa.ipc.open_stream(base64.b64decode(byte_string)) as reader:
- return pa.Table.from_batches(reader, reader.schema)
-
-
-arrow_table = to_arrow_table(gql_response.json()["data"]["query"]["arrowResult"])
-
-# Perform whatever functionality is available, like convert to a pandas table.
-print(arrow_table.to_pandas())
-"""
-order_total ordered_at
- 3 2023-08-07
- 112 2023-08-08
- 12 2023-08-09
- 5123 2023-08-10
-"""
-```
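The JSON-format alternative described earlier can be decoded similarly. A minimal sketch with a fabricated payload (in practice, the encoded string comes from the `jsonResult` field of the API response):

```python
import base64
import io

import pandas as pd

# Fabricated sample standing in for the API's base64-encoded jsonResult
# (encoded: true, orient: TABLE).
sample = pd.DataFrame({"metric_time__day": ["2023-08-07"], "order_total": [3]})
encoded_json_result = base64.b64encode(sample.to_json(orient="table").encode()).decode()

# Decode the base64 string, then rebuild the dataframe with the matching orient.
decoded = base64.b64decode(encoded_json_result).decode()
df = pd.read_json(io.StringIO(decoded), orient="table")
print(df)
```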
diff --git a/website/docs/docs/dbt-cloud-apis/sl-jdbc.md b/website/docs/docs/dbt-cloud-apis/sl-jdbc.md
index 931666dd10c..aba309566f8 100644
--- a/website/docs/docs/dbt-cloud-apis/sl-jdbc.md
+++ b/website/docs/docs/dbt-cloud-apis/sl-jdbc.md
@@ -352,6 +352,8 @@ semantic_layer.query(metrics=['food_order_amount', 'order_gross_profit'],
## FAQs
+
+
- **Why do some dimensions use different syntax, like `metric_time` versus `Dimension('metric_time')`?**
When you select a dimension on its own, such as `metric_time`, you can use the shorthand method, which doesn't need the “Dimension” syntax. However, when you perform operations on the dimension, such as adding granularity, the object syntax `Dimension('metric_time')` is required.
diff --git a/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md b/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md
index 9ebd3c64cf3..af098860e6f 100644
--- a/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md
+++ b/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md
@@ -12,7 +12,7 @@ import UpgradeMove from '/snippets/_upgrade-move.md';
## Resources
- [Changelog](https://github.com/dbt-labs/dbt-core/blob/8aaed0e29f9560bc53d9d3e88325a9597318e375/CHANGELOG.md)
-- [CLI Installation guide](/docs/core/installation)
+- [CLI Installation guide](/docs/core/installation-overview)
- [Cloud upgrade guide](/docs/dbt-versions/upgrade-core-in-cloud)
- [Release schedule](https://github.com/dbt-labs/dbt-core/issues/8260)
@@ -32,6 +32,8 @@ This is a relatively small behavior change, but worth calling out in case you no
- Don't add a `freshness:` block.
- Explicitly set `freshness: null`
+Beginning with v1.7, running [`dbt deps`](/reference/commands/deps) creates or updates the `package-lock.yml` file in the _project_root_ where `packages.yml` is recorded. The `package-lock.yml` file contains a record of all packages installed and, if subsequent `dbt deps` runs contain no updated packages in `dependencies.yml` or `packages.yml`, dbt-core installs from `package-lock.yml`.
+
## New and changed features and functionality
- [`dbt docs generate`](/reference/commands/cmd-docs) now supports `--select` to generate [catalog metadata](/reference/artifacts/catalog-json) for a subset of your project. Currently available for Snowflake and Postgres only, but other adapters are coming soon.
diff --git a/website/docs/docs/dbt-versions/core-upgrade/01-upgrading-to-v1.6.md b/website/docs/docs/dbt-versions/core-upgrade/01-upgrading-to-v1.6.md
index d36cc544814..33a038baa9b 100644
--- a/website/docs/docs/dbt-versions/core-upgrade/01-upgrading-to-v1.6.md
+++ b/website/docs/docs/dbt-versions/core-upgrade/01-upgrading-to-v1.6.md
@@ -17,7 +17,7 @@ dbt Core v1.6 has three significant areas of focus:
## Resources
- [Changelog](https://github.com/dbt-labs/dbt-core/blob/1.6.latest/CHANGELOG.md)
-- [CLI Installation guide](/docs/core/installation)
+- [dbt Core installation guide](/docs/core/installation-overview)
- [Cloud upgrade guide](/docs/dbt-versions/upgrade-core-in-cloud)
- [Release schedule](https://github.com/dbt-labs/dbt-core/issues/7481)
diff --git a/website/docs/docs/dbt-versions/core-upgrade/02-upgrading-to-v1.5.md b/website/docs/docs/dbt-versions/core-upgrade/02-upgrading-to-v1.5.md
index dded8a690fe..e739caa477a 100644
--- a/website/docs/docs/dbt-versions/core-upgrade/02-upgrading-to-v1.5.md
+++ b/website/docs/docs/dbt-versions/core-upgrade/02-upgrading-to-v1.5.md
@@ -16,7 +16,7 @@ dbt Core v1.5 is a feature release, with two significant additions:
## Resources
- [Changelog](https://github.com/dbt-labs/dbt-core/blob/1.5.latest/CHANGELOG.md)
-- [CLI Installation guide](/docs/core/installation)
+- [CLI Installation guide](/docs/core/installation-overview)
- [Cloud upgrade guide](/docs/dbt-versions/upgrade-core-in-cloud)
- [Release schedule](https://github.com/dbt-labs/dbt-core/issues/6715)
diff --git a/website/docs/docs/dbt-versions/core-upgrade/04-upgrading-to-v1.4.md b/website/docs/docs/dbt-versions/core-upgrade/04-upgrading-to-v1.4.md
index 6c6d96b2326..a946bdf369b 100644
--- a/website/docs/docs/dbt-versions/core-upgrade/04-upgrading-to-v1.4.md
+++ b/website/docs/docs/dbt-versions/core-upgrade/04-upgrading-to-v1.4.md
@@ -12,7 +12,7 @@ import UpgradeMove from '/snippets/_upgrade-move.md';
### Resources
- [Changelog](https://github.com/dbt-labs/dbt-core/blob/1.4.latest/CHANGELOG.md)
-- [CLI Installation guide](/docs/core/installation)
+- [CLI Installation guide](/docs/core/installation-overview)
- [Cloud upgrade guide](/docs/dbt-versions/upgrade-core-in-cloud)
**Final release:** January 25, 2023
diff --git a/website/docs/docs/dbt-versions/core-upgrade/05-upgrading-to-v1.3.md b/website/docs/docs/dbt-versions/core-upgrade/05-upgrading-to-v1.3.md
index f66d9bb9706..d9d97f17dc5 100644
--- a/website/docs/docs/dbt-versions/core-upgrade/05-upgrading-to-v1.3.md
+++ b/website/docs/docs/dbt-versions/core-upgrade/05-upgrading-to-v1.3.md
@@ -12,7 +12,7 @@ import UpgradeMove from '/snippets/_upgrade-move.md';
### Resources
- [Changelog](https://github.com/dbt-labs/dbt-core/blob/1.3.latest/CHANGELOG.md)
-- [CLI Installation guide](/docs/core/installation)
+- [CLI Installation guide](/docs/core/installation-overview)
- [Cloud upgrade guide](/docs/dbt-versions/upgrade-core-in-cloud)
## What to know before upgrading
diff --git a/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.2.md b/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.2.md
index 16825ff4e2b..72a3e0c82ad 100644
--- a/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.2.md
+++ b/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.2.md
@@ -12,7 +12,7 @@ import UpgradeMove from '/snippets/_upgrade-move.md';
### Resources
- [Changelog](https://github.com/dbt-labs/dbt-core/blob/1.2.latest/CHANGELOG.md)
-- [CLI Installation guide](/docs/core/installation)
+- [CLI Installation guide](/docs/core/installation-overview)
- [Cloud upgrade guide](/docs/dbt-versions/upgrade-core-in-cloud)
## What to know before upgrading
diff --git a/website/docs/docs/dbt-versions/core-upgrade/07-upgrading-to-v1.1.md b/website/docs/docs/dbt-versions/core-upgrade/07-upgrading-to-v1.1.md
index 403264a46e6..12f0f42354a 100644
--- a/website/docs/docs/dbt-versions/core-upgrade/07-upgrading-to-v1.1.md
+++ b/website/docs/docs/dbt-versions/core-upgrade/07-upgrading-to-v1.1.md
@@ -12,7 +12,7 @@ import UpgradeMove from '/snippets/_upgrade-move.md';
### Resources
- [Changelog](https://github.com/dbt-labs/dbt-core/blob/1.1.latest/CHANGELOG.md)
-- [CLI Installation guide](/docs/core/installation)
+- [CLI Installation guide](/docs/core/installation-overview)
- [Cloud upgrade guide](/docs/dbt-versions/upgrade-core-in-cloud)
## What to know before upgrading
diff --git a/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md b/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md
index 3f45e44076c..6e437638ef6 100644
--- a/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md
+++ b/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md
@@ -13,7 +13,7 @@ import UpgradeMove from '/snippets/_upgrade-move.md';
- [Discourse](https://discourse.getdbt.com/t/3180)
- [Changelog](https://github.com/dbt-labs/dbt-core/blob/1.0.latest/CHANGELOG.md)
-- [CLI Installation guide](/docs/core/installation)
+- [CLI Installation guide](/docs/core/installation-overview)
- [Cloud upgrade guide](/docs/dbt-versions/upgrade-core-in-cloud)
## What to know before upgrading
@@ -45,7 +45,7 @@ Global project macros have been reorganized, and some old unused macros have bee
### Installation
- [Installation docs](/docs/supported-data-platforms) reflects adapter-specific installations
-- `pip install dbt` is no longer supported, and will raise an explicit error. Install the specific adapter plugin you need as `pip install dbt-`.
+- `python -m pip install dbt` is no longer supported, and will raise an explicit error. Install the specific adapter plugin you need as `python -m pip install dbt-`.
- `brew install dbt` is no longer supported. Install the specific adapter plugin you need (among Postgres, Redshift, Snowflake, or BigQuery) as `brew install dbt-`.
- Removed official support for python 3.6, which is reaching end of life on December 23, 2021
diff --git a/website/docs/docs/dbt-versions/core-versions.md b/website/docs/docs/dbt-versions/core-versions.md
index 2467f3c946b..3ebf988c136 100644
--- a/website/docs/docs/dbt-versions/core-versions.md
+++ b/website/docs/docs/dbt-versions/core-versions.md
@@ -18,7 +18,7 @@ dbt Labs provides different support levels for different versions, which may inc
### Further reading
- To learn how you can use dbt Core versions in dbt Cloud, see [Choosing a dbt Core version](/docs/dbt-versions/upgrade-core-in-cloud).
-- To learn about installing dbt Core, see "[How to install dbt Core](/docs/core/installation)."
+- To learn about installing dbt Core, see "[How to install dbt Core](/docs/core/installation-overview)."
- To restrict your project to only work with a range of dbt Core versions, or use the currently running dbt Core version, see [`require-dbt-version`](/reference/project-configs/require-dbt-version) and [`dbt_version`](/reference/dbt-jinja-functions/dbt_version).
## Version support prior to v1.0
@@ -29,7 +29,7 @@ All dbt Core versions released prior to 1.0 and their version-specific documenta
All dbt Core minor versions that have reached end-of-life (EOL) will have no new patch releases. This means they will no longer receive any fixes, including for known bugs that have been identified. Fixes for those bugs will instead be made in newer minor versions that are still under active support.
-We recommend upgrading to a newer version in [dbt Cloud](/docs/dbt-versions/upgrade-core-in-cloud) or [dbt Core](/docs/core/installation#upgrading-dbt-core) to continue receiving support.
+We recommend upgrading to a newer version in [dbt Cloud](/docs/dbt-versions/upgrade-core-in-cloud) or [dbt Core](/docs/core/installation-overview#upgrading-dbt-core) to continue receiving support.
All dbt Core v1.0 and later are available in dbt Cloud until further notice. In the future, we intend to align dbt Cloud availability with dbt Core ongoing support. You will receive plenty of advance notice before any changes take place.
@@ -56,7 +56,7 @@ After a minor version reaches the end of its critical support period, one year a
### Future versions
-We aim to release a new minor "feature" every 3 months. _This is an indicative timeline ONLY._ For the latest information about upcoming releases, including their planned release dates and which features and fixes might be included in each, always consult the [`dbt-core` repository milestones](https://github.com/dbt-labs/dbt-core/milestones).
+For the latest information about upcoming releases, including planned release dates and which features and fixes might be included, consult the [`dbt-core` repository milestones](https://github.com/dbt-labs/dbt-core/milestones) and [product roadmaps](https://github.com/dbt-labs/dbt-core/tree/main/docs/roadmap).
## Best practices for upgrading
diff --git a/website/docs/docs/dbt-versions/release-notes/75-Nov-2023/explorer-updates-rn.md b/website/docs/docs/dbt-versions/release-notes/75-Nov-2023/explorer-updates-rn.md
new file mode 100644
index 00000000000..8b829311d81
--- /dev/null
+++ b/website/docs/docs/dbt-versions/release-notes/75-Nov-2023/explorer-updates-rn.md
@@ -0,0 +1,33 @@
+---
+title: "Enhancement: New features and UI changes to dbt Explorer"
+description: "November 2023: New features and UI changes to dbt Explorer, including a new filter panel, improved lineage graph, and detailed resource information."
+sidebar_label: "Enhancement: New features and UI changes to dbt Explorer"
+sidebar_position: 08
+tags: [Nov-2023]
+date: 2023-11-28
+---
+
+dbt Labs is excited to announce the latest features and UI updates to dbt Explorer!
+
+For more details, refer to [Explore your dbt projects](/docs/collaborate/explore-projects).
+
+## The project's lineage graph
+
+- The search bar in the full lineage graph is now more prominent.
+- It's easier to navigate across projects using the breadcrumbs.
+- The new context menu (right click) makes it easier to focus on a node or to view its lineage.
+
+
+
+## Search improvements
+
+- When searching with keywords, a new side panel UI helps you filter search results by resource type, tag, column, and other key properties (instead of manually defining selectors).
+- Search result logic is clearly explained. For instance, indicating whether a resource contains a column name (exact match only).
+
+
+
+## Resource details
+- Model test result statuses are now displayed on the model details page.
+- Column names can now be searched within the list.
+
+
\ No newline at end of file
diff --git a/website/docs/docs/dbt-versions/release-notes/02-Nov-2023/job-notifications-rn.md b/website/docs/docs/dbt-versions/release-notes/75-Nov-2023/job-notifications-rn.md
similarity index 98%
rename from website/docs/docs/dbt-versions/release-notes/02-Nov-2023/job-notifications-rn.md
rename to website/docs/docs/dbt-versions/release-notes/75-Nov-2023/job-notifications-rn.md
index 660129513d7..02fe2e037df 100644
--- a/website/docs/docs/dbt-versions/release-notes/02-Nov-2023/job-notifications-rn.md
+++ b/website/docs/docs/dbt-versions/release-notes/75-Nov-2023/job-notifications-rn.md
@@ -4,6 +4,7 @@ description: "November 2023: New quality-of-life improvements for setting up and
sidebar_label: "Enhancement: Job notifications"
sidebar_position: 10
tags: [Nov-2023]
+date: 2023-11-28
---
There are new quality-of-life improvements in dbt Cloud for email and Slack notifications about your jobs:
diff --git a/website/docs/docs/dbt-versions/release-notes/02-Nov-2023/microsoft-fabric-support-rn.md b/website/docs/docs/dbt-versions/release-notes/75-Nov-2023/microsoft-fabric-support-rn.md
similarity index 65%
rename from website/docs/docs/dbt-versions/release-notes/02-Nov-2023/microsoft-fabric-support-rn.md
rename to website/docs/docs/dbt-versions/release-notes/75-Nov-2023/microsoft-fabric-support-rn.md
index 13aefa80ffc..b416817f3a0 100644
--- a/website/docs/docs/dbt-versions/release-notes/02-Nov-2023/microsoft-fabric-support-rn.md
+++ b/website/docs/docs/dbt-versions/release-notes/75-Nov-2023/microsoft-fabric-support-rn.md
@@ -4,11 +4,14 @@ description: "November 2023: Public Preview now available for Microsoft Fabric i
sidebar_label: "New: Public Preview of Microsoft Fabric support"
sidebar_position: 09
tags: [Nov-2023]
+date: 2023-11-28
---
Public Preview is now available in dbt Cloud for Microsoft Fabric!
-To learn more, check out the [Quickstart for dbt Cloud and Microsoft Fabric](/guides/microsoft-fabric?step=1). The guide walks you through:
+To learn more, refer to [Connect Microsoft Fabric](/docs/cloud/connect-data-platform/connect-microsoft-fabric) and [Microsoft Fabric DWH configurations](/reference/resource-configs/fabric-configs).
+
+Also, check out the [Quickstart for dbt Cloud and Microsoft Fabric](/guides/microsoft-fabric?step=1). The guide walks you through:
- Loading the Jaffle Shop sample data (provided by dbt Labs) into your Microsoft Fabric warehouse.
- Connecting dbt Cloud to Microsoft Fabric.
diff --git a/website/docs/docs/dbt-versions/release-notes/75-Nov-2023/repo-caching.md b/website/docs/docs/dbt-versions/release-notes/75-Nov-2023/repo-caching.md
new file mode 100644
index 00000000000..7c35991e961
--- /dev/null
+++ b/website/docs/docs/dbt-versions/release-notes/75-Nov-2023/repo-caching.md
@@ -0,0 +1,14 @@
+---
+title: "New: Support for Git repository caching"
+description: "November 2023: dbt Cloud can cache your project's code (as well as other dbt packages) to ensure runs can begin despite an upstream Git provider's outage."
+sidebar_label: "New: Support for Git repository caching"
+sidebar_position: 07
+tags: [Nov-2023]
+date: 2023-11-29
+---
+
+Now available for dbt Cloud Enterprise plans is a new option to enable Git repository caching for your job runs. When enabled, dbt Cloud caches your dbt project's Git repository and uses the cached copy instead if there's an outage with the Git provider. This feature improves the reliability and stability of your job runs.
+
+To learn more, refer to [Repo caching](/docs/deploy/deploy-environments#git-repository-caching).
+
+
\ No newline at end of file
diff --git a/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/api-v2v3-limit.md b/website/docs/docs/dbt-versions/release-notes/76-Oct-2023/api-v2v3-limit.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/03-Oct-2023/api-v2v3-limit.md
rename to website/docs/docs/dbt-versions/release-notes/76-Oct-2023/api-v2v3-limit.md
diff --git a/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/cloud-cli-pp.md b/website/docs/docs/dbt-versions/release-notes/76-Oct-2023/cloud-cli-pp.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/03-Oct-2023/cloud-cli-pp.md
rename to website/docs/docs/dbt-versions/release-notes/76-Oct-2023/cloud-cli-pp.md
diff --git a/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/custom-branch-fix-rn.md b/website/docs/docs/dbt-versions/release-notes/76-Oct-2023/custom-branch-fix-rn.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/03-Oct-2023/custom-branch-fix-rn.md
rename to website/docs/docs/dbt-versions/release-notes/76-Oct-2023/custom-branch-fix-rn.md
diff --git a/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/dbt-deps-auto-install.md b/website/docs/docs/dbt-versions/release-notes/76-Oct-2023/dbt-deps-auto-install.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/03-Oct-2023/dbt-deps-auto-install.md
rename to website/docs/docs/dbt-versions/release-notes/76-Oct-2023/dbt-deps-auto-install.md
diff --git a/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/explorer-public-preview-rn.md b/website/docs/docs/dbt-versions/release-notes/76-Oct-2023/explorer-public-preview-rn.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/03-Oct-2023/explorer-public-preview-rn.md
rename to website/docs/docs/dbt-versions/release-notes/76-Oct-2023/explorer-public-preview-rn.md
diff --git a/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/native-retry-support-rn.md b/website/docs/docs/dbt-versions/release-notes/76-Oct-2023/native-retry-support-rn.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/03-Oct-2023/native-retry-support-rn.md
rename to website/docs/docs/dbt-versions/release-notes/76-Oct-2023/native-retry-support-rn.md
diff --git a/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/product-docs-sept-rn.md b/website/docs/docs/dbt-versions/release-notes/76-Oct-2023/product-docs-sept-rn.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/03-Oct-2023/product-docs-sept-rn.md
rename to website/docs/docs/dbt-versions/release-notes/76-Oct-2023/product-docs-sept-rn.md
diff --git a/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/sl-ga.md b/website/docs/docs/dbt-versions/release-notes/76-Oct-2023/sl-ga.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/03-Oct-2023/sl-ga.md
rename to website/docs/docs/dbt-versions/release-notes/76-Oct-2023/sl-ga.md
diff --git a/website/docs/docs/dbt-versions/release-notes/04-Sept-2023/ci-updates-phase2-rn.md b/website/docs/docs/dbt-versions/release-notes/77-Sept-2023/ci-updates-phase2-rn.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/04-Sept-2023/ci-updates-phase2-rn.md
rename to website/docs/docs/dbt-versions/release-notes/77-Sept-2023/ci-updates-phase2-rn.md
diff --git a/website/docs/docs/dbt-versions/release-notes/04-Sept-2023/ci-updates-phase3-rn.md b/website/docs/docs/dbt-versions/release-notes/77-Sept-2023/ci-updates-phase3-rn.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/04-Sept-2023/ci-updates-phase3-rn.md
rename to website/docs/docs/dbt-versions/release-notes/77-Sept-2023/ci-updates-phase3-rn.md
diff --git a/website/docs/docs/dbt-versions/release-notes/04-Sept-2023/product-docs-summer-rn.md b/website/docs/docs/dbt-versions/release-notes/77-Sept-2023/product-docs-summer-rn.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/04-Sept-2023/product-docs-summer-rn.md
rename to website/docs/docs/dbt-versions/release-notes/77-Sept-2023/product-docs-summer-rn.md
diff --git a/website/docs/docs/dbt-versions/release-notes/04-Sept-2023/removing-prerelease-versions.md b/website/docs/docs/dbt-versions/release-notes/77-Sept-2023/removing-prerelease-versions.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/04-Sept-2023/removing-prerelease-versions.md
rename to website/docs/docs/dbt-versions/release-notes/77-Sept-2023/removing-prerelease-versions.md
diff --git a/website/docs/docs/dbt-versions/release-notes/05-Aug-2023/deprecation-endpoints-discovery.md b/website/docs/docs/dbt-versions/release-notes/78-Aug-2023/deprecation-endpoints-discovery.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/05-Aug-2023/deprecation-endpoints-discovery.md
rename to website/docs/docs/dbt-versions/release-notes/78-Aug-2023/deprecation-endpoints-discovery.md
diff --git a/website/docs/docs/dbt-versions/release-notes/05-Aug-2023/ide-v1.2.md b/website/docs/docs/dbt-versions/release-notes/78-Aug-2023/ide-v1.2.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/05-Aug-2023/ide-v1.2.md
rename to website/docs/docs/dbt-versions/release-notes/78-Aug-2023/ide-v1.2.md
diff --git a/website/docs/docs/dbt-versions/release-notes/05-Aug-2023/sl-revamp-beta.md b/website/docs/docs/dbt-versions/release-notes/78-Aug-2023/sl-revamp-beta.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/05-Aug-2023/sl-revamp-beta.md
rename to website/docs/docs/dbt-versions/release-notes/78-Aug-2023/sl-revamp-beta.md
diff --git a/website/docs/docs/dbt-versions/release-notes/06-July-2023/faster-run.md b/website/docs/docs/dbt-versions/release-notes/79-July-2023/faster-run.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/06-July-2023/faster-run.md
rename to website/docs/docs/dbt-versions/release-notes/79-July-2023/faster-run.md
diff --git a/website/docs/docs/dbt-versions/release-notes/07-June-2023/admin-api-rn.md b/website/docs/docs/dbt-versions/release-notes/80-June-2023/admin-api-rn.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/07-June-2023/admin-api-rn.md
rename to website/docs/docs/dbt-versions/release-notes/80-June-2023/admin-api-rn.md
diff --git a/website/docs/docs/dbt-versions/release-notes/07-June-2023/ci-updates-phase1-rn.md b/website/docs/docs/dbt-versions/release-notes/80-June-2023/ci-updates-phase1-rn.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/07-June-2023/ci-updates-phase1-rn.md
rename to website/docs/docs/dbt-versions/release-notes/80-June-2023/ci-updates-phase1-rn.md
diff --git a/website/docs/docs/dbt-versions/release-notes/07-June-2023/lint-format-rn.md b/website/docs/docs/dbt-versions/release-notes/80-June-2023/lint-format-rn.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/07-June-2023/lint-format-rn.md
rename to website/docs/docs/dbt-versions/release-notes/80-June-2023/lint-format-rn.md
diff --git a/website/docs/docs/dbt-versions/release-notes/07-June-2023/product-docs-jun.md b/website/docs/docs/dbt-versions/release-notes/80-June-2023/product-docs-jun.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/07-June-2023/product-docs-jun.md
rename to website/docs/docs/dbt-versions/release-notes/80-June-2023/product-docs-jun.md
diff --git a/website/docs/docs/dbt-versions/release-notes/08-May-2023/discovery-api-public-preview.md b/website/docs/docs/dbt-versions/release-notes/81-May-2023/discovery-api-public-preview.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/08-May-2023/discovery-api-public-preview.md
rename to website/docs/docs/dbt-versions/release-notes/81-May-2023/discovery-api-public-preview.md
diff --git a/website/docs/docs/dbt-versions/release-notes/08-May-2023/may-ide-updates.md b/website/docs/docs/dbt-versions/release-notes/81-May-2023/may-ide-updates.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/08-May-2023/may-ide-updates.md
rename to website/docs/docs/dbt-versions/release-notes/81-May-2023/may-ide-updates.md
diff --git a/website/docs/docs/dbt-versions/release-notes/08-May-2023/product-docs-may.md b/website/docs/docs/dbt-versions/release-notes/81-May-2023/product-docs-may.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/08-May-2023/product-docs-may.md
rename to website/docs/docs/dbt-versions/release-notes/81-May-2023/product-docs-may.md
diff --git a/website/docs/docs/dbt-versions/release-notes/08-May-2023/run-details-and-logs-improvements.md b/website/docs/docs/dbt-versions/release-notes/81-May-2023/run-details-and-logs-improvements.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/08-May-2023/run-details-and-logs-improvements.md
rename to website/docs/docs/dbt-versions/release-notes/81-May-2023/run-details-and-logs-improvements.md
diff --git a/website/docs/docs/dbt-versions/release-notes/08-May-2023/run-history-endpoint.md b/website/docs/docs/dbt-versions/release-notes/81-May-2023/run-history-endpoint.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/08-May-2023/run-history-endpoint.md
rename to website/docs/docs/dbt-versions/release-notes/81-May-2023/run-history-endpoint.md
diff --git a/website/docs/docs/dbt-versions/release-notes/08-May-2023/run-history-improvements.md b/website/docs/docs/dbt-versions/release-notes/81-May-2023/run-history-improvements.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/08-May-2023/run-history-improvements.md
rename to website/docs/docs/dbt-versions/release-notes/81-May-2023/run-history-improvements.md
diff --git a/website/docs/docs/dbt-versions/release-notes/09-April-2023/api-endpoint-restriction.md b/website/docs/docs/dbt-versions/release-notes/82-April-2023/api-endpoint-restriction.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/09-April-2023/api-endpoint-restriction.md
rename to website/docs/docs/dbt-versions/release-notes/82-April-2023/api-endpoint-restriction.md
diff --git a/website/docs/docs/dbt-versions/release-notes/09-April-2023/apr-ide-updates.md b/website/docs/docs/dbt-versions/release-notes/82-April-2023/apr-ide-updates.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/09-April-2023/apr-ide-updates.md
rename to website/docs/docs/dbt-versions/release-notes/82-April-2023/apr-ide-updates.md
diff --git a/website/docs/docs/dbt-versions/release-notes/09-April-2023/product-docs.md b/website/docs/docs/dbt-versions/release-notes/82-April-2023/product-docs.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/09-April-2023/product-docs.md
rename to website/docs/docs/dbt-versions/release-notes/82-April-2023/product-docs.md
diff --git a/website/docs/docs/dbt-versions/release-notes/09-April-2023/scheduler-optimized.md b/website/docs/docs/dbt-versions/release-notes/82-April-2023/scheduler-optimized.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/09-April-2023/scheduler-optimized.md
rename to website/docs/docs/dbt-versions/release-notes/82-April-2023/scheduler-optimized.md
diff --git a/website/docs/docs/dbt-versions/release-notes/09-April-2023/starburst-trino-ga.md b/website/docs/docs/dbt-versions/release-notes/82-April-2023/starburst-trino-ga.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/09-April-2023/starburst-trino-ga.md
rename to website/docs/docs/dbt-versions/release-notes/82-April-2023/starburst-trino-ga.md
diff --git a/website/docs/docs/dbt-versions/release-notes/10-Mar-2023/1.0-deprecation.md b/website/docs/docs/dbt-versions/release-notes/83-Mar-2023/1.0-deprecation.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/10-Mar-2023/1.0-deprecation.md
rename to website/docs/docs/dbt-versions/release-notes/83-Mar-2023/1.0-deprecation.md
diff --git a/website/docs/docs/dbt-versions/release-notes/10-Mar-2023/apiv2-limit.md b/website/docs/docs/dbt-versions/release-notes/83-Mar-2023/apiv2-limit.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/10-Mar-2023/apiv2-limit.md
rename to website/docs/docs/dbt-versions/release-notes/83-Mar-2023/apiv2-limit.md
diff --git a/website/docs/docs/dbt-versions/release-notes/10-Mar-2023/mar-ide-updates.md b/website/docs/docs/dbt-versions/release-notes/83-Mar-2023/mar-ide-updates.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/10-Mar-2023/mar-ide-updates.md
rename to website/docs/docs/dbt-versions/release-notes/83-Mar-2023/mar-ide-updates.md
diff --git a/website/docs/docs/dbt-versions/release-notes/10-Mar-2023/public-preview-trino-in-dbt-cloud.md b/website/docs/docs/dbt-versions/release-notes/83-Mar-2023/public-preview-trino-in-dbt-cloud.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/10-Mar-2023/public-preview-trino-in-dbt-cloud.md
rename to website/docs/docs/dbt-versions/release-notes/83-Mar-2023/public-preview-trino-in-dbt-cloud.md
diff --git a/website/docs/docs/dbt-versions/release-notes/11-Feb-2023/feb-ide-updates.md b/website/docs/docs/dbt-versions/release-notes/84-Feb-2023/feb-ide-updates.md
similarity index 94%
rename from website/docs/docs/dbt-versions/release-notes/11-Feb-2023/feb-ide-updates.md
rename to website/docs/docs/dbt-versions/release-notes/84-Feb-2023/feb-ide-updates.md
index d52ad2d4081..64fa2026d04 100644
--- a/website/docs/docs/dbt-versions/release-notes/11-Feb-2023/feb-ide-updates.md
+++ b/website/docs/docs/dbt-versions/release-notes/84-Feb-2023/feb-ide-updates.md
@@ -13,7 +13,6 @@ Learn more about the [February changes](https://getdbt.slack.com/archives/C03SAH
## New features
- Support for custom node colors in the IDE DAG visualization
-- Autosave prototype is now available under feature flag. [Contact](mailto:cloud-ide-feedback@dbtlabs.com) the dbt Labs IDE team to try this out
- Ref autocomplete includes models from seeds and snapshots
- Prevent menus from getting cropped (git controls dropdown, file tree dropdown, build button, editor tab options)
- Additional option to access the file menu by right-clicking on the files and folders in the file tree
diff --git a/website/docs/docs/dbt-versions/release-notes/11-Feb-2023/no-partial-parse-config.md b/website/docs/docs/dbt-versions/release-notes/84-Feb-2023/no-partial-parse-config.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/11-Feb-2023/no-partial-parse-config.md
rename to website/docs/docs/dbt-versions/release-notes/84-Feb-2023/no-partial-parse-config.md
diff --git a/website/docs/docs/dbt-versions/release-notes/12-Jan-2023/ide-updates.md b/website/docs/docs/dbt-versions/release-notes/85-Jan-2023/ide-updates.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/12-Jan-2023/ide-updates.md
rename to website/docs/docs/dbt-versions/release-notes/85-Jan-2023/ide-updates.md
diff --git a/website/docs/docs/dbt-versions/release-notes/23-Dec-2022/default-thread-value.md b/website/docs/docs/dbt-versions/release-notes/86-Dec-2022/default-thread-value.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/23-Dec-2022/default-thread-value.md
rename to website/docs/docs/dbt-versions/release-notes/86-Dec-2022/default-thread-value.md
diff --git a/website/docs/docs/dbt-versions/release-notes/23-Dec-2022/new-jobs-default-as-off.md b/website/docs/docs/dbt-versions/release-notes/86-Dec-2022/new-jobs-default-as-off.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/23-Dec-2022/new-jobs-default-as-off.md
rename to website/docs/docs/dbt-versions/release-notes/86-Dec-2022/new-jobs-default-as-off.md
diff --git a/website/docs/docs/dbt-versions/release-notes/23-Dec-2022/private-packages-clone-git-token.md b/website/docs/docs/dbt-versions/release-notes/86-Dec-2022/private-packages-clone-git-token.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/23-Dec-2022/private-packages-clone-git-token.md
rename to website/docs/docs/dbt-versions/release-notes/86-Dec-2022/private-packages-clone-git-token.md
diff --git a/website/docs/docs/dbt-versions/release-notes/24-Nov-2022/dbt-databricks-unity-catalog-support.md b/website/docs/docs/dbt-versions/release-notes/87-Nov-2022/dbt-databricks-unity-catalog-support.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/24-Nov-2022/dbt-databricks-unity-catalog-support.md
rename to website/docs/docs/dbt-versions/release-notes/87-Nov-2022/dbt-databricks-unity-catalog-support.md
diff --git a/website/docs/docs/dbt-versions/release-notes/24-Nov-2022/ide-features-ide-deprecation.md b/website/docs/docs/dbt-versions/release-notes/87-Nov-2022/ide-features-ide-deprecation.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/24-Nov-2022/ide-features-ide-deprecation.md
rename to website/docs/docs/dbt-versions/release-notes/87-Nov-2022/ide-features-ide-deprecation.md
diff --git a/website/docs/docs/dbt-versions/release-notes/25-Oct-2022/cloud-integration-azure.md b/website/docs/docs/dbt-versions/release-notes/88-Oct-2022/cloud-integration-azure.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/25-Oct-2022/cloud-integration-azure.md
rename to website/docs/docs/dbt-versions/release-notes/88-Oct-2022/cloud-integration-azure.md
diff --git a/website/docs/docs/dbt-versions/release-notes/25-Oct-2022/new-ide-launch.md b/website/docs/docs/dbt-versions/release-notes/88-Oct-2022/new-ide-launch.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/25-Oct-2022/new-ide-launch.md
rename to website/docs/docs/dbt-versions/release-notes/88-Oct-2022/new-ide-launch.md
diff --git a/website/docs/docs/dbt-versions/release-notes/26-Sept-2022/liststeps-endpoint-deprecation.md b/website/docs/docs/dbt-versions/release-notes/89-Sept-2022/liststeps-endpoint-deprecation.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/26-Sept-2022/liststeps-endpoint-deprecation.md
rename to website/docs/docs/dbt-versions/release-notes/89-Sept-2022/liststeps-endpoint-deprecation.md
diff --git a/website/docs/docs/dbt-versions/release-notes/26-Sept-2022/metadata-api-data-retention-limits.md b/website/docs/docs/dbt-versions/release-notes/89-Sept-2022/metadata-api-data-retention-limits.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/26-Sept-2022/metadata-api-data-retention-limits.md
rename to website/docs/docs/dbt-versions/release-notes/89-Sept-2022/metadata-api-data-retention-limits.md
diff --git a/website/docs/docs/dbt-versions/release-notes/27-Aug-2022/ide-improvement-beta.md b/website/docs/docs/dbt-versions/release-notes/91-Aug-2022/ide-improvement-beta.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/27-Aug-2022/ide-improvement-beta.md
rename to website/docs/docs/dbt-versions/release-notes/91-Aug-2022/ide-improvement-beta.md
diff --git a/website/docs/docs/dbt-versions/release-notes/27-Aug-2022/support-redshift-ra3.md b/website/docs/docs/dbt-versions/release-notes/91-Aug-2022/support-redshift-ra3.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/27-Aug-2022/support-redshift-ra3.md
rename to website/docs/docs/dbt-versions/release-notes/91-Aug-2022/support-redshift-ra3.md
diff --git a/website/docs/docs/dbt-versions/release-notes/28-July-2022/render-lineage-feature.md b/website/docs/docs/dbt-versions/release-notes/92-July-2022/render-lineage-feature.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/28-July-2022/render-lineage-feature.md
rename to website/docs/docs/dbt-versions/release-notes/92-July-2022/render-lineage-feature.md
diff --git a/website/docs/docs/dbt-versions/release-notes/29-May-2022/gitlab-auth.md b/website/docs/docs/dbt-versions/release-notes/93-May-2022/gitlab-auth.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/29-May-2022/gitlab-auth.md
rename to website/docs/docs/dbt-versions/release-notes/93-May-2022/gitlab-auth.md
diff --git a/website/docs/docs/dbt-versions/release-notes/30-April-2022/audit-log.md b/website/docs/docs/dbt-versions/release-notes/94-April-2022/audit-log.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/30-April-2022/audit-log.md
rename to website/docs/docs/dbt-versions/release-notes/94-April-2022/audit-log.md
diff --git a/website/docs/docs/dbt-versions/release-notes/30-April-2022/credentials-saved.md b/website/docs/docs/dbt-versions/release-notes/94-April-2022/credentials-saved.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/30-April-2022/credentials-saved.md
rename to website/docs/docs/dbt-versions/release-notes/94-April-2022/credentials-saved.md
diff --git a/website/docs/docs/dbt-versions/release-notes/30-April-2022/email-verification.md b/website/docs/docs/dbt-versions/release-notes/94-April-2022/email-verification.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/30-April-2022/email-verification.md
rename to website/docs/docs/dbt-versions/release-notes/94-April-2022/email-verification.md
diff --git a/website/docs/docs/dbt-versions/release-notes/30-April-2022/scheduler-improvements.md b/website/docs/docs/dbt-versions/release-notes/94-April-2022/scheduler-improvements.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/30-April-2022/scheduler-improvements.md
rename to website/docs/docs/dbt-versions/release-notes/94-April-2022/scheduler-improvements.md
diff --git a/website/docs/docs/dbt-versions/release-notes/31-March-2022/ide-timeout-message.md b/website/docs/docs/dbt-versions/release-notes/95-March-2022/ide-timeout-message.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/31-March-2022/ide-timeout-message.md
rename to website/docs/docs/dbt-versions/release-notes/95-March-2022/ide-timeout-message.md
diff --git a/website/docs/docs/dbt-versions/release-notes/31-March-2022/prep-and-waiting-time.md b/website/docs/docs/dbt-versions/release-notes/95-March-2022/prep-and-waiting-time.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/31-March-2022/prep-and-waiting-time.md
rename to website/docs/docs/dbt-versions/release-notes/95-March-2022/prep-and-waiting-time.md
diff --git a/website/docs/docs/dbt-versions/release-notes/32-February-2022/DAG-updates-more.md b/website/docs/docs/dbt-versions/release-notes/96-February-2022/DAG-updates-more.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/32-February-2022/DAG-updates-more.md
rename to website/docs/docs/dbt-versions/release-notes/96-February-2022/DAG-updates-more.md
diff --git a/website/docs/docs/dbt-versions/release-notes/32-February-2022/service-tokens-more.md b/website/docs/docs/dbt-versions/release-notes/96-February-2022/service-tokens-more.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/32-February-2022/service-tokens-more.md
rename to website/docs/docs/dbt-versions/release-notes/96-February-2022/service-tokens-more.md
diff --git a/website/docs/docs/dbt-versions/release-notes/33-January-2022/IDE-autocomplete-more.md b/website/docs/docs/dbt-versions/release-notes/97-January-2022/IDE-autocomplete-more.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/33-January-2022/IDE-autocomplete-more.md
rename to website/docs/docs/dbt-versions/release-notes/97-January-2022/IDE-autocomplete-more.md
diff --git a/website/docs/docs/dbt-versions/release-notes/33-January-2022/model-timing-more.md b/website/docs/docs/dbt-versions/release-notes/97-January-2022/model-timing-more.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/33-January-2022/model-timing-more.md
rename to website/docs/docs/dbt-versions/release-notes/97-January-2022/model-timing-more.md
diff --git a/website/docs/docs/dbt-versions/release-notes/34-dbt-cloud-changelog-2021.md b/website/docs/docs/dbt-versions/release-notes/98-dbt-cloud-changelog-2021.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/34-dbt-cloud-changelog-2021.md
rename to website/docs/docs/dbt-versions/release-notes/98-dbt-cloud-changelog-2021.md
diff --git a/website/docs/docs/dbt-versions/release-notes/35-dbt-cloud-changelog-2019-2020.md b/website/docs/docs/dbt-versions/release-notes/99-dbt-cloud-changelog-2019-2020.md
similarity index 100%
rename from website/docs/docs/dbt-versions/release-notes/35-dbt-cloud-changelog-2019-2020.md
rename to website/docs/docs/dbt-versions/release-notes/99-dbt-cloud-changelog-2019-2020.md
diff --git a/website/docs/docs/deploy/airgapped.md b/website/docs/docs/deploy/airgapped.md
deleted file mode 100644
index a08370fef8c..00000000000
--- a/website/docs/docs/deploy/airgapped.md
+++ /dev/null
@@ -1,19 +0,0 @@
----
-id: airgapped-deployment
-title: Airgapped (Beta)
----
-
-:::info Airgapped
-
-This section provides a high level summary of the airgapped deployment type for dbt Cloud. This deployment type is currently in Beta and may not be supported in the long term.
-If you’re interested in learning more about airgapped deployments for dbt Cloud, contact us at sales@getdbt.com.
-
-:::
-
-The airgapped deployment is similar to an on-premise installation in that the dbt Cloud instance will live in your network, and is subject to your security procedures, technologies, and controls. An airgapped install allows you to run dbt Cloud without any external network dependencies and is ideal for organizations that have strict rules around installing software from the cloud.
-
-The installation process for airgapped is a bit different. Instead of downloading and installing images during installation time, you will download all of the necessary configuration and Docker images before starting the installation process, and manage uploading these images yourself. This means that you can remove all external network dependencies and run this application in a very secure environment.
-
-For more information about the dbt Cloud Airgapped deployment see the below.
-
-- [Customer Managed Network Architecture](/docs/cloud/about-cloud/architecture)
diff --git a/website/docs/docs/deploy/job-commands.md b/website/docs/docs/deploy/job-commands.md
index db284c78a05..26fe1931db6 100644
--- a/website/docs/docs/deploy/job-commands.md
+++ b/website/docs/docs/deploy/job-commands.md
@@ -41,7 +41,7 @@ For every job, you have the option to select the [Generate docs on run](/docs/co
### Command list
-You can add or remove as many [dbt commands](/reference/dbt-commands) as necessary for every job. However, you need to have at least one dbt command. There are few commands listed as "dbt Core" in the [dbt Command reference doc](/reference/dbt-commands) page. This means they are meant for use in [dbt Core](/docs/core/about-dbt-core) only and are not available in dbt Cloud.
+You can add or remove as many dbt commands as necessary for every job. However, you need to have at least one dbt command. There are a few commands listed as "dbt Cloud CLI" or "dbt Core" in the [dbt command reference](/reference/dbt-commands) page. This means they are meant for use in dbt Core or the dbt Cloud CLI, and not in the dbt Cloud IDE.
:::tip Using selectors
diff --git a/website/docs/docs/deploy/retry-jobs.md b/website/docs/docs/deploy/retry-jobs.md
index ea616121f38..beefb35379e 100644
--- a/website/docs/docs/deploy/retry-jobs.md
+++ b/website/docs/docs/deploy/retry-jobs.md
@@ -26,7 +26,7 @@ If your dbt job run completed with a status of **Error**, you can rerun it from
## Related content
-- [Retry a failed run for a job](/dbt-cloud/api-v2#/operations/Retry%20a%20failed%20run%20for%20a%20job) API endpoint
+- [Retry a failed run for a job](/dbt-cloud/api-v2#/operations/Retry%20Failed%20Job) API endpoint
- [Run visibility](/docs/deploy/run-visibility)
- [Jobs](/docs/deploy/jobs)
-- [Job commands](/docs/deploy/job-commands)
\ No newline at end of file
+- [Job commands](/docs/deploy/job-commands)
diff --git a/website/docs/docs/introduction.md b/website/docs/docs/introduction.md
index 61cda6e1d3e..c575a9ae657 100644
--- a/website/docs/docs/introduction.md
+++ b/website/docs/docs/introduction.md
@@ -5,6 +5,7 @@ pagination_next: null
pagination_prev: null
---
+
dbt compiles and runs your analytics code against your data platform, enabling you and your team to collaborate on a single source of truth for metrics, insights, and business definitions. This single source of truth, combined with the ability to define tests for your data, reduces errors when logic changes, and alerts you when issues arise.
diff --git a/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md b/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md
index b3b6ffb3e45..f1e631f0d78 100644
--- a/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md
+++ b/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md
@@ -11,9 +11,9 @@ You can run your dbt projects with [dbt Cloud](/docs/cloud/about-cloud/dbt-cloud
- Share your [dbt project's documentation](/docs/collaborate/build-and-view-your-docs) with your team.
- Integrates with the dbt Cloud IDE, allowing you to run development tasks and environment in the dbt Cloud UI for a seamless experience.
- The dbt Cloud CLI to develop and run dbt commands against your dbt Cloud development environment from your local command line.
- - For more details, refer to [Develop in the Cloud](/docs/cloud/about-cloud-develop).
+ - For more details, refer to [Develop dbt](/docs/cloud/about-develop-dbt).
-- **dbt Core**: An open source project where you can develop from the [command line](/docs/core/about-dbt-core).
+- **dbt Core**: An open source project where you can develop from the [command line](/docs/core/installation-overview).
The dbt Cloud CLI and dbt Core are both command line tools that enable you to run dbt commands. The key distinction is the dbt Cloud CLI is tailored for dbt Cloud's infrastructure and integrates with all its [features](/docs/cloud/about-cloud/dbt-cloud-features).
diff --git a/website/docs/docs/supported-data-platforms.md b/website/docs/docs/supported-data-platforms.md
index c0c9a30db36..079e2018982 100644
--- a/website/docs/docs/supported-data-platforms.md
+++ b/website/docs/docs/supported-data-platforms.md
@@ -41,6 +41,3 @@ The following are **Trusted adapters** ✓ you can connect to in dbt Core:
import AdaptersTrusted from '/snippets/_adapters-trusted.md';
-
-
* Install these adapters using dbt Core as they're not currently supported in dbt Cloud.
-
diff --git a/website/docs/docs/trusted-adapters.md b/website/docs/docs/trusted-adapters.md
index 20d61f69575..7b7af7d0790 100644
--- a/website/docs/docs/trusted-adapters.md
+++ b/website/docs/docs/trusted-adapters.md
@@ -25,12 +25,12 @@ Refer to the [Build, test, document, and promote adapters](/guides/adapter-creat
### Trusted vs Verified
-The Verification program exists to highlight adapters that meets both of the following criteria:
+The Verification program exists to highlight adapters that meet both of the following criteria:
- the guidelines given in the Trusted program,
- formal agreements required for integration with dbt Cloud
-For more information on the Verified Adapter program, reach out the [dbt Labs partnerships team](mailto:partnerships@dbtlabs.com)
+For more information on the Verified Adapter program, reach out to the [dbt Labs partnerships team](mailto:partnerships@dbtlabs.com)
### Trusted adapters
diff --git a/website/docs/docs/use-dbt-semantic-layer/avail-sl-integrations.md b/website/docs/docs/use-dbt-semantic-layer/avail-sl-integrations.md
index 4f4621fa860..be02fedb230 100644
--- a/website/docs/docs/use-dbt-semantic-layer/avail-sl-integrations.md
+++ b/website/docs/docs/use-dbt-semantic-layer/avail-sl-integrations.md
@@ -33,6 +33,7 @@ import AvailIntegrations from '/snippets/_sl-partner-links.md';
- {frontMatter.meta.api_name} to learn how to integrate and query your metrics in downstream tools.
- [dbt Semantic Layer API query syntax](/docs/dbt-cloud-apis/sl-jdbc#querying-the-api-for-metric-metadata)
- [Hex dbt Semantic Layer cells](https://learn.hex.tech/docs/logic-cell-types/transform-cells/dbt-metrics-cells) to set up SQL cells in Hex.
+- [Resolve 'Failed APN'](/faqs/Troubleshooting/sl-alpn-error) error when connecting to the dbt Semantic Layer.
diff --git a/website/docs/docs/use-dbt-semantic-layer/gsheets.md b/website/docs/docs/use-dbt-semantic-layer/gsheets.md
index cb9f4014803..d7525fa7b26 100644
--- a/website/docs/docs/use-dbt-semantic-layer/gsheets.md
+++ b/website/docs/docs/use-dbt-semantic-layer/gsheets.md
@@ -17,6 +17,8 @@ The dbt Semantic Layer offers a seamless integration with Google Sheets through
- You have a Google account with access to Google Sheets.
- You can install Google add-ons.
- You have a dbt Cloud Environment ID and a [service token](/docs/dbt-cloud-apis/service-tokens) to authenticate with from a dbt Cloud account.
+- You must have a dbt Cloud Team or Enterprise [account](https://www.getdbt.com/pricing). This integration is suitable for both multi-tenant and single-tenant deployments.
+ - Single-tenant accounts should contact their account representative for necessary setup and enablement.
## Installing the add-on
@@ -54,10 +56,9 @@ To use the filter functionality, choose the [dimension](docs/build/dimensions) y
 - For categorical dimensions, type in the dimension value you want to filter by (no quotes needed) and press enter.
 - Continue adding filters as needed with AND and OR. If it's a time dimension, choose the operator and select from the calendar.
-
-
**Limited Use Policy Disclosure**
The dbt Semantic Layer for Sheets' use and transfer to any other app of information received from Google APIs will adhere to [Google API Services User Data Policy](https://developers.google.com/terms/api-services-user-data-policy), including the Limited Use requirements.
-
+## FAQs
+
diff --git a/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md b/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md
index 84e3227b4e7..62437f4ecd6 100644
--- a/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md
+++ b/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md
@@ -26,7 +26,7 @@ MetricFlow, a powerful component of the dbt Semantic Layer, simplifies the creat
Use this guide to fully experience the power of the universal dbt Semantic Layer. Here are the following steps you'll take:
- [Create a semantic model](#create-a-semantic-model) in dbt Cloud using MetricFlow
-- [Define metrics](#define-metrics) in dbt Cloud using MetricFlow
+- [Define metrics](#define-metrics) in dbt using MetricFlow
- [Test and query metrics](#test-and-query-metrics) with MetricFlow
- [Run a production job](#run-a-production-job) in dbt Cloud
- [Set up dbt Semantic Layer](#setup) in dbt Cloud
@@ -88,20 +88,9 @@ import SlSetUp from '/snippets/_new-sl-setup.md';
If you're encountering some issues when defining your metrics or setting up the dbt Semantic Layer, check out a list of answers to some of the questions or problems you may be experiencing.
-
- How do I migrate from the legacy Semantic Layer to the new one?
-
-
If you're using the legacy Semantic Layer, we highly recommend you
upgrade your dbt version to dbt v1.6 or higher to use the new dbt Semantic Layer. Refer to the dedicated
migration guide for more info.
-
-
-
-How are you storing my data?
-User data passes through the Semantic Layer on its way back from the warehouse. dbt Labs ensures security by authenticating through the customer's data warehouse. Currently, we don't cache data for the long term, but it might temporarily stay in the system for up to 10 minutes, usually less. In the future, we'll introduce a caching feature that allows us to cache data on our infrastructure for up to 24 hours.
-
-
- Is the dbt Semantic Layer open source?
- The dbt Semantic Layer is proprietary; however, some components of the dbt Semantic Layer are open source, such as dbt-core and MetricFlow.
dbt Cloud Developer or dbt Core users can define metrics in their project, including a local dbt Core project, using the dbt Cloud IDE, dbt Cloud CLI, or dbt Core CLI. However, to experience the universal dbt Semantic Layer and access those metrics using the API or downstream tools, users must be on a dbt Cloud Team or Enterprise plan.
Refer to Billing for more information.
-
+import SlFaqs from '/snippets/_sl-faqs.md';
+
+
## Next steps
diff --git a/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md b/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md
index 75a853fcbe8..9aea2ab42b0 100644
--- a/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md
+++ b/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md
@@ -14,43 +14,38 @@ The dbt Semantic Layer allows you to define metrics and use various interfaces t
-## dbt Semantic Layer components
+## Components
The dbt Semantic Layer includes the following components:
| Components | Information | dbt Core users | Developer plans | Team plans | Enterprise plans | License |
-| --- | --- | :---: | :---: | :---: | --- |
+| --- | --- | :---: | :---: | :---: | :---: | --- |
| **[MetricFlow](/docs/build/about-metricflow)** | MetricFlow in dbt allows users to centrally define their semantic models and metrics with YAML specifications. | ✅ | ✅ | ✅ | ✅ | BSL package (code is source available) |
-| **MetricFlow Server**| A proprietary server that takes metric requests and generates optimized SQL for the specific data platform. | ❌ | ❌ | ✅ | ✅ | Proprietary, Cloud (Team & Enterprise)|
-| **Semantic Layer Gateway** | A service that passes queries to the MetricFlow server and executes the SQL generated by MetricFlow against the data platform | ❌ | ❌ | ✅ | ✅ | Proprietary, Cloud (Team & Enterprise) |
-| **Semantic Layer APIs** | The interfaces allow users to submit metric queries using GraphQL and JDBC APIs. They also serve as the foundation for building first-class integrations with various tools. | ❌ | ❌ | ✅ | ✅ | Proprietary, Cloud (Team & Enterprise)|
+| **dbt Semantic interfaces** | A configuration spec for defining metrics, dimensions, how they link to each other, and how to query them. The [dbt-semantic-interfaces](https://github.com/dbt-labs/dbt-semantic-interfaces) package is available under Apache 2.0. | ❌ | ❌ | ✅ | ✅ | Proprietary, Cloud (Team & Enterprise)|
+| **Service layer** | Coordinates query requests and dispatches the relevant metric query to the target query engine. This is provided through dbt Cloud and is available to all users on dbt version 1.6 or later. The service layer includes a Gateway service for executing SQL against the data platform. | ❌ | ❌ | ✅ | ✅ | Proprietary, Cloud (Team & Enterprise) |
+| **[Semantic Layer APIs](/docs/dbt-cloud-apis/sl-api-overview)** | The interfaces allow users to submit metric queries using GraphQL and JDBC APIs. They also serve as the foundation for building first-class integrations with various tools. | ❌ | ❌ | ✅ | ✅ | Proprietary, Cloud (Team & Enterprise)|
-## Related questions
+## Feature comparison
-
- How do I migrate from the legacy Semantic Layer to the new one?
-
-
If you're using the legacy Semantic Layer, we highly recommend you
upgrade your dbt version to dbt v1.6 or higher to use the new dbt Semantic Layer. Refer to the dedicated
migration guide for more info.
-
-
-
-
-How are you storing my data?
-User data passes through the Semantic Layer on its way back from the warehouse. dbt Labs ensures security by authenticating through the customer's data warehouse. Currently, we don't cache data for the long term, but it might temporarily stay in the system for up to 10 minutes, usually less. In the future, we'll introduce a caching feature that allows us to cache data on our infrastructure for up to 24 hours.
-
-
- Is the dbt Semantic Layer open source?
-The dbt Semantic Layer is proprietary; however, some components of the dbt Semantic Layer are open source, such as dbt-core and MetricFlow.
dbt Cloud Developer or dbt Core users can define metrics in their project, including a local dbt Core project, using the dbt Cloud IDE, dbt Cloud CLI, or dbt Core CLI. However, to experience the universal dbt Semantic Layer and access those metrics using the API or downstream tools, users must be on a dbt Cloud Team or Enterprise plan.
Refer to Billing for more information.
-
-
- Is there a dbt Semantic Layer discussion hub?
-
-
+The following table compares the features available in dbt Cloud with those that are source available in dbt Core:
+
+| Feature | MetricFlow (source available) | dbt Semantic Layer with dbt Cloud |
+| ----- | :------: | :------: |
+| Define metrics and semantic models in dbt using the MetricFlow spec | ✅ | ✅ |
+| Generate SQL from a set of config files | ✅ | ✅ |
+| Query metrics and dimensions through the command line interface (CLI) | ✅ | ✅ |
+| Query dimension, entity, and metric metadata through the CLI | ✅ | ✅ |
+| Query metrics and dimensions through semantic APIs (ADBC, GQL) | ❌ | ✅ |
+| Connect to downstream integrations (Tableau, Hex, Mode, Google Sheets, and so on) | ❌ | ✅ |
+| Create and run Exports to save metrics queries as tables in your data platform | ❌ | Coming soon |
+
+## FAQs
+
+import SlFaqs from '/snippets/_sl-faqs.md';
+
+
diff --git a/website/docs/docs/use-dbt-semantic-layer/tableau.md b/website/docs/docs/use-dbt-semantic-layer/tableau.md
index 1d283023dda..0f12a75f468 100644
--- a/website/docs/docs/use-dbt-semantic-layer/tableau.md
+++ b/website/docs/docs/use-dbt-semantic-layer/tableau.md
@@ -21,7 +21,8 @@ This integration provides a live connection to the dbt Semantic Layer through Ta
- Note that Tableau Online does not currently support custom connectors natively. If you use Tableau Online, you will only be able to access the connector in Tableau Desktop.
- Log in to Tableau Desktop (with Online or Server credentials) or a license to Tableau Server
- You need your dbt Cloud host, [Environment ID](/docs/use-dbt-semantic-layer/setup-sl#set-up-dbt-semantic-layer) and [service token](/docs/dbt-cloud-apis/service-tokens) to log in. This account should be set up with the dbt Semantic Layer.
-- You must have a dbt Cloud Team or Enterprise [account](https://www.getdbt.com/pricing) and multi-tenant [deployment](/docs/cloud/about-cloud/regions-ip-addresses). (Single-Tenant coming soon)
+- You must have a dbt Cloud Team or Enterprise [account](https://www.getdbt.com/pricing). This integration is suitable for both multi-tenant and single-tenant deployments.
+ - Single-tenant accounts should contact their account representative for necessary setup and enablement.
## Installing the Connector
@@ -36,7 +37,7 @@ This integration provides a live connection to the dbt Semantic Layer through Ta
2. Install the [JDBC driver](/docs/dbt-cloud-apis/sl-jdbc) to the folder based on your operating system:
- Windows: `C:\Program Files\Tableau\Drivers`
- - Mac: `~/Library/Tableau/Drivers`
+ - Mac: `~/Library/Tableau/Drivers` or `/Library/JDBC` or `~/Library/JDBC`
- Linux: ` /opt/tableau/tableau_driver/jdbc`
3. Open Tableau Desktop or Tableau Server and find the **dbt Semantic Layer by dbt Labs** connector on the left-hand side. You may need to restart these applications for the connector to be available.
4. Connect with your Host, Environment ID, and Service Token information dbt Cloud provides during [Semantic Layer configuration](/docs/use-dbt-semantic-layer/setup-sl#:~:text=After%20saving%20it%2C%20you%27ll%20be%20provided%20with%20the%20connection%20information%20that%20allows%20you%20to%20connect%20to%20downstream%20tools).
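The per-OS driver locations listed in step 2 can be resolved programmatically. A minimal sketch, assuming the JDBC driver jar has already been downloaded (the jar filename is illustrative, not the actual artifact name):

```shell
# Resolve the Tableau JDBC driver folder for the current OS.
# The Windows path assumes a Git Bash / MSYS shell.
case "$(uname -s)" in
  Darwin) DRIVER_DIR="$HOME/Library/Tableau/Drivers" ;;
  Linux)  DRIVER_DIR="/opt/tableau/tableau_driver/jdbc" ;;
  *)      DRIVER_DIR="/c/Program Files/Tableau/Drivers" ;;
esac
echo "Driver folder: $DRIVER_DIR"
# Copy the downloaded driver into place (jar name below is illustrative):
# cp dbt-semantic-layer-jdbc.jar "$DRIVER_DIR"/
```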
@@ -80,3 +81,5 @@ The following Tableau features aren't supported at this time, however, the dbt S
- Filtering on a Date Part time dimension for a Cumulative metric type
- Changing your date dimension to use "Week Number"
+## FAQs
+
diff --git a/website/docs/faqs/API/_category_.yaml b/website/docs/faqs/API/_category_.yaml
new file mode 100644
index 00000000000..fac67328a7a
--- /dev/null
+++ b/website/docs/faqs/API/_category_.yaml
@@ -0,0 +1,10 @@
+# position: 2.5 # float position is supported
+label: 'API'
+collapsible: true # make the category collapsible
+collapsed: true # keep the category collapsed by default
+className: red
+link:
+ type: generated-index
+ title: API FAQs
+customProps:
+ description: Frequently asked questions about dbt APIs
diff --git a/website/docs/faqs/API/rotate-token.md b/website/docs/faqs/API/rotate-token.md
index a880825ea3f..144c834ea8a 100644
--- a/website/docs/faqs/API/rotate-token.md
+++ b/website/docs/faqs/API/rotate-token.md
@@ -7,6 +7,24 @@ id: rotate-token
For security reasons and best practices, you should aim to rotate API keys every so often.
+You can rotate your API key automatically with the push of a button in your dbt Cloud environment or manually using the command line.
+
+
+
+
+
+To automatically rotate your API key:
+
+1. Navigate to the Account settings by clicking the **gear icon** in the top right of your dbt Cloud account.
+2. Select **API Access** from the lefthand side.
+3. In the **API** pane, click **Rotate**.
+
+
+
+
+
+
+
1. Rotate your [User API token](/docs/dbt-cloud-apis/user-tokens) by replacing `YOUR_USER_ID`, `YOUR_CURRENT_TOKEN`, and `YOUR_ACCESS_URL` with your information in the following request.
```
@@ -41,3 +59,7 @@ For example, if your deployment is Virtual Private dbt:
✅ `http://cloud.customizedurl.getdbt.com/`
❌ `http://cloud.getdbt.com/`
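The deployment-specific base URL for these requests can be derived from your access URL rather than hardcoded. A minimal sketch (the custom domain is illustrative; `/api/v2` is the dbt Cloud API v2 prefix):

```shell
# Build the API base from a deployment-specific access URL instead of
# hardcoding cloud.getdbt.com (the custom domain below is illustrative).
ACCESS_URL="https://cloud.customizedurl.getdbt.com/"
API_BASE="${ACCESS_URL%/}/api/v2"
echo "$API_BASE"
```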
+
+
+
+
\ No newline at end of file
diff --git a/website/docs/faqs/Accounts/_category_.yaml b/website/docs/faqs/Accounts/_category_.yaml
new file mode 100644
index 00000000000..b8ebee5fe2a
--- /dev/null
+++ b/website/docs/faqs/Accounts/_category_.yaml
@@ -0,0 +1,10 @@
+# position: 2.5 # float position is supported
+label: 'Accounts'
+collapsible: true # make the category collapsible
+collapsed: true # keep the category collapsed by default
+className: red
+link:
+ type: generated-index
+ title: Account FAQs
+customProps:
+ description: Frequently asked questions about your account in dbt
diff --git a/website/docs/faqs/Core/_category_.yaml b/website/docs/faqs/Core/_category_.yaml
new file mode 100644
index 00000000000..bac4ad4a655
--- /dev/null
+++ b/website/docs/faqs/Core/_category_.yaml
@@ -0,0 +1,10 @@
+# position: 2.5 # float position is supported
+label: 'dbt Core'
+collapsible: true # make the category collapsible
+collapsed: true # keep the category collapsed by default
+className: red
+link:
+ type: generated-index
+ title: 'dbt Core FAQs'
+customProps:
+ description: Frequently asked questions about dbt Core
diff --git a/website/docs/faqs/Core/install-pip-best-practices.md b/website/docs/faqs/Core/install-pip-best-practices.md
index e36d58296ec..72360a52acc 100644
--- a/website/docs/faqs/Core/install-pip-best-practices.md
+++ b/website/docs/faqs/Core/install-pip-best-practices.md
@@ -30,6 +30,6 @@ Before installing dbt, make sure you have the latest versions:
```shell
-pip install --upgrade pip wheel setuptools
+python -m pip install --upgrade pip wheel setuptools
```
diff --git a/website/docs/faqs/Core/install-pip-os-prereqs.md b/website/docs/faqs/Core/install-pip-os-prereqs.md
index 41a4e4ec60e..1eb6205512a 100644
--- a/website/docs/faqs/Core/install-pip-os-prereqs.md
+++ b/website/docs/faqs/Core/install-pip-os-prereqs.md
@@ -57,7 +57,7 @@ pip install cryptography~=3.4
```
-#### Windows
+### Windows
Windows requires Python and git to successfully install and run dbt Core.
diff --git a/website/docs/faqs/Docs/_category_.yaml b/website/docs/faqs/Docs/_category_.yaml
new file mode 100644
index 00000000000..8c7925dcc15
--- /dev/null
+++ b/website/docs/faqs/Docs/_category_.yaml
@@ -0,0 +1,10 @@
+# position: 2.5 # float position is supported
+label: 'dbt Docs'
+collapsible: true # make the category collapsible
+collapsed: true # keep the category collapsed by default
+className: red
+link:
+ type: generated-index
+ title: dbt Docs FAQs
+customProps:
+ description: Frequently asked questions about dbt Docs
diff --git a/website/docs/faqs/Environments/_category_.yaml b/website/docs/faqs/Environments/_category_.yaml
new file mode 100644
index 00000000000..8d252d2c5d3
--- /dev/null
+++ b/website/docs/faqs/Environments/_category_.yaml
@@ -0,0 +1,10 @@
+# position: 2.5 # float position is supported
+label: 'Environments'
+collapsible: true # make the category collapsible
+collapsed: true # keep the category collapsed by default
+className: red
+link:
+ type: generated-index
+ title: 'Environments FAQs'
+customProps:
+ description: Frequently asked questions about Environments in dbt
diff --git a/website/docs/faqs/Git/_category_.yaml b/website/docs/faqs/Git/_category_.yaml
new file mode 100644
index 00000000000..0d9e5ee6e91
--- /dev/null
+++ b/website/docs/faqs/Git/_category_.yaml
@@ -0,0 +1,10 @@
+# position: 2.5 # float position is supported
+label: 'Git'
+collapsible: true # make the category collapsible
+collapsed: true # keep the category collapsed by default
+className: red
+link:
+ type: generated-index
+ title: Git FAQs
+customProps:
+ description: Frequently asked questions about Git and dbt
diff --git a/website/docs/faqs/Jinja/_category_.yaml b/website/docs/faqs/Jinja/_category_.yaml
new file mode 100644
index 00000000000..809ca0bb8eb
--- /dev/null
+++ b/website/docs/faqs/Jinja/_category_.yaml
@@ -0,0 +1,10 @@
+# position: 2.5 # float position is supported
+label: 'Jinja'
+collapsible: true # make the category collapsible
+collapsed: true # keep the category collapsed by default
+className: red
+link:
+ type: generated-index
+ title: Jinja FAQs
+customProps:
+ description: Frequently asked questions about Jinja and dbt
diff --git a/website/docs/faqs/Models/_category_.yaml b/website/docs/faqs/Models/_category_.yaml
new file mode 100644
index 00000000000..7398058db2b
--- /dev/null
+++ b/website/docs/faqs/Models/_category_.yaml
@@ -0,0 +1,10 @@
+# position: 2.5 # float position is supported
+label: 'Models'
+collapsible: true # make the category collapsible
+collapsed: true # keep the category collapsed by default
+className: red
+link:
+ type: generated-index
+ title: Models FAQs
+customProps:
+ description: Frequently asked questions about Models in dbt
diff --git a/website/docs/faqs/Project/_category_.yaml b/website/docs/faqs/Project/_category_.yaml
new file mode 100644
index 00000000000..d2f695773f8
--- /dev/null
+++ b/website/docs/faqs/Project/_category_.yaml
@@ -0,0 +1,10 @@
+# position: 2.5 # float position is supported
+label: 'Projects'
+collapsible: true # make the category collapsible
+collapsed: true # keep the category collapsed by default
+className: red
+link:
+ type: generated-index
+ title: Project FAQs
+customProps:
+ description: Frequently asked questions about projects in dbt
diff --git a/website/docs/faqs/Runs/_category_.yaml b/website/docs/faqs/Runs/_category_.yaml
new file mode 100644
index 00000000000..5867a0d3710
--- /dev/null
+++ b/website/docs/faqs/Runs/_category_.yaml
@@ -0,0 +1,10 @@
+# position: 2.5 # float position is supported
+label: 'Runs'
+collapsible: true # make the category collapsible
+collapsed: true # keep the category collapsed by default
+className: red
+link:
+ type: generated-index
+ title: Runs FAQs
+customProps:
+ description: Frequently asked questions about runs in dbt
diff --git a/website/docs/faqs/Seeds/_category_.yaml b/website/docs/faqs/Seeds/_category_.yaml
new file mode 100644
index 00000000000..fd2f7d3d925
--- /dev/null
+++ b/website/docs/faqs/Seeds/_category_.yaml
@@ -0,0 +1,10 @@
+# position: 2.5 # float position is supported
+label: 'Seeds'
+collapsible: true # make the category collapsible
+collapsed: true # keep the category collapsed by default
+className: red
+link:
+ type: generated-index
+ title: Seeds FAQs
+customProps:
+ description: Frequently asked questions about seeds in dbt
diff --git a/website/docs/faqs/Snapshots/_category_.yaml b/website/docs/faqs/Snapshots/_category_.yaml
new file mode 100644
index 00000000000..743b508fefe
--- /dev/null
+++ b/website/docs/faqs/Snapshots/_category_.yaml
@@ -0,0 +1,10 @@
+# position: 2.5 # float position is supported
+label: 'Snapshots'
+collapsible: true # make the category collapsible
+collapsed: true # keep the category collapsed by default
+className: red
+link:
+ type: generated-index
+ title: Snapshots FAQs
+customProps:
+ description: Frequently asked questions about snapshots in dbt
diff --git a/website/docs/faqs/Tests/_category_.yaml b/website/docs/faqs/Tests/_category_.yaml
new file mode 100644
index 00000000000..754b8ec267b
--- /dev/null
+++ b/website/docs/faqs/Tests/_category_.yaml
@@ -0,0 +1,10 @@
+# position: 2.5 # float position is supported
+label: 'Tests'
+collapsible: true # make the category collapsible
+collapsed: true # keep the category collapsed by default
+className: red
+link:
+ type: generated-index
+ title: Tests FAQs
+customProps:
+ description: Frequently asked questions about tests in dbt
diff --git a/website/docs/faqs/Tests/testing-sources.md b/website/docs/faqs/Tests/testing-sources.md
index 8eb769026e5..5e68b88dcbf 100644
--- a/website/docs/faqs/Tests/testing-sources.md
+++ b/website/docs/faqs/Tests/testing-sources.md
@@ -9,7 +9,7 @@ id: testing-sources
To run tests on all sources, use the following command:
```shell
-$ dbt test --select source:*
+dbt test --select "source:*"
```
(You can also use the `-s` shorthand here instead of `--select`)
diff --git a/website/docs/faqs/Troubleshooting/_category_.yaml b/website/docs/faqs/Troubleshooting/_category_.yaml
new file mode 100644
index 00000000000..14c4b49044d
--- /dev/null
+++ b/website/docs/faqs/Troubleshooting/_category_.yaml
@@ -0,0 +1,10 @@
+# position: 2.5 # float position is supported
+label: 'Troubleshooting'
+collapsible: true # make the category collapsible
+collapsed: true # keep the category collapsed by default
+className: red
+link:
+ type: generated-index
+ title: Troubleshooting FAQs
+customProps:
+ description: Frequently asked questions about troubleshooting dbt
diff --git a/website/docs/faqs/Troubleshooting/ip-restrictions.md b/website/docs/faqs/Troubleshooting/ip-restrictions.md
new file mode 100644
index 00000000000..9f1aa41c574
--- /dev/null
+++ b/website/docs/faqs/Troubleshooting/ip-restrictions.md
@@ -0,0 +1,29 @@
+---
+title: "I'm receiving a 403 error 'Forbidden: Access denied' when using service tokens"
+description: "All service token traffic is now subject to IP restrictions. To resolve 403 errors, add your third-party integration CIDRs (network addresses) to the allowlist."
+sidebar_label: 'Service token 403 error: Forbidden: Access denied'
+---
+
+
+All [service token](/docs/dbt-cloud-apis/service-tokens) traffic is subject to IP restrictions.
+
+When using a service token, the following 403 response error indicates the IP is not on the allowlist. To resolve this, you should add your third-party integration CIDRs (network addresses) to your allowlist.
+
+The following is an example of the 403 response error:
+
+```json
+{
+  "status": {
+    "code": 403,
+    "is_success": false,
+    "user_message": "Forbidden: Access denied",
+    "developer_message": null
+  },
+  "data": {
+    "account_id": ,
+    "user_id": ,
+    "is_service_token": ,
+    "account_access_denied": true
+  }
+}
+```
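
As a sketch for handling this programmatically, a client could check the parsed response body against the field names shown above. The `is_ip_restricted` helper name is hypothetical, not part of any dbt package or the dbt Cloud API:

```python
# Minimal sketch: decide whether a dbt Cloud 403 body indicates an
# IP-restriction denial. The payload shape mirrors the example above;
# the helper name is hypothetical, not a dbt Cloud API client method.
def is_ip_restricted(payload: dict) -> bool:
    status = payload.get("status", {})
    data = payload.get("data", {})
    return status.get("code") == 403 and data.get("account_access_denied") is True

denied = {
    "status": {
        "code": 403,
        "is_success": False,
        "user_message": "Forbidden: Access denied",
        "developer_message": None,
    },
    "data": {
        "account_id": 123,  # illustrative value only
        "user_id": None,
        "is_service_token": True,
        "account_access_denied": True,
    },
}
print(is_ip_restricted(denied))  # → True
```

If this returns `True`, the fix is to add the integration's CIDR ranges to the account allowlist rather than to rotate the token.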
diff --git a/website/docs/faqs/Troubleshooting/sl-alpn-error.md b/website/docs/faqs/Troubleshooting/sl-alpn-error.md
new file mode 100644
index 00000000000..f588d690fac
--- /dev/null
+++ b/website/docs/faqs/Troubleshooting/sl-alpn-error.md
@@ -0,0 +1,14 @@
+---
+title: "I'm receiving a `Failed ALPN` error when trying to connect to the dbt Semantic Layer"
+description: "To resolve the 'Failed ALPN' error in the dbt Semantic Layer, create a SSL interception exception for the dbt Cloud domain."
+sidebar_label: 'Use SSL exception to resolve `Failed ALPN` error'
+---
+
+If you're receiving a `Failed ALPN` error when trying to connect to the dbt Semantic Layer from one of the various [data integration tools](/docs/use-dbt-semantic-layer/avail-sl-integrations) (such as Tableau, DBeaver, Datagrip, ADBC, or JDBC), it typically happens when you're connecting from a computer behind a corporate VPN or proxy (like Zscaler or Check Point).
+
+The root cause is typically the proxy interfering with the TLS handshake as the dbt Semantic Layer uses gRPC/HTTP2 for connectivity. To resolve this:
+
+- If your proxy supports gRPC/HTTP2 but isn't configured to allow ALPN, adjust its settings to allow ALPN, or create an exception for the dbt Cloud domain.
+- If your proxy doesn't support gRPC/HTTP2, add an SSL interception exception for the dbt Cloud domain in your proxy settings.
+
+This should help you successfully establish the connection without the `Failed ALPN` error.
diff --git a/website/docs/faqs/Warehouse/_category_.yaml b/website/docs/faqs/Warehouse/_category_.yaml
new file mode 100644
index 00000000000..4de6e2e7d5e
--- /dev/null
+++ b/website/docs/faqs/Warehouse/_category_.yaml
@@ -0,0 +1,10 @@
+# position: 2.5 # float position is supported
+label: 'Warehouse'
+collapsible: true # make the category collapsible
+collapsed: true # keep the category collapsed by default
+className: red
+link:
+ type: generated-index
+ title: Warehouse FAQs
+customProps:
+ description: Frequently asked questions about warehouses and dbt
diff --git a/website/docs/guides/adapter-creation.md b/website/docs/guides/adapter-creation.md
index 8a9145f0258..8bf082b04a0 100644
--- a/website/docs/guides/adapter-creation.md
+++ b/website/docs/guides/adapter-creation.md
@@ -799,7 +799,7 @@ dbt-tests-adapter
```sh
-pip install -r dev_requirements.txt
+python -m pip install -r dev_requirements.txt
```
### Set up and configure pytest
@@ -1108,7 +1108,7 @@ The following subjects need to be addressed across three pages of this docs site
| How To... | File to change within `/website/docs/` | Action | Info to Include |
|----------------------|--------------------------------------------------------------|--------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
-| Connect | `/docs/core/connect-data-platform/{MY-DATA-PLATFORM}-setup.md` | Create | Give all information needed to define a target in `~/.dbt/profiles.yml` and get `dbt debug` to connect to the database successfully. All possible configurations should be mentioned. |
+| Connect | `/docs/core/connect-data-platform/{MY-DATA-PLATFORM}-setup.md` | Create | Give all information needed to define a target in `~/.dbt/profiles.yml` and get `dbt debug` to connect to the database successfully. All possible configurations should be mentioned. |
| Configure | `reference/resource-configs/{MY-DATA-PLATFORM}-configs.md` | Create | What options and configuration specific to your data platform do users need to know? e.g. table distribution and indexing options, column_quoting policy, which incremental strategies are supported |
| Discover and Install | `docs/supported-data-platforms.md` | Modify | Is it a vendor- or community- supported adapter? How to install Python adapter package? Ideally with pip and PyPI hosted package, but can also use `git+` link to GitHub Repo |
| Add link to sidebar | `website/sidebars.js` | Modify | Add the document id to the correct location in the sidebar menu |
@@ -1123,6 +1123,14 @@ Below are some recent pull requests made by partners to document their data plat
- [SingleStore](https://github.com/dbt-labs/docs.getdbt.com/pull/1044)
- [Firebolt](https://github.com/dbt-labs/docs.getdbt.com/pull/941)
+Note — Use the following re-usable component to auto-fill the frontmatter content on your new page:
+
+```markdown
+import SetUpPages from '/snippets/_setup-pages-intro.md';
+
+
+```
+
## Promote a new adapter
The most important thing here is recognizing that people are successful in the community when they join, first and foremost, to engage authentically.
diff --git a/website/docs/guides/codespace-qs.md b/website/docs/guides/codespace-qs.md
index 7712ed8f8e8..b28b0ddaacf 100644
--- a/website/docs/guides/codespace-qs.md
+++ b/website/docs/guides/codespace-qs.md
@@ -61,7 +61,7 @@ If you'd like to work with a larger selection of Jaffle Shop data, you can gener
1. Install the Python package called [jafgen](https://pypi.org/project/jafgen/). At the terminal's prompt, run:
```shell
- /workspaces/test (main) $ pip install jafgen
+ /workspaces/test (main) $ python -m pip install jafgen
```
1. When installation is done, run:
diff --git a/website/docs/guides/create-new-materializations.md b/website/docs/guides/create-new-materializations.md
index 1ad7d202de6..af2732c0c39 100644
--- a/website/docs/guides/create-new-materializations.md
+++ b/website/docs/guides/create-new-materializations.md
@@ -7,7 +7,6 @@ hoverSnippet: Learn how to create your own materializations.
# time_to_complete: '30 minutes' commenting out until we test
icon: 'guides'
hide_table_of_contents: true
-tags: ['dbt Core']
level: 'Advanced'
recently_updated: true
---
diff --git a/website/docs/guides/custom-cicd-pipelines.md b/website/docs/guides/custom-cicd-pipelines.md
index 672c6e6dab8..bd6d7617623 100644
--- a/website/docs/guides/custom-cicd-pipelines.md
+++ b/website/docs/guides/custom-cicd-pipelines.md
@@ -336,7 +336,7 @@ lint-project:
rules:
- if: $CI_PIPELINE_SOURCE == "push" && $CI_COMMIT_BRANCH != 'main'
script:
- - pip install sqlfluff==0.13.1
+ - python -m pip install sqlfluff==0.13.1
- sqlfluff lint models --dialect snowflake --rules L019,L020,L021,L022
# this job calls the dbt Cloud API to run a job
@@ -379,7 +379,7 @@ steps:
displayName: 'Use Python 3.7'
- script: |
- pip install requests
+ python -m pip install requests
displayName: 'Install python dependencies'
- script: |
@@ -434,7 +434,7 @@ pipelines:
- step:
name: Lint dbt project
script:
- - pip install sqlfluff==0.13.1
+ - python -m pip install sqlfluff==0.13.1
- sqlfluff lint models --dialect snowflake --rules L019,L020,L021,L022
'main': # override if your default branch doesn't run on a branch named "main"
diff --git a/website/docs/guides/dremio-lakehouse.md b/website/docs/guides/dremio-lakehouse.md
new file mode 100644
index 00000000000..378ec857f6a
--- /dev/null
+++ b/website/docs/guides/dremio-lakehouse.md
@@ -0,0 +1,196 @@
+---
+title: Build a data lakehouse with dbt Core and Dremio Cloud
+id: build-dremio-lakehouse
+description: Learn how to build a data lakehouse with dbt Core and Dremio Cloud.
+displayText: Build a data lakehouse with dbt Core and Dremio Cloud
+hoverSnippet: Learn how to build a data lakehouse with dbt Core and Dremio Cloud
+# time_to_complete: '30 minutes' commenting out until we test
+platform: 'dbt-core'
+icon: 'guides'
+hide_table_of_contents: true
+tags: ['Dremio', 'dbt Core']
+level: 'Intermediate'
+recently_updated: true
+---
+## Introduction
+
+This guide demonstrates how to build a data lakehouse with dbt Core 1.5 or newer and Dremio Cloud. You can simplify and optimize your data infrastructure with dbt's robust transformation framework and Dremio’s open and easy data lakehouse. The integrated solution empowers companies to establish a strong data and analytics foundation, fostering self-service analytics and enhancing business insights, while simplifying operations by eliminating the need to write complex extract, transform, and load (ETL) pipelines.
+
+### Prerequisites
+
+* You must have a [Dremio Cloud](https://docs.dremio.com/cloud/) account.
+* You must have Python 3 installed.
+* You must have dbt Core v1.5 or newer [installed](/docs/core/installation-overview).
+* You must have the Dremio adapter 1.5.0 or newer [installed and configured](/docs/core/connect-data-platform/dremio-setup) for Dremio Cloud.
+* You must have basic working knowledge of Git and the command line interface (CLI).
+
+## Validate your environment
+
+Validate your environment by running the following commands in your CLI and verifying the results:
+
+```shell
+
+$ python3 --version
+Python 3.11.4 # Must be Python 3
+
+```
+
+```shell
+
+$ dbt --version
+Core:
+ - installed: 1.5.0 # Must be 1.5 or newer
+ - latest: 1.6.3 - Update available!
+
+ Your version of dbt-core is out of date!
+ You can find instructions for upgrading here:
+ https://docs.getdbt.com/docs/installation
+
+Plugins:
+ - dremio: 1.5.0 - Up to date! # Must be 1.5 or newer
+
+```
+
+## Getting started
+
+1. Clone the Dremio dbt Core sample project from the [GitHub repo](https://github.com/dremio-brock/DremioDBTSample/tree/master/dremioSamples).
+
+2. In your integrated development environment (IDE), open the `relation.py` file in the Dremio adapter directory:
+ `$HOME/Library/Python/3.9/lib/python/site-packages/dbt/adapters/dremio/relation.py`
+
+3. Find and update lines 51 and 52 to match the following syntax:
+
+```python
+
+PATTERN = re.compile(r"""((?:[^."']|"[^"]*"|'[^']*')+)""")
+return ".".join(PATTERN.split(identifier)[1::2])
+
+```
+
+The complete function should look like this:
+
+```python
+def quoted_by_component(self, identifier, componentName):
+ if componentName == ComponentName.Schema:
+ PATTERN = re.compile(r"""((?:[^."']|"[^"]*"|'[^']*')+)""")
+ return ".".join(PATTERN.split(identifier)[1::2])
+ else:
+ return self.quoted(identifier)
+
+```
+
+You need to update this pattern because the plugin doesn’t support schema names in Dremio containing dots and spaces.
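
To sanity-check the patched pattern outside of the adapter, here's a standalone sketch (the example path and the `join_segments` name are made up for illustration):

```python
import re

# Same pattern as the patched relation.py: a path segment is a run of
# characters other than '.', '"', and "'", or an entire quoted string,
# so quoted folder names containing dots or spaces stay intact.
PATTERN = re.compile(r"""((?:[^."']|"[^"]*"|'[^']*')+)""")

def join_segments(identifier: str) -> str:
    # re.split with a capturing group interleaves separator text and
    # captured matches; [1::2] keeps only the matched segments.
    return ".".join(PATTERN.split(identifier)[1::2])

print(join_segments('dev."My Folder".trips'))  # → dev."My Folder".trips
```

A quoted folder name containing a space survives as a single segment instead of being split apart.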
+
+## Build your pipeline
+
+1. Create a `profiles.yml` file in the `$HOME/.dbt/profiles.yml` path and add the following configs:
+
+```yaml
+
+dremioSamples:
+ outputs:
+ cloud_dev:
+ dremio_space: dev
+ dremio_space_folder: no_schema
+ object_storage_path: dev
+ object_storage_source: $scratch
+ pat:
+ cloud_host: api.dremio.cloud
+ cloud_project_id:
+ threads: 1
+ type: dremio
+ use_ssl: true
+ user:
+ target: dev
+
+```
+
+2. Execute the transformation pipeline:
+
+```shell
+
+$ dbt run -t cloud_dev
+
+```
+
+If the above configurations have been implemented, the output will look something like this:
+
+```shell
+
+17:24:16 Running with dbt=1.5.0
+17:24:17 Found 5 models, 0 tests, 0 snapshots, 0 analyses, 348 macros, 0 operations, 0 seed files, 2 sources, 0 exposures, 0 metrics, 0 groups
+17:24:17
+17:24:29 Concurrency: 1 threads (target='cloud_dev')
+17:24:29
+17:24:29 1 of 5 START sql view model Preparation.trips .................................. [RUN]
+17:24:31 1 of 5 OK created sql view model Preparation.trips ............................. [OK in 2.61s]
+17:24:31 2 of 5 START sql view model Preparation.weather ................................ [RUN]
+17:24:34 2 of 5 OK created sql view model Preparation.weather ........................... [OK in 2.15s]
+17:24:34 3 of 5 START sql view model Business.Transportation.nyc_trips .................. [RUN]
+17:24:36 3 of 5 OK created sql view model Business.Transportation.nyc_trips ............. [OK in 2.18s]
+17:24:36 4 of 5 START sql view model Business.Weather.nyc_weather ....................... [RUN]
+17:24:38 4 of 5 OK created sql view model Business.Weather.nyc_weather .................. [OK in 2.09s]
+17:24:38 5 of 5 START sql view model Application.nyc_trips_with_weather ................. [RUN]
+17:24:41 5 of 5 OK created sql view model Application.nyc_trips_with_weather ............ [OK in 2.74s]
+17:24:41
+17:24:41 Finished running 5 view models in 0 hours 0 minutes and 24.03 seconds (24.03s).
+17:24:41
+17:24:41 Completed successfully
+17:24:41
+17:24:41 Done. PASS=5 WARN=0 ERROR=0 SKIP=0 TOTAL=5
+
+```
+
+Now that you have a running environment and a completed job, you can view the data in Dremio and expand your code. This is a snapshot of the project structure in an IDE:
+
+
+
+## About the schema.yml
+
+The `schema.yml` file defines the Dremio sources and models to be used and which data models are in scope. In this guide's sample project, there are two data sources:
+
+1. The `NYC-weather.csv` stored in the **Samples** database and
+2. The `sample_data` from the **Samples** database.
+
+The models correspond to the weather and trip data, respectively, and are joined for analysis.
+
+The sources can be found by navigating to the **Object Storage** section of the Dremio Cloud UI.
+
+
+
+## About the models
+
+**Preparation** — `preparation_trips.sql` and `preparation_weather.sql` build views on top of the trips and weather data.
+
+**Business** — `business_transportation_nyc_trips.sql` applies some transformation to the `preparation_trips.sql` view. `Business_weather_nyc.sql` applies no transformation to the `preparation_weather.sql` view.
+
+**Application** — `application_nyc_trips_with_weather.sql` joins the output from the Business models. This is what your business users will consume.
+
+## The Job output
+
+When you run the dbt job, it creates a **dev** space folder that contains all the data assets created. This is what you will see in the Dremio Cloud UI. Spaces in Dremio are a way to organize data assets that map to business units or data products.
+
+
+
+Open the **Application** folder and you will see the output of the simple transformation we did using dbt.
+
+
+
+## Query the data
+
+Now that you have run the job and completed the transformation, it's time to query your data. Click the `nyc_trips_with_weather` view. That will take you to the SQL Runner page. Click **Show SQL Pane** in the upper-right corner of the page.
+
+Run the following query:
+
+```sql
+
+SELECT vendor_id,
+ AVG(tip_amount)
+FROM dev.application."nyc_trips_with_weather"
+GROUP BY vendor_id
+
+```
+
+
+
+This completes the integration setup, and the data is ready for business consumption.
diff --git a/website/docs/guides/manual-install-qs.md b/website/docs/guides/manual-install-qs.md
index 61796fe008a..c74d30db51c 100644
--- a/website/docs/guides/manual-install-qs.md
+++ b/website/docs/guides/manual-install-qs.md
@@ -15,7 +15,7 @@ When you use dbt Core to work with dbt, you will be editing files locally using
### Prerequisites
* To use dbt Core, it's important that you know some basics of the Terminal. In particular, you should understand `cd`, `ls` and `pwd` to navigate through the directory structure of your computer easily.
-* Install dbt Core using the [installation instructions](/docs/core/installation) for your operating system.
+* Install dbt Core using the [installation instructions](/docs/core/installation-overview) for your operating system.
* Complete [Setting up (in BigQuery)](/guides/bigquery?step=2) and [Loading data (BigQuery)](/guides/bigquery?step=3).
* [Create a GitHub account](https://github.com/join) if you don't already have one.
diff --git a/website/docs/guides/microsoft-fabric-qs.md b/website/docs/guides/microsoft-fabric-qs.md
index c7c53a2aac7..1d1e016a6f1 100644
--- a/website/docs/guides/microsoft-fabric-qs.md
+++ b/website/docs/guides/microsoft-fabric-qs.md
@@ -9,7 +9,7 @@ recently_updated: true
---
## Introduction
-In this quickstart guide, you'll learn how to use dbt Cloud with Microsoft Fabric. It will show you how to:
+In this quickstart guide, you'll learn how to use dbt Cloud with [Microsoft Fabric](https://www.microsoft.com/en-us/microsoft-fabric). It will show you how to:
- Load the Jaffle Shop sample data (provided by dbt Labs) into your Microsoft Fabric warehouse.
- Connect dbt Cloud to Microsoft Fabric.
@@ -27,7 +27,7 @@ A public preview of Microsoft Fabric in dbt Cloud is now available!
### Prerequisites
- You have a [dbt Cloud](https://www.getdbt.com/signup/) account.
- You have started the Microsoft Fabric (Preview) trial. For details, refer to [Microsoft Fabric (Preview) trial](https://learn.microsoft.com/en-us/fabric/get-started/fabric-trial) in the Microsoft docs.
-- As a Microsoft admin, you’ve enabled service principal authentication. For details, refer to [Enable service principal authentication](https://learn.microsoft.com/en-us/fabric/admin/metadata-scanning-enable-read-only-apis) in the Microsoft docs. dbt Cloud needs these authentication credentials to connect to Microsoft Fabric.
+- As a Microsoft admin, you’ve enabled service principal authentication. You must add the service principal to the Microsoft Fabric workspace with either a Member (recommended) or Admin permission set. For details, refer to [Enable service principal authentication](https://learn.microsoft.com/en-us/fabric/admin/metadata-scanning-enable-read-only-apis) in the Microsoft docs. dbt Cloud needs these authentication credentials to connect to Microsoft Fabric.
### Related content
- [dbt Courses](https://courses.getdbt.com/collections)
@@ -54,8 +54,8 @@ A public preview of Microsoft Fabric in dbt Cloud is now available!
CREATE TABLE dbo.customers
(
[ID] [int],
- [FIRST_NAME] [varchar] (8000),
- [LAST_NAME] [varchar] (8000)
+ \[FIRST_NAME] [varchar](8000),
+ \[LAST_NAME] [varchar](8000)
);
COPY INTO [dbo].[customers]
@@ -72,7 +72,7 @@ A public preview of Microsoft Fabric in dbt Cloud is now available!
[USER_ID] [int],
-- [ORDER_DATE] [int],
[ORDER_DATE] [date],
- [STATUS] [varchar] (8000)
+ \[STATUS] [varchar](8000)
);
COPY INTO [dbo].[orders]
@@ -87,8 +87,8 @@ A public preview of Microsoft Fabric in dbt Cloud is now available!
(
[ID] [int],
[ORDERID] [int],
- [PAYMENTMETHOD] [varchar] (8000),
- [STATUS] [varchar] (8000),
+ \[PAYMENTMETHOD] [varchar](8000),
+ \[STATUS] [varchar](8000),
[AMOUNT] [int],
[CREATED] [date]
);
@@ -108,6 +108,9 @@ A public preview of Microsoft Fabric in dbt Cloud is now available!
2. Enter a project name and click **Continue**.
3. Choose **Fabric** as your connection and click **Next**.
4. In the **Configure your environment** section, enter the **Settings** for your new project:
+ - **Server** — Use the service principal's **host** value for the Fabric test endpoint.
+ - **Port** — 1433 (which is the default).
+ - **Database** — Use the service principal's **database** value for the Fabric test endpoint.
5. Enter the **Development credentials** for your new project:
- **Authentication** — Choose **Service Principal** from the dropdown.
- **Tenant ID** — Use the service principal’s **Directory (tenant) id** as the value.
diff --git a/website/docs/guides/redshift-qs.md b/website/docs/guides/redshift-qs.md
index 9296e6c6568..890be27e50a 100644
--- a/website/docs/guides/redshift-qs.md
+++ b/website/docs/guides/redshift-qs.md
@@ -57,7 +57,7 @@ You can check out [dbt Fundamentals](https://courses.getdbt.com/courses/fundamen
-7. You might be asked to Configure account. For the purpose of this sandbox environment, we recommend selecting “Configure account”.
+7. You might be asked to Configure account. For this sandbox environment, we recommend selecting “Configure account”.
8. Select your cluster from the list. In the **Connect to** popup, fill out the credentials from the output of the stack:
- **Authentication** — Use the default which is **Database user name and password** (NOTE: IAM authentication is not supported in dbt Cloud).
@@ -82,8 +82,7 @@ Now we are going to load our sample data into the S3 bucket that our Cloudformat
2. Now we are going to use the S3 bucket that you created with CloudFormation and upload the files. Go to the search bar at the top and type in `S3` and click on S3. There will be sample data in the bucket already, feel free to ignore it or use it for other modeling exploration. The bucket will be prefixed with `dbt-data-lake`.
-
-
+
3. Click on the `name of the bucket` S3 bucket. If you have multiple S3 buckets, this will be the bucket that was listed under “Workshopbucket” on the Outputs page.
diff --git a/website/docs/guides/set-up-ci.md b/website/docs/guides/set-up-ci.md
index 83362094ec6..89d7c5a14fa 100644
--- a/website/docs/guides/set-up-ci.md
+++ b/website/docs/guides/set-up-ci.md
@@ -167,7 +167,7 @@ jobs:
with:
python-version: "3.9"
- name: Install SQLFluff
- run: "pip install sqlfluff"
+ run: "python -m pip install sqlfluff"
- name: Lint project
run: "sqlfluff lint models --dialect snowflake"
@@ -204,7 +204,7 @@ lint-project:
rules:
- if: $CI_PIPELINE_SOURCE == "push" && $CI_COMMIT_BRANCH != 'main'
script:
- - pip install sqlfluff
+ - python -m pip install sqlfluff
- sqlfluff lint models --dialect snowflake
```
@@ -235,7 +235,7 @@ pipelines:
- step:
name: Lint dbt project
script:
- - pip install sqlfluff==0.13.1
+ - python -m pip install sqlfluff==0.13.1
- sqlfluff lint models --dialect snowflake --rules L019,L020,L021,L022
'main': # override if your default branch doesn't run on a branch named "main"
diff --git a/website/docs/guides/sl-migration.md b/website/docs/guides/sl-migration.md
index 0cfde742af2..8ede40a6a2d 100644
--- a/website/docs/guides/sl-migration.md
+++ b/website/docs/guides/sl-migration.md
@@ -25,10 +25,10 @@ dbt Labs recommends completing these steps in a local dev environment (such as t
1. Create new Semantic Model configs as YAML files in your dbt project.*
1. Upgrade the metrics configs in your project to the new spec.*
1. Delete your old metrics file or remove the `.yml` file extension so they're ignored at parse time. Remove the `dbt-metrics` package from your project. Remove any macros that reference `dbt-metrics`, like `metrics.calculate()`. Make sure that any packages you’re using don't have references to the old metrics spec.
-1. Install the CLI with `pip install "dbt-metricflow[your_adapter_name]"`. For example:
+1. Install the CLI with `python -m pip install "dbt-metricflow[your_adapter_name]"`. For example:
```bash
- pip install "dbt-metricflow[snowflake]"
+ python -m pip install "dbt-metricflow[snowflake]"
```
**Note** - The MetricFlow CLI is not available in the IDE at this time. Support is coming soon.
@@ -91,13 +91,11 @@ At this point, both the new semantic layer and the old semantic layer will be ru
Now that your Semantic Layer is set up, you will need to update any downstream integrations that used the legacy Semantic Layer.
-### Migration guide for Hex
+### Migration guide for Hex
-To learn more about integrating with Hex, check out their [documentation](https://learn.hex.tech/docs/connect-to-data/data-connections/dbt-integration#dbt-semantic-layer-integration) for more info. Additionally, refer to [dbt Semantic Layer cells](https://learn.hex.tech/docs/logic-cell-types/transform-cells/dbt-metrics-cells) to set up SQL cells in Hex.
+To learn more about integrating with Hex, check out their [documentation](https://learn.hex.tech/docs/connect-to-data/data-connections/dbt-integration#dbt-semantic-layer-integration) for more info. Additionally, refer to [dbt Semantic Layer cells](https://learn.hex.tech/docs/logic-cell-types/transform-cells/dbt-metrics-cells) to set up SQL cells in Hex.
-1. Set up a new connection for the Semantic Layer for your account. Something to note is that your old connection will still work. The following Loom video guides you in setting up your Semantic Layer with Hex:
-
-
+1. Set up a new connection for the dbt Semantic Layer for your account. Something to note is that your legacy connection will still work.
2. Re-create the dashboards or reports that use the legacy dbt Semantic Layer.
diff --git a/website/docs/reference/configs-and-properties.md b/website/docs/reference/configs-and-properties.md
index 8a557c762ed..c6458babeaa 100644
--- a/website/docs/reference/configs-and-properties.md
+++ b/website/docs/reference/configs-and-properties.md
@@ -157,9 +157,9 @@ You can find an exhaustive list of each supported property and config, broken do
* Model [properties](/reference/model-properties) and [configs](/reference/model-configs)
* Source [properties](/reference/source-properties) and [configs](source-configs)
* Seed [properties](/reference/seed-properties) and [configs](/reference/seed-configs)
-* [Snapshot Properties](snapshot-properties)
+* Snapshot [properties](snapshot-properties)
* Analysis [properties](analysis-properties)
-* [Macro Properties](/reference/macro-properties)
+* Macro [properties](/reference/macro-properties)
* Exposure [properties](/reference/exposure-properties)
## FAQs
diff --git a/website/docs/reference/dbt-commands.md b/website/docs/reference/dbt-commands.md
index d5f0bfcd2ad..4cb20051ea2 100644
--- a/website/docs/reference/dbt-commands.md
+++ b/website/docs/reference/dbt-commands.md
@@ -5,7 +5,7 @@ title: "dbt Command reference"
You can run dbt using the following tools:
- In your browser with the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud)
-- On the command line interface using the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) or open-source [dbt Core](/docs/core/about-dbt-core), both of which enable you to execute dbt commands. The key distinction is the dbt Cloud CLI is tailored for dbt Cloud's infrastructure and integrates with all its [features](/docs/cloud/about-cloud/dbt-cloud-features).
+- On the command line interface using the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) or open-source [dbt Core](/docs/core/installation-overview), both of which enable you to execute dbt commands. The key distinction is the dbt Cloud CLI is tailored for dbt Cloud's infrastructure and integrates with all its [features](/docs/cloud/about-cloud/dbt-cloud-features).
The following sections outline the commands supported by dbt and their relevant flags. For information about selecting models on the command line, consult the docs on [Model selection syntax](/reference/node-selection/syntax).
@@ -71,7 +71,7 @@ Use the following dbt commands in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/
-Use the following dbt commands in [dbt Core](/docs/core/about-dbt-core) and use the `dbt` prefix. For example, to run the `test` command, type `dbt test`.
+Use the following dbt commands in [dbt Core](/docs/core/installation-overview) and use the `dbt` prefix. For example, to run the `test` command, type `dbt test`.
- [build](/reference/commands/build): build and test all selected resources (models, seeds, snapshots, tests)
- [clean](/reference/commands/clean): deletes artifacts present in the dbt project
diff --git a/website/docs/reference/dbt_project.yml.md b/website/docs/reference/dbt_project.yml.md
index caf501c27ab..34af0f696c7 100644
--- a/website/docs/reference/dbt_project.yml.md
+++ b/website/docs/reference/dbt_project.yml.md
@@ -22,7 +22,85 @@ dbt uses YAML in a few different places. If you're new to YAML, it would be wort
:::
-
+
+
+
+
+```yml
+[name](/reference/project-configs/name): string
+
+[config-version](/reference/project-configs/config-version): 2
+[version](/reference/project-configs/version): version
+
+[profile](/reference/project-configs/profile): profilename
+
+[model-paths](/reference/project-configs/model-paths): [directorypath]
+[seed-paths](/reference/project-configs/seed-paths): [directorypath]
+[test-paths](/reference/project-configs/test-paths): [directorypath]
+[analysis-paths](/reference/project-configs/analysis-paths): [directorypath]
+[macro-paths](/reference/project-configs/macro-paths): [directorypath]
+[snapshot-paths](/reference/project-configs/snapshot-paths): [directorypath]
+[docs-paths](/reference/project-configs/docs-paths): [directorypath]
+[asset-paths](/reference/project-configs/asset-paths): [directorypath]
+
+[target-path](/reference/project-configs/target-path): directorypath
+[log-path](/reference/project-configs/log-path): directorypath
+[packages-install-path](/reference/project-configs/packages-install-path): directorypath
+
+[clean-targets](/reference/project-configs/clean-targets): [directorypath]
+
+[query-comment](/reference/project-configs/query-comment): string
+
+[require-dbt-version](/reference/project-configs/require-dbt-version): version-range | [version-range]
+
+[dbt-cloud](/docs/cloud/cloud-cli-installation):
+ [project-id](/docs/cloud/configure-cloud-cli#configure-the-dbt-cloud-cli): project_id # Required
+ [defer-env-id](/docs/cloud/about-cloud-develop-defer#defer-in-dbt-cloud-cli): environment_id # Optional
+
+[quoting](/reference/project-configs/quoting):
+ database: true | false
+ schema: true | false
+ identifier: true | false
+
+metrics:
+
+
+models:
+ [](/reference/model-configs)
+
+seeds:
+ [](/reference/seed-configs)
+
+semantic-models:
+
+
+snapshots:
+ [](/reference/snapshot-configs)
+
+sources:
+ [](source-configs)
+
+tests:
+ [](/reference/test-configs)
+
+vars:
+ [](/docs/build/project-variables)
+
+[on-run-start](/reference/project-configs/on-run-start-on-run-end): sql-statement | [sql-statement]
+[on-run-end](/reference/project-configs/on-run-start-on-run-end): sql-statement | [sql-statement]
+
+[dispatch](/reference/project-configs/dispatch-config):
+ - macro_namespace: packagename
+ search_order: [packagename]
+
+[restrict-access](/docs/collaborate/govern/model-access): true | false
+
+```
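The listing above enumerates every available key. In practice, a small project only needs a handful of them. The following is a minimal, hedged illustration (the project name, profile name, and folder-level materialization are placeholders, not recommendations):

```yml
# Minimal illustrative dbt_project.yml; names are placeholders.
name: jaffle_shop
config-version: 2
version: "1.0.0"

profile: jaffle_shop        # must match a profile defined in profiles.yml

model-paths: ["models"]
seed-paths: ["seeds"]

clean-targets: ["target", "dbt_packages"]

models:
  jaffle_shop:
    +materialized: view     # folder-level default; override per model as needed
```

Keys you omit fall back to dbt's defaults, so starting small and adding configuration as the project grows is a common approach.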
+
+
+
+
+
diff --git a/website/docs/reference/model-configs.md b/website/docs/reference/model-configs.md
index 06830d0d32b..19391f1c763 100644
--- a/website/docs/reference/model-configs.md
+++ b/website/docs/reference/model-configs.md
@@ -1,8 +1,13 @@
---
title: Model configurations
description: "Read this guide to understand model configurations in dbt."
+meta:
+ resource_type: Models
---
+import ConfigResource from '/snippets/_config-description-resource.md';
+import ConfigGeneral from '/snippets/_config-description-general.md';
+
## Related documentation
* [Models](/docs/build/models)
* [`run` command](/reference/commands/run)
@@ -10,6 +15,8 @@ description: "Read this guide to understand model configurations in dbt."
## Available configurations
### Model-specific configurations
+
+
+
+
```sql
{{
diff --git a/website/docs/reference/resource-properties/config.md b/website/docs/reference/resource-properties/config.md
index e6021def852..55d2f64d9ff 100644
--- a/website/docs/reference/resource-properties/config.md
+++ b/website/docs/reference/resource-properties/config.md
@@ -16,6 +16,7 @@ datatype: "{dictionary}"
{ label: 'Sources', value: 'sources', },
{ label: 'Metrics', value: 'metrics', },
{ label: 'Exposures', value: 'exposures', },
+ { label: 'Semantic models', value: 'semantic models', },
]
}>
@@ -182,6 +183,36 @@ exposures:
+
+
+
+
+Support for the `config` property on `semantic_models` was added in dbt Core v1.7.
+
+
+
+
+
+
+
+```yml
+version: 2
+
+semantic_models:
+ - name:
+ config:
+ enabled: true | false
+ group:
+ meta: {dictionary}
+```
+
+
+
+
+
+
+
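To make the `config` property spec above concrete, here is a hedged sketch (the semantic model name, group, and meta values are invented for illustration) that disables one semantic model and attaches metadata, assuming dbt Core v1.7 or later:

```yml
version: 2

semantic_models:
  - name: orders            # illustrative semantic model name
    config:
      enabled: false        # exclude this semantic model at parse time
      group: finance        # illustrative; the group must exist in the project
      meta:
        owner: "analytics-team"
```

Setting `enabled: false` is a convenient way to take a semantic model out of circulation without deleting its definition.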