diff --git a/website/docs/docs/cloud/about-cloud-develop-defer.md b/website/docs/docs/cloud/about-cloud-develop-defer.md
index fc55edf8a38..3ee5ac71666 100644
--- a/website/docs/docs/cloud/about-cloud-develop-defer.md
+++ b/website/docs/docs/cloud/about-cloud-develop-defer.md
@@ -40,6 +40,9 @@ To enable defer in the dbt Cloud IDE, toggle the **Defer to production** button
For example, if you were to start developing on a new branch with [nothing in your development schema](/reference/node-selection/defer#usage), edit a single model, and run `dbt build -s state:modified` — only the edited model would run. Any `{{ ref() }}` functions will point to the production location of the referenced models.
+
+Note: The **Defer to staging/production** toggle button doesn't apply when running [dbt Semantic Layer commands](/docs/build/metricflow-commands) in the dbt Cloud IDE. To use defer with Semantic Layer commands in the IDE, toggle the button on and manually add the `--defer` flag to the command. This is a temporary workaround; native support is coming soon.
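+
+For example, to query a metric with defer in the IDE, keep the toggle on and append the flag yourself. This is a sketch; the metric and dimension names are illustrative:
+
+```shell
+# Defer to production toggle is on; --defer is added manually for Semantic Layer commands
+dbt sl query --metrics revenue --group-by metric_time --defer
+```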
+
### Defer in dbt Cloud CLI
diff --git a/website/docs/docs/cloud/about-cloud/about-dbt-cloud.md b/website/docs/docs/cloud/about-cloud/about-dbt-cloud.md
index 02f950111ea..d7afd424fc4 100644
--- a/website/docs/docs/cloud/about-cloud/about-dbt-cloud.md
+++ b/website/docs/docs/cloud/about-cloud/about-dbt-cloud.md
@@ -24,7 +24,7 @@ dbt Cloud's [flexible plans](https://www.getdbt.com/pricing/) and features make
diff --git a/website/docs/docs/cloud/connect-data-platform/about-connections.md b/website/docs/docs/cloud/connect-data-platform/about-connections.md
index 8bec408af2e..6f2f140b724 100644
--- a/website/docs/docs/cloud/connect-data-platform/about-connections.md
+++ b/website/docs/docs/cloud/connect-data-platform/about-connections.md
@@ -18,6 +18,7 @@ dbt Cloud can connect with a variety of data platform providers including:
- [PostgreSQL](/docs/cloud/connect-data-platform/connect-redshift-postgresql-alloydb)
- [Snowflake](/docs/cloud/connect-data-platform/connect-snowflake)
- [Starburst or Trino](/docs/cloud/connect-data-platform/connect-starburst-trino)
+- [Teradata](/docs/cloud/connect-data-platform/connect-teradata)
You can connect to your database in dbt Cloud by clicking the gear in the top right and selecting **Account Settings**. From the Account Settings page, click **+ New Project**.
diff --git a/website/docs/docs/cloud/connect-data-platform/connect-teradata.md b/website/docs/docs/cloud/connect-data-platform/connect-teradata.md
new file mode 100644
index 00000000000..cf41814078b
--- /dev/null
+++ b/website/docs/docs/cloud/connect-data-platform/connect-teradata.md
@@ -0,0 +1,29 @@
+---
+title: "Connect Teradata"
+id: connect-teradata
+description: "Configure the Teradata platform connection in dbt Cloud."
+sidebar_label: "Connect Teradata"
+---
+
+# Connect Teradata
+
+Your environment(s) must be on ["Versionless"](/docs/dbt-versions/versionless-cloud) to use the Teradata connection.
+
+| Field | Description | Type | Required? | Example |
+| ----------------------------- | --------------------------------------------------------------------------------------------- | -------------- | --------- | ------- |
+| Host | Host name of your Teradata environment. | String | Required | host-name.env.clearscape.teradata.com |
+| Port | The database port number. Equivalent to the Teradata JDBC Driver DBS_PORT connection parameter.| Quoted integer | Optional | 1025 |
+| Retries | The number of times to retry connecting to the database after an error. | Integer | Optional | 10 |
+| Request timeout | The waiting period between connection attempts in seconds. The default is "1" second. | Quoted integer | Optional | 3 |
+
+
+
+### Development and deployment credentials
+
+| Field | Description | Type | Required? | Example |
+| ------------------------------|-----------------------------------------------------------------------------------------------|----------------|-----------|--------------------|
+| Username | The database username. Equivalent to the Teradata JDBC Driver USER connection parameter. | String | Required | database_username |
+| Password | The database password. Equivalent to the Teradata JDBC Driver PASSWORD connection parameter. | String | Required | DatabasePassword123 |
+| Schema | Specifies the initial database to use after login, rather than the user's default database. | String | Required | dbtlabsdocstest |
+
+
diff --git a/website/docs/docs/cloud/dbt-assist-data.md b/website/docs/docs/cloud/dbt-assist-data.md
deleted file mode 100644
index ad32c304ca8..00000000000
--- a/website/docs/docs/cloud/dbt-assist-data.md
+++ /dev/null
@@ -1,29 +0,0 @@
----
-title: "dbt Assist privacy and data"
-sidebar_label: "dbt Assist privacy"
-description: "dbt Assist’s powerful AI feature helps you deliver data that works."
----
-
-# dbt Assist privacy and data
-
-dbt Labs is committed to protecting your privacy and data. This page provides information about how dbt Labs handles your data when you use dbt Assist.
-
-#### Is my data used by dbt Labs to train AI models?
-
-No, dbt Assist does not use client warehouse data to train any AI models. It uses API calls to an AI provider.
-
-#### Does dbt Labs share my personal data with third parties
-
-dbt Labs only shares client personal information as needed to perform the services, under client instructions, or for legal, tax, or compliance reasons.
-
-#### Does dbt Assist store or use personal data?
-
-The user clicks the AI assist button, and the user does not otherwise enter data.
-
-#### Does dbt Assist access my warehouse data?
-
-dbt Assist utilizes metadata, including column names, model SQL, the model's name, and model documentation. The row-level data from the warehouse is never used or sent to a third-party provider. Such output must be double-checked by the user for completeness and accuracy.
-
-#### Can dbt Assist data be deleted upon client written request?
-
-dbt Assist data, aside from usage data, does not persist on dbt Labs systems. Usage data is retained by dbt Labs. dbt Labs does not have possession of any personal or sensitive data. To the extent client identifies personal or sensitive information uploaded by or on behalf of client to dbt Labs systems, such data can be deleted within 30 days of written request.
diff --git a/website/docs/docs/cloud/dbt-assist.md b/website/docs/docs/cloud/dbt-assist.md
deleted file mode 100644
index bb8cabaff2b..00000000000
--- a/website/docs/docs/cloud/dbt-assist.md
+++ /dev/null
@@ -1,25 +0,0 @@
----
-title: "About dbt Assist"
-sidebar_label: "About dbt Assist"
-description: "dbt Assist’s powerful AI co-pilot feature helps you deliver data that works."
-pagination_next: "docs/cloud/enable-dbt-assist"
-pagination_prev: null
----
-
-# About dbt Assist
-
-dbt Assist is a powerful artificial intelligence (AI) co-pilot feature that helps automate development in dbt Cloud, allowing you to focus on delivering data that works. dbt Assist’s AI co-pilot generates [documentation](/docs/build/documentation), [semantic models](/docs/build/semantic-models), and [tests](/docs/build/data-tests) for your SQL models directly in the dbt Cloud IDE, with a click of a button, and helps you accomplish more in less time.
-
-:::tip Beta feature
-dbt Assist is an AI tool meant to _help_ developers generate documentation, semantic models, and tests in dbt Cloud. It's available in beta, in the dbt Cloud IDE only.
-
-To use dbt Assist, you must have an active [dbt Cloud Enterprise account](https://www.getdbt.com/pricing) and agree to use dbt Labs' OpenAI key. [Register your interest](https://docs.google.com/forms/d/e/1FAIpQLScPjRGyrtgfmdY919Pf3kgqI5E95xxPXz-8JoVruw-L9jVtxg/viewform) to join the private beta or reach out to your account team to begin this process.
-:::
-
-
-
-## Feedback
-
-Please note: Always review AI-generated code and content as it may produce incorrect results. dbt Assist features and/or functionality may be added or eliminated as part of the beta trial.
-
-To give feedback, please reach out to your dbt Labs account team. We appreciate your feedback and suggestions as we improve dbt Assist.
diff --git a/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md b/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md
index 37f39f6dff8..398b0cff2a1 100644
--- a/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md
+++ b/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md
@@ -13,7 +13,7 @@ The dbt Cloud integrated development environment (IDE) is a single web-based int
The dbt Cloud IDE offers several [keyboard shortcuts](/docs/cloud/dbt-cloud-ide/keyboard-shortcuts) and [editing features](/docs/cloud/dbt-cloud-ide/ide-user-interface#editing-features) for faster and efficient development and governance:
- Syntax highlighting for SQL — Makes it easy to distinguish different parts of your code, reducing syntax errors and enhancing readability.
-- AI co-pilot — Use [dbt Assist](/docs/cloud/dbt-assist), a powerful AI co-pilot feature, to generate documentation, semantic models, and tests for your dbt SQL models.
+- AI copilot — Use [dbt Copilot](/docs/cloud/dbt-copilot), a powerful AI engine that can generate documentation, tests, and semantic models for your dbt SQL models.
- Auto-completion — Suggests table names, arguments, and column names as you type, saving time and reducing typos.
- Code [formatting and linting](/docs/cloud/dbt-cloud-ide/lint-format) — Helps standardize and fix your SQL code effortlessly.
- Navigation tools — Easily move around your code, jump to specific lines, find and replace text, and navigate between project files.
@@ -55,7 +55,7 @@ To understand how to navigate the IDE and its user interface elements, refer to
| [**Keyboard shortcuts**](/docs/cloud/dbt-cloud-ide/keyboard-shortcuts) | You can access a variety of [commands and actions](/docs/cloud/dbt-cloud-ide/keyboard-shortcuts) in the IDE by choosing the appropriate keyboard shortcut. Use the shortcuts for common tasks like building modified models or resuming builds from the last failure. |
| **IDE version control** | The IDE version control section and git button allow you to apply the concept of [version control](/docs/collaborate/git/version-control-basics) to your project directly into the IDE.
- Create or change branches, execute git commands using the git button.
- Commit or revert individual files by right-clicking the edited file
- [Resolve merge conflicts](/docs/collaborate/git/merge-conflicts)
- Link to the repo directly by clicking the branch name
- Edit, format, or lint files and execute dbt commands in your primary protected branch, and commit to a new branch.
- Use Git diff view to view what has been changed in a file before you make a pull request.
- From dbt version 1.6 and higher, use the **Prune branches** [button](/docs/cloud/dbt-cloud-ide/ide-user-interface#prune-branches-modal) to delete local branches that have been deleted from the remote repository, keeping your branch management tidy. |
| **Preview and Compile button** | You can [compile or preview](/docs/cloud/dbt-cloud-ide/ide-user-interface#console-section) code, a snippet of dbt code, or one of your dbt models after editing and saving. |
-| [**dbt Assist**](/docs/cloud/dbt-assist) | A powerful AI co-pilot feature that generates documentation, semantic models, and tests for your dbt SQL models. Available for dbt Cloud Enterprise plans. |
+| [**dbt Copilot**](/docs/cloud/dbt-copilot) | A powerful AI engine that can generate documentation, tests, and semantic models for your dbt SQL models. Available for dbt Cloud Enterprise plans. |
| **Build, test, and run button** | Build, test, and run your project with a button click or by using the Cloud IDE command bar.
| **Command bar** | You can enter and run commands from the command bar at the bottom of the IDE. Use the [rich model selection syntax](/reference/node-selection/syntax) to execute [dbt commands](/reference/dbt-commands) directly within dbt Cloud. You can also view the history, status, and logs of previous runs by clicking History on the left of the bar.
| **Drag and drop** | Drag and drop files located in the file explorer, and use the file breadcrumb on the top of the IDE for quick, linear navigation. Access adjacent files in the same file by right-clicking on the breadcrumb file.
@@ -130,7 +130,7 @@ Nice job, you're ready to start developing and building models 🎉!
- Starting from dbt v1.6, leverage [environment variables](/docs/build/environment-variables#special-environment-variables) to dynamically use the Git branch name. For example, using the branch name as a prefix for a development schema.
- Run [MetricFlow commands](/docs/build/metricflow-commands) to create and manage metrics in your project with the [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl).
-- **Generate your YAML configurations with dbt Assist** — [dbt Assist](/docs/cloud/dbt-assist) is a powerful artificial intelligence (AI) co-pilot feature that helps automate development in dbt Cloud. It generates documentation, semantic models, and tests for your dbt SQL models directly in the dbt Cloud IDE, with a click of a button, and helps you accomplish more in less time. Available for dbt Cloud Enterprise plans.
+- **Generate your YAML configurations with dbt Copilot** — [dbt Copilot](/docs/cloud/dbt-copilot) is a powerful artificial intelligence (AI) feature that helps automate development in dbt Cloud. It can generate documentation, tests, and semantic models for your dbt SQL models directly in the dbt Cloud IDE, with a click of a button, and helps you accomplish more in less time. Available for dbt Cloud Enterprise plans.
- **Build and view your project's docs** — The dbt Cloud IDE makes it possible to [build and view](/docs/collaborate/build-and-view-your-docs) documentation for your dbt project while your code is still in development. With this workflow, you can inspect and verify what your project's generated documentation will look like before your changes are released to production.
diff --git a/website/docs/docs/cloud/dbt-copilot-data.md b/website/docs/docs/cloud/dbt-copilot-data.md
new file mode 100644
index 00000000000..b55681542e3
--- /dev/null
+++ b/website/docs/docs/cloud/dbt-copilot-data.md
@@ -0,0 +1,29 @@
+---
+title: "dbt Copilot privacy and data"
+sidebar_label: "dbt Copilot privacy"
+description: "dbt Copilot is a powerful AI engine to help you deliver data that works."
+---
+
+# dbt Copilot privacy and data
+
+dbt Labs is committed to protecting your privacy and data. This page provides information about how the dbt Copilot AI engine handles your data.
+
+#### Is my data used by dbt Labs to train AI models?
+
+No, dbt Copilot does not use client warehouse data to train any AI models. It uses API calls to an AI provider.
+
+#### Does dbt Labs share my personal data with third parties?
+
+dbt Labs only shares client personal information as needed to perform the services, under client instructions, or for legal, tax, or compliance reasons.
+
+#### Does dbt Copilot store or use personal data?
+
+The user clicks the dbt Copilot button, and the user does not otherwise enter data.
+
+#### Does dbt Copilot access my warehouse data?
+
+dbt Copilot utilizes metadata, including column names, model SQL, the model's name, and model documentation. The row-level data from the warehouse is never used or sent to a third-party provider. Such output must be double-checked by the user for completeness and accuracy.
+
+#### Can dbt Copilot data be deleted upon client written request?
+
+The data from using dbt Copilot, aside from usage data, _doesn't_ persist on dbt Labs systems. Usage data is retained by dbt Labs. dbt Labs doesn't have possession of any personal or sensitive data. To the extent client identifies personal or sensitive information uploaded by or on behalf of client to dbt Labs systems, such data can be deleted within 30 days of written request.
diff --git a/website/docs/docs/cloud/dbt-copilot.md b/website/docs/docs/cloud/dbt-copilot.md
new file mode 100644
index 00000000000..42a05dd91ba
--- /dev/null
+++ b/website/docs/docs/cloud/dbt-copilot.md
@@ -0,0 +1,25 @@
+---
+title: "About dbt Copilot"
+sidebar_label: "About dbt Copilot"
+description: "dbt Copilot is a powerful AI engine designed to accelerate your analytics workflows throughout your entire ADLC."
+pagination_next: "docs/cloud/enable-dbt-copilot"
+pagination_prev: null
+---
+
+# About dbt Copilot
+
+dbt Copilot is a powerful artificial intelligence (AI) engine that's fully integrated into your dbt Cloud experience and designed to accelerate your analytics workflows. dbt Copilot embeds AI-driven assistance across every stage of the analytics development life cycle (ADLC), empowering data practitioners to deliver data products faster, improve data quality, and enhance data accessibility. With automatic code generation, you can let the AI engine generate the [documentation](/docs/build/documentation), [tests](/docs/build/data-tests), and [semantic models](/docs/build/semantic-models) for you.
+
+:::tip Beta feature
+dbt Copilot is designed to _help_ developers generate documentation, tests, and semantic models in dbt Cloud. It's available in beta, in the dbt Cloud IDE only.
+
+To use dbt Copilot, you must have an active [dbt Cloud Enterprise account](https://www.getdbt.com/pricing) and agree to use dbt Labs' OpenAI key. [Register your interest](https://docs.google.com/forms/d/e/1FAIpQLScPjRGyrtgfmdY919Pf3kgqI5E95xxPXz-8JoVruw-L9jVtxg/viewform) to join the private beta or reach out to your Account team to begin this process.
+:::
+
+
+
+## Feedback
+
+Please note: Always review AI-generated code and content as it may produce incorrect results. The features and/or functionality of dbt Copilot may be added or eliminated as part of the beta trial.
+
+To give feedback, please contact your dbt Labs account team. We appreciate your feedback and suggestions as we improve dbt Copilot.
diff --git a/website/docs/docs/cloud/enable-dbt-assist.md b/website/docs/docs/cloud/enable-dbt-assist.md
deleted file mode 100644
index 9432f858001..00000000000
--- a/website/docs/docs/cloud/enable-dbt-assist.md
+++ /dev/null
@@ -1,35 +0,0 @@
----
-title: "Enable dbt Assist"
-sidebar_label: "Enable dbt Assist"
-description: "Enable dbt Assist in dbt Cloud and leverage AI to speed up your development."
----
-
-# Enable dbt Assist
-
-This page explains how to enable dbt Assist in dbt Cloud to leverage AI to speed up your development and allow you to focus on delivering quality data.
-
-## Prerequisites
-
-- Available in the dbt Cloud IDE only.
-- Must have an active [dbt Cloud Enterprise account](https://www.getdbt.com/pricing).
-- Development environment be ["Versionless"](/docs/dbt-versions/upgrade-dbt-version-in-cloud#versionless).
-- Current dbt Assist deployments use a central OpenAI API key managed by dbt Labs. In the future, you may provide your own key for Azure OpenAI or OpenAI.
-- Accept and sign legal agreements. Reach out to your account team to begin this process.
-
-## Enable dbt Assist
-
-dbt Assist will only be available at an account level after your organization has signed the legal requirements. It will be disabled by default. Your dbt Cloud Admin(s) will enable it by following these steps:
-
-1. Navigate to **Account Settings** in the navigation menu.
-
-2. Under **Settings**, confirm the account you're enabling.
-
-3. Click **Edit** in the top right corner.
-
-4. To turn on dbt Assist, toggle the **Enable account access to AI-powered features** switch to the right. The toggle will slide to the right side, activating dbt Assist.
-
-5. Click **Save** and you should now have dbt Assist AI enabled to use.
-
-Note: To disable (only after enabled), repeat steps 1 to 3, toggle off in step 4, and repeat step 5.
-
-
diff --git a/website/docs/docs/cloud/enable-dbt-copilot.md b/website/docs/docs/cloud/enable-dbt-copilot.md
new file mode 100644
index 00000000000..23c253ecf7a
--- /dev/null
+++ b/website/docs/docs/cloud/enable-dbt-copilot.md
@@ -0,0 +1,35 @@
+---
+title: "Enable dbt Copilot"
+sidebar_label: "Enable dbt Copilot"
+description: "Enable the dbt Copilot AI engine in dbt Cloud to speed up your development."
+---
+
+# Enable dbt Copilot
+
+This page explains how to enable the dbt Copilot engine in dbt Cloud, leveraging AI to speed up your development and allowing you to focus on delivering quality data.
+
+## Prerequisites
+
+- Available in the dbt Cloud IDE only.
+- Must have an active [dbt Cloud Enterprise account](https://www.getdbt.com/pricing).
+- Development environment has been upgraded to ["Versionless"](/docs/dbt-versions/upgrade-dbt-version-in-cloud#versionless).
+- Current dbt Copilot deployments use a central OpenAI API key managed by dbt Labs. In the future, you may provide your own key for Azure OpenAI or OpenAI.
+- Accept and sign legal agreements. Reach out to your Account team to begin this process.
+
+## Enable dbt Copilot
+
+dbt Copilot is only available at the account level after your organization has signed the legal requirements. It's disabled by default. A dbt Cloud admin can enable it by following these steps:
+
+1. Navigate to **Account settings** in the navigation menu.
+
+2. Under **Settings**, confirm the account you're enabling.
+
+3. Click **Edit** in the top right corner.
+
+4. Enable the **Enable account access to AI-powered features** option.
+
+5. Click **Save**. You should now have the dbt Copilot AI engine enabled for use.
+
+Note: To disable dbt Copilot after it has been enabled, repeat steps 1 to 3, toggle the option off in step 4, and repeat step 5.
+
+
\ No newline at end of file
diff --git a/website/docs/docs/cloud/use-dbt-assist.md b/website/docs/docs/cloud/use-dbt-assist.md
deleted file mode 100644
index 888d5107999..00000000000
--- a/website/docs/docs/cloud/use-dbt-assist.md
+++ /dev/null
@@ -1,20 +0,0 @@
----
-title: "Use dbt Assist"
-sidebar_label: "Use dbt Assist"
-description: "Use dbt Assist to generate documentation, semantic models, and tests from scratch, giving you the flexibility to modify or fix generated code."
----
-
-# Use dbt Assist
-
-Use dbt Assist to generate documentation, semantic models, and tests from scratch, giving you the flexibility to modify or fix generated code.
-
-To access and use dbt Assist:
-
-1. Navigate to the dbt Cloud IDE and select a SQL model file under the **File Explorer**.
-2. In the **Console** section (under the **File Editor**), select the **dbt Assist** to view the available AI options.
-3. Select the available options to generate the YAML config: **Generate Documentation**, **Generate Tests**, or **Generate Semantic Model**.
- - To generate multiple YAML configs for the same model, click each option separately. dbt Assist intelligently saves the YAML config in the same file.
-4. Verify the AI-generated code. Update or fix the code if needed.
-5. Click **Save** to save the code. You should see the file changes under the **Version control** section.
-
-
diff --git a/website/docs/docs/cloud/use-dbt-copilot.md b/website/docs/docs/cloud/use-dbt-copilot.md
new file mode 100644
index 00000000000..30def967f96
--- /dev/null
+++ b/website/docs/docs/cloud/use-dbt-copilot.md
@@ -0,0 +1,22 @@
+---
+title: "Use dbt Copilot"
+sidebar_label: "Use dbt Copilot"
+description: "Use the dbt Copilot AI engine to generate documentation, tests, and semantic models from scratch, giving you the flexibility to modify or fix generated code."
+---
+
+# Use dbt Copilot
+
+Use dbt Copilot to generate documentation, tests, and semantic models from scratch, giving you the flexibility to modify or fix generated code. To access and use this AI engine:
+
+1. Navigate to the dbt Cloud IDE and select a SQL model file under the **File Explorer**.
+
+2. In the **Console** section (under the **File Editor**), click **dbt Copilot** to view the available AI options.
+
+3. Select the available options to generate the YAML config: **Generate Documentation**, **Generate Tests**, or **Generate Semantic Model**.
+    - To generate multiple YAML configs for the same model, click each option separately. dbt Copilot intelligently saves the YAML config in the same file (see the example after these steps).
+
+4. Verify the AI-generated code. You can update or fix the code as needed.
+
+5. Click **Save As**. You should see the file changes under the **Version control** section.
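+
+For reference, the generated configs follow standard dbt schema YAML conventions. A hypothetical sketch for a model named `customers` (your column names, descriptions, and tests will differ):
+
+```yml
+models:
+  - name: customers
+    description: "One record per customer, including order history summary fields."
+    columns:
+      - name: customer_id
+        description: "Primary key for the customer."
+        data_tests:
+          - unique
+          - not_null
+```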
+
+
diff --git a/website/docs/docs/collaborate/auto-exposures.md b/website/docs/docs/collaborate/auto-exposures.md
index 2b1d649abd1..9b25a2fb305 100644
--- a/website/docs/docs/collaborate/auto-exposures.md
+++ b/website/docs/docs/collaborate/auto-exposures.md
@@ -7,7 +7,7 @@ pagination_next: "docs/collaborate/data-tile"
image: /img/docs/cloud-integrations/auto-exposures/explorer-lineage.jpg
---
-# Auto-exposures
+# Auto-exposures
As a data team, it’s critical that you have context into the downstream use cases and users of your data products. Auto-exposures integrates natively with Tableau (Power BI coming soon) and auto-generates downstream lineage in dbt Explorer for a richer experience.
diff --git a/website/docs/docs/collaborate/explore-projects.md b/website/docs/docs/collaborate/explore-projects.md
index 9e27c2afa47..a4388a8696e 100644
--- a/website/docs/docs/collaborate/explore-projects.md
+++ b/website/docs/docs/collaborate/explore-projects.md
@@ -20,7 +20,7 @@ import ExplorerCourse from '/snippets/_explorer-course-link.md';
- You have at least one successful job run in the deployment environment. Note that [CI jobs](/docs/deploy/ci-jobs) do not update dbt Explorer.
- You are on the dbt Explorer page. To do this, select **Explore** from the navigation in dbt Cloud.
-## Overview page
+## Overview page
Navigate the dbt Explorer overview page to access your project's resources and metadata. The page includes the following sections:
diff --git a/website/docs/docs/collaborate/govern/model-contracts.md b/website/docs/docs/collaborate/govern/model-contracts.md
index b07ce909480..d30024157c8 100644
--- a/website/docs/docs/collaborate/govern/model-contracts.md
+++ b/website/docs/docs/collaborate/govern/model-contracts.md
@@ -178,14 +178,14 @@ Currently, `not_null` and `check` constraints are enforced only after a model is
### Which models should have contracts?
Any model meeting the criteria described above _can_ define a contract. We recommend defining contracts for ["public" models](model-access) that are being relied on downstream.
-- Inside of dbt: Shared with other groups, other teams, and (in the future) other dbt projects.
+- Inside of dbt: Shared with other groups, other teams, and [other dbt projects](/best-practices/how-we-mesh/mesh-1-intro).
- Outside of dbt: Reports, dashboards, or other systems & processes that expect this model to have a predictable structure. You might reflect these downstream uses with [exposures](/docs/build/exposures).
### How are contracts different from tests?
A model's contract defines the **shape** of the returned dataset. If the model's logic or input data doesn't conform to that shape, the model does not build.
-[Data Tests](/docs/build/data-tests) are a more flexible mechanism for validating the content of your model _after_ it's built. So long as you can write the query, you can run the data test. Data tests are more configurable, such as with [custom severity thresholds](/reference/resource-configs/severity). They are easier to debug after finding failures, because you can query the already-built model, or [store the failing records in the data warehouse](/reference/resource-configs/store_failures).
+[Data Tests](/docs/build/data-tests) are a more flexible mechanism for validating the content of your model _after_ it's built. So long as you can write the query, you can run the data test. Data tests are more configurable, such as with [custom severity thresholds](/reference/resource-configs/severity). They are easier to debug after finding failures because you can query the already-built model, or [store the failing records in the data warehouse](/reference/resource-configs/store_failures).
In some cases, you can replace a data test with its equivalent constraint. This has the advantage of guaranteeing the validation at build time, and it probably requires less compute (cost) in your data platform. The prerequisites for replacing a data test with a constraint are:
- Making sure that your data platform can support and enforce the constraint that you need. Most platforms only enforce `not_null`.
diff --git a/website/docs/docs/core/connect-data-platform/athena-setup.md b/website/docs/docs/core/connect-data-platform/athena-setup.md
index 9780e86de88..825d3071ad2 100644
--- a/website/docs/docs/core/connect-data-platform/athena-setup.md
+++ b/website/docs/docs/core/connect-data-platform/athena-setup.md
@@ -7,7 +7,7 @@ meta:
github_repo: 'dbt-labs/dbt-athena'
pypi_package: 'dbt-athena-community'
min_core_version: 'v1.3.0'
- cloud_support: Not Supported
+ cloud_support: Supported
min_supported_version: 'engine version 2 and 3'
slack_channel_name: '#db-athena'
slack_channel_link: 'https://getdbt.slack.com/archives/C013MLFR7BQ'
diff --git a/website/docs/docs/core/connect-data-platform/azuresynapse-setup.md b/website/docs/docs/core/connect-data-platform/azuresynapse-setup.md
index 8a4d6b61004..0a0347df9ea 100644
--- a/website/docs/docs/core/connect-data-platform/azuresynapse-setup.md
+++ b/website/docs/docs/core/connect-data-platform/azuresynapse-setup.md
@@ -7,7 +7,7 @@ meta:
github_repo: 'Microsoft/dbt-synapse'
pypi_package: 'dbt-synapse'
min_core_version: 'v0.18.0'
- cloud_support: Not Supported
+ cloud_support: Supported
min_supported_version: 'Azure Synapse 10'
slack_channel_name: '#db-synapse'
slack_channel_link: 'https://getdbt.slack.com/archives/C01DRQ178LQ'
diff --git a/website/docs/docs/dbt-cloud-apis/sl-api-overview.md b/website/docs/docs/dbt-cloud-apis/sl-api-overview.md
index 1c4d5f387e9..e4e2a91791d 100644
--- a/website/docs/docs/dbt-cloud-apis/sl-api-overview.md
+++ b/website/docs/docs/dbt-cloud-apis/sl-api-overview.md
@@ -43,15 +43,9 @@ plan="dbt Cloud Team or Enterprise"
icon="dbt-bit"/>
-
-
diff --git a/website/docs/docs/dbt-cloud-apis/sl-python-sdk.md b/website/docs/docs/dbt-cloud-apis/sl-python-sdk.md
index 901b6bf179a..e34a44a5a57 100644
--- a/website/docs/docs/dbt-cloud-apis/sl-python-sdk.md
+++ b/website/docs/docs/dbt-cloud-apis/sl-python-sdk.md
@@ -7,7 +7,6 @@ keywords: [dbt Cloud, API, dbt Semantic Layer, python, sdk]
sidebar_label: "Python SDK"
---
-# Python SDK
The [`dbt-sl-sdk` Python software development kit](https://github.com/dbt-labs/semantic-layer-sdk-python) (SDK) is a Python library that provides you with easy access to the dbt Semantic Layer with Python. It allows developers to interact with the dbt Semantic Layer APIs and query metrics and dimensions in downstream tools.
## Installation
diff --git a/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.9.md b/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.9.md
index cf9b9eaed4e..29f3650e7a6 100644
--- a/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.9.md
+++ b/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.9.md
@@ -42,7 +42,7 @@ Historically, managing incremental models involved several manual steps and resp
While this works for many use-cases, there’s a clear limitation with this approach: *Some datasets are just too big to fit into one query.*
-Starting in Core 1.9, you can use the new microbatch strategy to optimize your largest datasets -- **process your event data in discrete periods with their own SQL queries, rather than all at once.** The benefits include:
+Starting in Core 1.9, you can use the new [microbatch strategy](/docs/build/incremental-microbatch#what-is-microbatch-in-dbt) to optimize your largest datasets -- **process your event data in discrete periods with their own SQL queries, rather than all at once.** The benefits include:
- Simplified query design: Write your model query for a single batch of data. dbt will use your `event_time`, `lookback`, and `batch_size` configurations to automatically generate the necessary filters for you, making the process more streamlined and reducing the need for you to manage these details.
- Independent batch processing: dbt automatically breaks down the data to load into smaller batches based on the specified `batch_size` and processes each batch independently, improving efficiency and reducing the risk of query timeouts. If some of your batches fail, you can use `dbt retry` to load only the failed batches.
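+
+As a sketch, a microbatch model pairs a plain single-batch query with these configurations (the model and column names here are illustrative):
+
+```sql
+{{ config(
+    materialized='incremental',
+    incremental_strategy='microbatch',
+    event_time='event_occurred_at',
+    begin='2024-01-01',
+    batch_size='day',
+    lookback=3
+) }}
+
+-- dbt generates the event_time filters per batch; the query itself stays single-batch
+select * from {{ ref('stg_events') }}
+```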
diff --git a/website/docs/docs/dbt-versions/release-notes.md b/website/docs/docs/dbt-versions/release-notes.md
index 9030ca8e722..662fd0f381a 100644
--- a/website/docs/docs/dbt-versions/release-notes.md
+++ b/website/docs/docs/dbt-versions/release-notes.md
@@ -20,6 +20,32 @@ Release notes are grouped by month for both multi-tenant and virtual private clo
## October 2024
+
+
+ Documentation for new features and functionality announced at Coalesce 2024:
+
+ - Iceberg table support for [Snowflake](https://docs.getdbt.com/reference/resource-configs/snowflake-configs#iceberg-table-format)
+ - [Athena](https://docs.getdbt.com/reference/resource-configs/athena-configs) and [Teradata](https://docs.getdbt.com/reference/resource-configs/teradata-configs) adapter support in dbt Cloud
+ - dbt Cloud now hosted on [Azure](https://docs.getdbt.com/docs/cloud/about-cloud/access-regions-ip-addresses)
+ - Get comfortable with [Versionless dbt Cloud](https://docs.getdbt.com/docs/dbt-versions/versionless-cloud)
+ - Scalable [microbatch incremental models](https://docs.getdbt.com/docs/build/incremental-microbatch)
+ - Advanced CI [features](https://docs.getdbt.com/docs/deploy/advanced-ci)
+ - [Linting with CI jobs](https://docs.getdbt.com/docs/deploy/continuous-integration#sql-linting)
+ - dbt Assist is now [dbt Copilot](https://docs.getdbt.com/docs/cloud/dbt-copilot)
+ - Developer blog on [Snowflake Feature Store and dbt: A bridge between data pipelines and ML](https://docs.getdbt.com/blog/snowflake-feature-store)
+ - New [Quickstart for dbt Cloud CLI](https://docs.getdbt.com/guides/dbt-cloud-cli?step=1)
+ - [Auto-exposures with Tableau](https://docs.getdbt.com/docs/collaborate/auto-exposures)
+ - Semantic Layer integration with [Excel desktop and M365](https://docs.getdbt.com/docs/cloud-integrations/semantic-layer/excel)
+ - [Data health tiles](https://docs.getdbt.com/docs/collaborate/data-tile)
+ - [Semantic Layer and Cloud IDE integration](https://docs.getdbt.com/docs/build/metricflow-commands#metricflow-commands)
+ - Query history in [Explorer](https://docs.getdbt.com/docs/collaborate/model-query-history#view-query-history-in-explorer)
+ - Semantic Layer MetricFlow improvements, including [improved granularity and custom calendar](https://docs.getdbt.com/docs/build/metricflow-time-spine#custom-calendar)
+ - [Python SDK](https://docs.getdbt.com/docs/dbt-cloud-apis/sl-python) is now generally available
+
+
+
+
+- **New**: The [dbt Semantic Layer Python software development kit](/docs/dbt-cloud-apis/sl-python) is now [generally available](/docs/dbt-versions/product-lifecycles). It provides users with easy access to the dbt Semantic Layer with Python and enables developers to interact with the dbt Semantic Layer APIs to query metrics/dimensions in downstream tools.
- **Enhancement**: You can now add a description to a singular data test in dbt Cloud Versionless. Use the [`description` property](/reference/resource-properties/description) to document [singular data tests](/docs/build/data-tests#singular-data-tests). You can also use a [docs block](/docs/build/documentation#using-docs-blocks) to capture your test description. The enhancement will be included in the upcoming dbt Core 1.9 release.
- **New**: Introducing the [microbatch incremental model strategy](/docs/build/incremental-microbatch) (beta), available in dbt Cloud Versionless and will soon be supported in dbt Core 1.9. The microbatch strategy allows for efficient, batch-based processing of large time-series datasets for improved performance and resiliency, especially when you're working with data that changes over time (like new records being added daily). To enable this feature in dbt Cloud, set the `DBT_EXPERIMENTAL_MICROBATCH` environment variable to `true` in your project.
- **New**: The dbt Semantic Layer supports custom calendar configurations in MetricFlow, available in [Preview](/docs/dbt-versions/product-lifecycles#dbt-cloud). Custom calendar configurations allow you to query data using non-standard time periods like `fiscal_year` or `retail_month`. Refer to [custom calendar](/docs/build/metricflow-time-spine#custom-calendar) to learn how to define these custom granularities in your MetricFlow timespine YAML configuration.
@@ -37,7 +63,7 @@ Release notes are grouped by month for both multi-tenant and virtual private clo
## September 2024
-- **New**: Use dbt Assist's co-pilot feature to generate semantic model for your models, now available in beta. dbt Assist automatically generates documentation, tests, and now semantic models based on the data in your model, . To learn more, refer to [dbt Assist](/docs/cloud/dbt-assist).
+- **New**: Use the dbt Copilot AI engine to generate semantic models for your models, now available in beta. dbt Copilot automatically generates documentation, tests, and now semantic models based on the data in your model. To learn more, refer to [dbt Copilot](/docs/cloud/dbt-copilot).
- **New**: Use the new recommended syntax for [defining `foreign_key` constraints](/reference/resource-properties/constraints) using `refs`, available in dbt Cloud Versionless. This will soon be released in dbt Core v1.9. This new syntax will capture dependencies and works across different environments.
- **Enhancement**: You can now run [Semantic Layer commands](/docs/build/metricflow-commands) in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud). The supported commands are `dbt sl list`, `dbt sl list metrics`, `dbt sl list dimension-values`, `dbt sl list saved-queries`, `dbt sl query`, `dbt sl list dimensions`, `dbt sl list entities`, and `dbt sl validate`.
- **New**: Microsoft Excel, a dbt Semantic Layer integration, is now generally available. The integration allows you to connect to Microsoft Excel to query metrics and collaborate with your team. Available for [Excel Desktop](https://pages.store.office.com/addinsinstallpage.aspx?assetid=WA200007100&rs=en-US&correlationId=4132ecd1-425d-982d-efb4-de94ebc83f26) or [Excel Online](https://pages.store.office.com/addinsinstallpage.aspx?assetid=WA200007100&rs=en-US&correlationid=4132ecd1-425d-982d-efb4-de94ebc83f26&isWac=True). For more information, refer to [Microsoft Excel](/docs/cloud-integrations/semantic-layer/excel).
@@ -108,7 +134,7 @@ Release notes are grouped by month for both multi-tenant and virtual private clo
The following features are new or enhanced as part of our [dbt Cloud Launch Showcase](https://www.getdbt.com/resources/webinars/dbt-cloud-launch-showcase) event on May 14th, 2024:
-- **New:** [dbt Assist](/docs/cloud/dbt-assist) is a powerful AI feature helping you generate documentation and tests, saving you time as you deliver high-quality data. Available in private beta for a subset of dbt Cloud Enterprise users and in the dbt Cloud IDE. [Register your interest](https://docs.google.com/forms/d/e/1FAIpQLScPjRGyrtgfmdY919Pf3kgqI5E95xxPXz-8JoVruw-L9jVtxg/viewform) to join the private beta.
+- **New:** [dbt Copilot](/docs/cloud/dbt-copilot) is a powerful AI engine helping you generate documentation, tests, and semantic models, saving you time as you deliver high-quality data. Available in private beta for a subset of dbt Cloud Enterprise users and in the dbt Cloud IDE. [Register your interest](https://docs.google.com/forms/d/e/1FAIpQLScPjRGyrtgfmdY919Pf3kgqI5E95xxPXz-8JoVruw-L9jVtxg/viewform) to join the private beta.
- **New:** The new low-code editor, now in private beta, enables less SQL-savvy analysts to create or edit dbt models through a visual, drag-and-drop experience inside of dbt Cloud. These models compile directly to SQL and are indistinguishable from other dbt models in your projects: they are version-controlled, can be accessed across projects in dbt Mesh, and integrate with dbt Explorer and the Cloud IDE. [Register your interest](https://docs.google.com/forms/d/e/1FAIpQLScPjRGyrtgfmdY919Pf3kgqI5E95xxPXz-8JoVruw-L9jVtxg/viewform) to join the private beta.
diff --git a/website/docs/guides/qs-cloud-cli.md b/website/docs/guides/qs-cloud-cli.md
new file mode 100644
index 00000000000..1e2a548114f
--- /dev/null
+++ b/website/docs/guides/qs-cloud-cli.md
@@ -0,0 +1,313 @@
+---
+title: "Coalesce: Quickstart for dbt Cloud CLI"
+id: "dbt-cloud-cli"
+# time_to_complete: '30 minutes' commenting out until we test
+level: 'Beginner'
+icon: 'guides'
+hide_table_of_contents: true
+tags: ['Cloud CLI', 'dbt Cloud','Quickstart']
+recently_updated: true
+---
+
+
+
+## Introduction
+
+In this quickstart guide, you'll learn how to configure and use dbt Cloud CLI as part of the Coalesce 24 Workshop.
+
+It will show you how to:
+
+- Set up a dbt Cloud sandbox.
+- Install the dbt Cloud CLI and connect to dbt Cloud.
+- Run commands locally using the dbt Cloud CLI.
+- Defer to different production environments.
+- Leverage cross-project ref.
+- Install dbt Power User.
+- Use dbt Power User to supercharge development.
+
+### Prerequisites
+
+- Familiarity with dbt projects and common commands (for example, `dbt build`)
+- Git is installed
+- An editor, such as Visual Studio Code (preferred), is installed
+
+### Related content
+
+- Learn more with [dbt Learn courses](https://learn.getdbt.com)
+
+## Install Git and Visual Studio Code (Prerequisites)
+
+You will need to have Git installed locally and a code editor (preferably Visual Studio Code).
+
+### Check your installation status
+
+Run `git --version` in your terminal to check if it's installed. For example:
+
+
+
+
+
+Check your installed applications for Visual Studio Code (vscode) or another editor. For example:
+
+
+
+
+
+### Install Git and Visual Studio Code
+
+Navigate to the following Git installation page and install it for your operating system:
+
+https://git-scm.com/downloads
+
+Navigate to the following Visual Studio Code installation page and install it for your operating system.
+
+https://code.visualstudio.com/download
+
+## Set up dbt Cloud (Coalesce Workshop Only)
+
+Let's get set up with a dbt Cloud sandbox that's already connected to a Snowflake account for the workshop.
+
+1. Go to [bit.ly/coalesce-24-sandboxes](https://bit.ly/coalesce-24-sandboxes) to create an account. Make sure you log out of any other dbt Cloud accounts.
+
+ a. Enter your **First Name** and **Last Name**
+
+ b. For **Workshop**, choose **Test driving dbt Cloud CLI and dbt power user** from the dropdown
+
+ c. The **Passcode** will be provided by your facilitators
+
+ d. Accept the terms and click **Complete Registration**
+
+1. Navigate to the platform project by selecting **Project** from the left sidebar and choosing **Platform Analytics**.
+
+1. Select **Deploy >> Runs** to find the created jobs. For each job, click the job and then click **Run**.
+
+1. Now repeat for the **Analytics project**. Toggle into the Analytics project.
+
+1. Select **Deploy >> Runs** to find the created jobs. For the one job, click the job and then click **Run**.
+
+1. Select **Explore** from the navigation and choose XX. Now you can visualize your dbt Mesh. Click into each project to see project level lineage.
+
+You've now successfully run your project in deployment environments so you can use cross-project ref and deferral later in the workshop.
+
+## Configure dbt Cloud CLI
+
+Now we'll clone the project repository and configure dbt Cloud CLI to connect to your sandbox.
+
+### Clone the repo
+
+1. Navigate to a folder on your computer to clone the repository.
+
+1. In your terminal, run the following command to clone the downstream (analytics) project:
+
+ ```shell
+ git clone https://github.com/dbt-labs/c24-workshops-analytics.git
+ ```
+
+### Install Cloud CLI
+
+1. In dbt Cloud, select the **Platform Analytics** project and choose **Develop >> Configure Cloud CLI**.
+
+1. Based on your current local setup, use the following guidance to determine your installation approach:
+
+ a. Check if you have dbt in your PATH by running `dbt --version`
+
+ b. If you don't have dbt in your PATH, we recommend the macOS or Windows installation method.
+
+ c. If you do have dbt in your PATH (global environment), we recommend:
+ 1. Uninstalling dbt globally
+ 2. Installing dbt Cloud CLI with a Python virtual environment
+
+    d. If you have dbt in a virtual environment, install dbt Cloud CLI with a separate Python virtual environment. Be sure to activate it with `source <path-to-venv>/bin/activate`.
+
+1. Download the CLI configuration file from the dbt Cloud UI. Save it in your `.dbt` folder.
+
+1. Navigate to the dbt project folder that you cloned earlier and confirm that the `dbt_project.yml` file includes your `project_id`.
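+
+   A minimal sketch of the relevant entry (the ID value is a placeholder; use the project ID shown on the **Configure Cloud CLI** page):
+
+   ```yml
+   # dbt_project.yml
+   dbt-cloud:
+     project-id: "123456"
+   ```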
+
+### Confirm the installation
+
+Run `dbt compile` to verify your installation.
+
+There you go! You've installed the dbt Cloud CLI! Let's dive in!
+
+### Additional resources
+Consult the following docs if you run into problems when trying to install the dbt Cloud CLI:
+- [Install dbt Cloud CLI](https://docs.getdbt.com/docs/cloud/cloud-cli-installation)
+- [Configure and use dbt Cloud CLI](https://docs.getdbt.com/docs/cloud/configure-cloud-cli)
+
+## Leverage dbt Cloud CLI
+
+Let's run a few commands together to get comfortable with the dbt Cloud CLI:
+* `dbt debug` — Displays your connection details and information
+* `dbt compile --select stg_campaigns` — Compiles your dbt project
+* `dbt run --select stg_campaigns` — Materializes your dbt models
+* `dbt show --select stg_campaigns` — Previews the results of a model
+* `dbt test --select stg_campaigns` — Executes tests against your materialized models
+
+Now let's dive into some more advanced components of dbt Cloud CLI.
+
+### Deferral
+
+Deferral is a powerful functionality, allowing you to leverage upstream assets that exist outside of your personal development environment. As a result, you can speed up your development workflows and save on warehouse compute costs. Let's run a few commands using deferral:
+
+1. Run `dbt compile -s stg_campaigns`. Notice how we're able to resolve dependencies in the compiled SQL without seeding `campaigns.csv`.
+1. Now let's modify the `stg_campaigns` model by adding a timestamp:
+ ```sql
+ current_timestamp() as updated_at
+ ```
+
+ Let's build that model with the next command.
+1. Run `dbt build --select stg_campaigns`. We're utilizing deferral and the concept of "statefulness" to check which objects have been modified and to resolve dependencies of upstream assets if they exist.
+
+ By default, the dbt Cloud CLI defers to a [Staging](https://docs.getdbt.com/docs/deploy/deploy-environments#staging-environment) environment if one exists. If not, dbt uses the assets from the Production environment.
+
+ To override which environment the dbt Cloud CLI defers to, you can set a `defer-env-id` key in either your `dbt_project.yml` or `dbt_cloud.yml` file. For example:
+
+ ```yml
+ dbt-cloud:
+ defer-env-id: '123456'
+ ```
+
+### dbt Mesh
+
+You have access to cross-project refs, powered by the metadata of dbt Cloud.
+
+1. Open the `agg_campaign_customer_contacts` model.
+1. Find the reference called `{{ ref('platform', 'dim_customers', v=1) }}`.
+1. Run the command:
+
+ ```shell
+ dbt run --select agg_campaign_customer_contacts
+ ```
+
+1. Navigate to dbt Cloud Explorer and find a public model. Let's use the `fct_order_items` model.
+1. Create a new model called `agg_orders` in your project with the following code:
+
+ ```sql
+ with orders as (
+
+ select * from {{ ref('platform', 'fct_order_items') }}
+
+ ),
+
+ final as (
+
+ select
+ customer_key as customer_id,
+ is_return as return_status,
+ count(*) as count_orders
+
+ from
+ orders
+ group by
+ customer_key,
+ is_return
+ )
+
+ select * from final
+ ```
+
+### Linting and fixing SQL files
+
+With SQLFluff built in, you can check your code against a style guide and automatically make fixes.
+
+1. Run the SQLFluff command `lint`:
+
+ ```shell
+ dbt sqlfluff lint models/staging/campaigns/stg_campaigns.sql --dialect snowflake
+ ```
+
+ This identifies tweaks to make in the `stg_campaigns` model.
+2. Run the SQLFluff command `fix`:
+
+ ```shell
+ dbt sqlfluff fix models/staging/campaigns/stg_campaigns.sql --dialect snowflake
+ ```
+
+ This attempts to directly make fixes in the `stg_campaigns` model.
+
+### Change branches
+
+You can quickly change branches without fully pushing to your Git provider (such as GitHub):
+
+```shell
+git checkout -b my-new-branch
+
+git checkout main
+```
+
+Now you've taken a tour of what you can do with dbt Cloud CLI. Let's dive into dbt Power User next.
+
+## Install dbt Power User
+
+Let's get dbt Power User installed to supercharge our workflow.
+
+1. From Visual Studio Code, click on extensions and search for "Power User for dbt".
+
+
+
+
+1. Click **Install**.
+1. Click **Switch to dbt Cloud**. You might need to refresh.
+
+
+
+1. Complete the setup steps (click **Welcome** in VSCode and choose **dbt Power User**).
+
+
+
+1. Create an account and get an API key: https://app.myaltimate.com/register
+
+1. Copy your API key and enter this into the dbt Power User extension settings.
+
+Now let's dive in!
+
+## Leverage dbt Power User
+
+There is a ton you can do to supercharge your workflow with dbt Cloud. Let's cover some highlights.
+
+### Preview your upstream/downstream changes
+
+Open the Power User extension on the left-hand side. You can see the upstream and downstream projects.
+
+
+
+
+
+### Preview results
+
+Press Command-Enter (or Control-Enter for Windows) and instantly see the results of your model below.
+
+
+
+
+
+### SQL visualization
+
+While looking at a model file, click the Altimate logo in the top right and click **Visualize SQL** to see a breakdown of your SQL model.
+
+
+
+
+
+### Generate test and documentation YML with user-friendly UX and AI
+
+At the top of your model file, click **Generate documentation** for a UI to rapidly create documentation and tests with AI.
+
+
+
+
+
+There is a whole lot more too! Check out the dbt Power User docs here: https://docs.myaltimate.com/
+
+## Conclusion
+
+You've successfully installed dbt Cloud CLI and dbt Power User! Now you can get the benefits of local development _and_ dbt Cloud working together.
+
+Be on the lookout for the following enhancements to dbt Cloud CLI:
+- Deeper integration with dbt Explorer for visual interaction
+- Support for invoking production jobs directly from the CLI
+- Continued optimization for performance and scalability improvements
+
+
+
+
diff --git a/website/docs/guides/teradata-qs.md b/website/docs/guides/teradata-qs.md
new file mode 100644
index 00000000000..da951620515
--- /dev/null
+++ b/website/docs/guides/teradata-qs.md
@@ -0,0 +1,400 @@
+---
+title: "Quickstart for dbt Cloud and Teradata"
+id: "teradata"
+level: 'Beginner'
+icon: 'teradata'
+tags: ['dbt Cloud','Quickstart','Teradata']
+hide_table_of_contents: true
+---
+
+
+
+## Introduction
+
+In this quickstart guide, you'll learn how to use dbt Cloud with Teradata Vantage. It will show you how to:
+
+- Create a new Teradata ClearScape instance.
+- Load sample data into your Teradata database.
+- Connect dbt Cloud to Teradata.
+- Take a sample query and turn it into a model in your dbt project. A model in dbt is a select statement.
+- Add tests to your models.
+- Document your models.
+- Schedule a job to run.
+
+:::tip Videos for you
+You can check out [dbt Fundamentals](https://learn.getdbt.com/courses/dbt-fundamentals) for free if you're interested in course learning with videos.
+:::
+
+### Prerequisites
+
+- You have a [dbt Cloud account](https://www.getdbt.com/signup/).
+- You have access to a Teradata Vantage instance. You can provision one for free at https://clearscape.teradata.com. See [the ClearScape Analytics Experience guide](https://developers.teradata.com/quickstarts/get-access-to-vantage/clearscape-analytics-experience/getting-started-with-csae/) for details.
+
+### Related content
+
+- Learn more with [dbt Learn courses](https://learn.getdbt.com)
+- [How we provision Teradata Clearscape Vantage instance](https://developers.teradata.com/quickstarts/get-access-to-vantage/clearscape-analytics-experience/getting-started-with-csae/)
+- [CI jobs](/docs/deploy/continuous-integration)
+- [Deploy jobs](/docs/deploy/deploy-jobs)
+- [Job notifications](/docs/deploy/job-notifications)
+- [Source freshness](/docs/deploy/source-freshness)
+
+## Load data
+
+The following steps will guide you through making the data, stored as CSV files in a public S3 bucket, available through foreign tables in your database.
+
+:::tip SQL IDE
+
+If you created your Teradata Vantage database instance at https://clearscape.teradata.com and you don't have an SQL IDE handy, use the JupyterLab bundled with your database to execute SQL:
+
+1. Navigate to [ClearScape Analytics Experience dashboard](https://clearscape.teradata.com/dashboard) and click the **Run Demos** button. The demo will launch JupyterLab.
+
+2. In JupyterLab, go to **Launcher** by clicking the blue **+** icon in the top left corner. Find the **Notebooks** section and click **Teradata SQL**.
+
+3. In the notebook's first cell, connect to the database using `connect` magic. You will be prompted to enter your database password when you execute it:
+ ```ipynb
+ %connect local
+ ```
+4. Use additional cells to type and run SQL statements.
+
+:::
+
+1. Use your preferred SQL IDE to create two databases: `jaffle_shop` and `stripe`:
+
+ ```sql
+ CREATE DATABASE jaffle_shop AS PERM = 1e9;
+ CREATE DATABASE stripe AS PERM = 1e9;
+ ```
+
+2. In the `jaffle_shop` and `stripe` databases, create three foreign tables that reference the respective CSV files located in object storage:
+
+ ```sql
+ CREATE FOREIGN TABLE jaffle_shop.customers (
+ id integer,
+ first_name varchar (100),
+ last_name varchar (100)
+ )
+ USING (
+ LOCATION ('/s3/dbt-tutorial-public.s3.amazonaws.com/jaffle_shop_customers.csv')
+ )
+ NO PRIMARY INDEX;
+
+ CREATE FOREIGN TABLE jaffle_shop.orders (
+ id integer,
+ user_id integer,
+ order_date date,
+ status varchar(100)
+ )
+ USING (
+ LOCATION ('/s3/dbt-tutorial-public.s3.amazonaws.com/jaffle_shop_orders.csv')
+ )
+ NO PRIMARY INDEX;
+
+ CREATE FOREIGN TABLE stripe.payment (
+ id integer,
+ orderid integer,
+ paymentmethod varchar (100),
+ status varchar (100),
+ amount integer,
+ created date
+ )
+ USING (
+ LOCATION ('/s3/dbt-tutorial-public.s3.amazonaws.com/stripe_payments.csv')
+ )
+ NO PRIMARY INDEX;
+ ```
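+
+To verify the foreign tables resolve, you can run a quick sanity check against one of them. For example:
+
+```sql
+SELECT COUNT(*) FROM jaffle_shop.customers;
+```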
+
+## Connect dbt Cloud to Teradata
+
+1. Create a new project in dbt Cloud. From **Account settings** (using the gear menu in the top right corner), click **New Project**.
+2. Enter a project name and click **Continue**.
+3. In **Configure your development environment**, click **Add new connection**.
+4. Select **Teradata**, fill in all the required details in the **Settings** section, and test the connection.
+
+
+
+
+
+5. Enter your **Development Credentials** for Teradata with:
+    * **Username** — The username of your Teradata database.
+    * **Password** — The password of your Teradata database.
+    * **Schema** — The initial database to use after login.
+
+
+
+6. Click **Test Connection** to verify that dbt Cloud can access your Teradata Vantage instance.
+7. If the connection test succeeds, click **Next**. If it fails, check your Teradata settings and credentials.
+
+## Set up a dbt Cloud managed repository
+
+
+
+## Initialize your dbt project and start developing
+
+Now that you have a repository configured, you can initialize your project and start development in dbt Cloud:
+
+1. Click **Start developing in the IDE**. It might take a few minutes for your project to spin up for the first time as it establishes your git connection, clones your repo, and tests the connection to the warehouse.
+2. Above the file tree to the left, click **Initialize your project** to build out your folder structure with example models.
+3. Make your initial commit by clicking **Commit and sync**. Use the commit message `initial commit` to create the first commit to your managed repo. Once you’ve created the commit, you can open a branch to add new dbt code.
+4. You can now directly query data from your warehouse and execute `dbt run`. You can try this out now:
+ - Click **Create new file**, add this query to the new file, and click **Save as** to save the new file:
+ ```sql
+ select * from jaffle_shop.customers
+ ```
+ - In the command line bar at the bottom, enter `dbt run` and click **Enter**. You should see a `dbt run succeeded` message.
+
+## Build your first model
+
+You have two options for working with files in the dbt Cloud IDE:
+
+- Create a new branch (recommended) — Create a new branch to edit and commit your changes. Navigate to **Version Control** on the left sidebar and click **Create branch**.
+- Edit in the protected primary branch — If you prefer, you can edit, format, or lint files and execute dbt commands directly in your primary git branch. Because the dbt Cloud IDE prevents commits to the protected branch, you will be prompted to commit your changes to a new branch.
+
+Name the new branch `add-customers-model`.
+
+1. Click the **...** next to the `models` directory, then select **Create file**.
+2. Name the file `customers.sql`, then click **Create**.
+3. Copy the following query into the file and click **Save**.
+
+```sql
+
+with customers as (
+
+ select
+ id as customer_id,
+ first_name,
+ last_name
+
+ from jaffle_shop.customers
+
+),
+
+orders as (
+
+ select
+ id as order_id,
+ user_id as customer_id,
+ order_date,
+ status
+
+ from jaffle_shop.orders
+
+),
+
+customer_orders as (
+
+ select
+ customer_id,
+
+ min(order_date) as first_order_date,
+ max(order_date) as most_recent_order_date,
+ count(order_id) as number_of_orders
+
+ from orders
+
+ group by 1
+
+),
+
+final as (
+
+ select
+ customers.customer_id,
+ customers.first_name,
+ customers.last_name,
+ customer_orders.first_order_date,
+ customer_orders.most_recent_order_date,
+ coalesce(customer_orders.number_of_orders, 0) as number_of_orders
+
+ from customers
+
+ left join customer_orders using (customer_id)
+
+)
+
+select * from final
+
+```
+
+4. Enter `dbt run` in the command prompt at the bottom of the screen. You should get a successful run and see the three models.
+
+You can connect your business intelligence (BI) tools to these views and tables so they read cleaned-up data rather than raw data.
+
+## Change the way your model is materialized
+
+
+
+## Delete the example models
+
+
+
+## Build models on top of other models
+
+
+
+1. Create a new SQL file, `models/stg_customers.sql`, with the SQL from the `customers` CTE in your original query.
+2. Create a second new SQL file, `models/stg_orders.sql`, with the SQL from the `orders` CTE in your original query.
+
+   In `models/stg_customers.sql`:
+
+   ```sql
+ select
+ id as customer_id,
+ first_name,
+ last_name
+
+ from jaffle_shop.customers
+ ```
+
+   In `models/stg_orders.sql`:
+
+   ```sql
+ select
+ id as order_id,
+ user_id as customer_id,
+ order_date,
+ status
+
+ from jaffle_shop.orders
+ ```
+
+
+
+3. Edit the SQL in your `models/customers.sql` file as follows:
+
+
+
+ ```sql
+ with customers as (
+
+ select * from {{ ref('stg_customers') }}
+
+ ),
+
+ orders as (
+
+ select * from {{ ref('stg_orders') }}
+
+ ),
+
+ customer_orders as (
+
+ select
+ customer_id,
+
+ min(order_date) as first_order_date,
+ max(order_date) as most_recent_order_date,
+ count(order_id) as number_of_orders
+
+ from orders
+
+ group by 1
+
+ ),
+
+ final as (
+
+ select
+ customers.customer_id,
+ customers.first_name,
+ customers.last_name,
+ customer_orders.first_order_date,
+ customer_orders.most_recent_order_date,
+ coalesce(customer_orders.number_of_orders, 0) as number_of_orders
+
+ from customers
+
+ left join customer_orders using (customer_id)
+
+ )
+
+ select * from final
+
+ ```
+
+
+
+4. Execute `dbt run`.
+
+   This time, when you run `dbt run`, separate views/tables are created for `stg_customers`, `stg_orders`, and `customers`. dbt infers the order in which these models should run: because `customers` depends on `stg_customers` and `stg_orders`, dbt builds `customers` last. You don’t need to define these dependencies explicitly.
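+
+   Under the hood, each `{{ ref() }}` call compiles to the fully qualified relation dbt built for that model. For instance, if your development schema were hypothetically named `dbt_alice`, the `customers` CTE would compile to roughly:
+
+   ```sql
+   -- compiled form of: select * from {{ ref('stg_customers') }}
+   select * from dbt_alice.stg_customers
+   ```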
+
+#### FAQs {#faq-2}
+
+
+
+
+
+## Build models on top of sources
+
+Sources make it possible to name and describe the data loaded into your warehouse by your extract and load tools. By declaring these tables as sources in dbt, you can:
+- Select from source tables in your models using the `{{ source() }}` function, helping define the lineage of your data
+- Test your assumptions about your source data
+- Calculate the freshness of your source data
+
+1. Create a new YML file, `models/sources.yml`.
+2. Declare the sources by copying the following into the file and clicking **Save**.
+
+
+
+ ```yml
+ version: 2
+
+ sources:
+ - name: jaffle_shop
+ description: This is a replica of the Postgres database used by the app
+ schema: jaffle_shop
+ tables:
+ - name: customers
+ description: One record per customer.
+ - name: orders
+ description: One record per order. Includes canceled and deleted orders.
+ ```
+
+
+
+3. Edit the `models/stg_customers.sql` file to select from the `customers` table in the `jaffle_shop` source.
+
+
+
+ ```sql
+ select
+ id as customer_id,
+ first_name,
+ last_name
+
+ from {{ source('jaffle_shop', 'customers') }}
+ ```
+
+
+
+4. Edit the `models/stg_orders.sql` file to select from the `orders` table in the `jaffle_shop` source.
+
+
+
+ ```sql
+ select
+ id as order_id,
+ user_id as customer_id,
+ order_date,
+ status
+
+ from {{ source('jaffle_shop', 'orders') }}
+ ```
+
+
+
+5. Execute `dbt run`.
+
+ Your `dbt run` results will be the same as those in the previous step. Your `stg_customers` and `stg_orders`
+ models will still query from the same raw data source in Teradata. By using `source`, you can
+ test and document your raw data and also understand the lineage of your sources.
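+
+   For example, `{{ source('jaffle_shop', 'customers') }}` compiles to `jaffle_shop.customers`, so `stg_customers` still reads the same foreign table you created earlier:
+
+   ```sql
+   -- compiled form of the source() call in stg_customers.sql
+   select
+       id as customer_id,
+       first_name,
+       last_name
+
+   from jaffle_shop.customers
+   ```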
+
+
+
+
+
+
+
diff --git a/website/docs/reference/artifacts/dbt-artifacts.md b/website/docs/reference/artifacts/dbt-artifacts.md
index c38cc2768e1..b8998dba261 100644
--- a/website/docs/reference/artifacts/dbt-artifacts.md
+++ b/website/docs/reference/artifacts/dbt-artifacts.md
@@ -22,7 +22,7 @@ dbt has produced artifacts since the release of dbt-docs in v0.11.0. Starting in
### When are artifacts produced?
Most dbt commands (and corresponding RPC methods) produce artifacts:
-- [semantic manifest](/docs/dbt-cloud-apis/sl-manifest): produced whenever your dbt project is parsed
+- [semantic manifest](/reference/artifacts/sl-manifest): produced whenever your dbt project is parsed
- [manifest](/reference/artifacts/manifest-json): produced by commands that read and understand your project
- [run results](/reference/artifacts/run-results-json): produced by commands that run, compile, or catalog nodes in your DAG
- [catalog](catalog-json): produced by `docs generate`
diff --git a/website/docs/reference/artifacts/other-artifacts.md b/website/docs/reference/artifacts/other-artifacts.md
index 0216acccff0..e37662ae28c 100644
--- a/website/docs/reference/artifacts/other-artifacts.md
+++ b/website/docs/reference/artifacts/other-artifacts.md
@@ -39,7 +39,7 @@ Each of those points in time contains the `name` and `type` of each node and `su
### semantic_manifest.json
-The [`semantic_manifest.json`](/docs/dbt-cloud-apis/sl-manifest) file is useful as an internal interface between `dbt-core` and MetricFlow. As such, it functions as a behind-the-scenes bridge for interaction between the two systems. You can find all of the `semantic_manifest.json` information in the [`semantic_manifest.json`](/docs/dbt-cloud-apis/sl-manifest).
+The [`semantic_manifest.json`](/reference/artifacts/sl-manifest) file is useful as an internal interface between `dbt-core` and MetricFlow. As such, it functions as a behind-the-scenes bridge for interaction between the two systems. You can find all of the `semantic_manifest.json` information in the [`semantic_manifest.json`](/reference/artifacts/sl-manifest).
There are two reasons why `semantic_manifest.json` exists alongside `manifest.json`:
diff --git a/website/docs/docs/dbt-cloud-apis/sl-manifest.md b/website/docs/reference/artifacts/sl-manifest.md
similarity index 90%
rename from website/docs/docs/dbt-cloud-apis/sl-manifest.md
rename to website/docs/reference/artifacts/sl-manifest.md
index d5bcf5a6774..03e661841c4 100644
--- a/website/docs/docs/dbt-cloud-apis/sl-manifest.md
+++ b/website/docs/reference/artifacts/sl-manifest.md
@@ -7,26 +7,24 @@ sidebar_label: "Semantic manifest"
pagination_next: null
---
+**Produced by:** Any command that parses your project. This includes all commands _except_ [`deps`](/reference/commands/deps), [`clean`](/reference/commands/clean), [`debug`](/reference/commands/debug), and [`init`](/reference/commands/init).
+
dbt creates an [artifact](/reference/artifacts/dbt-artifacts) file called the _Semantic Manifest_ (`semantic_manifest.json`), which MetricFlow requires to build and run metric queries properly for the dbt Semantic Layer. This artifact contains comprehensive information about your dbt Semantic Layer. It is an internal file that acts as the integration point with MetricFlow.
By using the semantic manifest produced by dbt Core, MetricFlow will instantiate a data flow plan and generate SQL from Semantic Layer query requests. It's a valuable reference that you can use to understand the structure and details of your data models.
Similar to the [`manifest.json` file](/reference/artifacts/manifest-json), the `semantic_manifest.json` file also lives in the [target directory](/reference/global-configs/json-artifacts) of your dbt project where dbt stores various artifacts (such as compiled models and tests) generated during the execution of your project.
-## How it's produced
-
-Just like `manifest.json`, the `semantic_manifest.json` is produced whenever your dbt project is parsed. All dbt commands will parse your project and create a `semantic_manifest.json` file, _except_ [`deps`](/reference/commands/deps), [`clean`](/reference/commands/clean), [`debug`](/reference/commands/debug), and [`init`](/reference/commands/init).
-
-
-## Top level keys
+## Top-level keys
Top-level keys for the semantic manifest are:
- `semantic_models` — Starting points of data with entities, dimensions, and measures, and correspond to models in your dbt project.
- `metrics` — Functions combining measures, constraints, and so on to define quantitative indicators.
- `project_configuration` — Contains information around your project configurations
-
-Example target/semantic_manifest.json file
+### Example
+
+
```json
{
@@ -112,7 +110,7 @@ Top-level keys for the semantic manifest are:
}
```
-
+
## Related docs
diff --git a/website/docs/reference/node-selection/defer.md b/website/docs/reference/node-selection/defer.md
index 99dbea401b3..863494de12e 100644
--- a/website/docs/reference/node-selection/defer.md
+++ b/website/docs/reference/node-selection/defer.md
@@ -31,7 +31,7 @@ dbt test --models [...] --defer --state path/to/artifacts
When the `--defer` flag is provided, dbt will resolve `ref` calls differently depending on two criteria:
1. Is the referenced node included in the model selection criteria of the current run?
-2. Does the reference node exist as a database object in the current environment?
+2. Does the referenced node exist as a database object in the current environment?
If the answer to both is **no**—a node is not included _and_ it does not exist as a database object in the current environment—references to it will use the other namespace instead, provided by the state manifest.
@@ -71,8 +71,6 @@ group by 1
I want to test my changes. Nothing exists in my development schema, `dev_alice`.
-### test
-
+### test
+
I also have a `relationships` test that establishes referential integrity between `model_a` and `model_b`:
diff --git a/website/docs/reference/resource-configs/firebolt-configs.md b/website/docs/reference/resource-configs/firebolt-configs.md
index 394823e33de..0ab14354003 100644
--- a/website/docs/reference/resource-configs/firebolt-configs.md
+++ b/website/docs/reference/resource-configs/firebolt-configs.md
@@ -38,8 +38,8 @@ models:
+table_type: fact
+primary_index: [ , ... ]
+indexes:
- - type: aggregating
- key_column: [ , ... ]
+ - index_type: aggregating
+ key_columns: [ , ... ]
aggregation: [ , ... ]
...
```
@@ -58,8 +58,8 @@ models:
table_type: fact
primary_index: [ , ... ]
indexes:
- - type: aggregating
- key_column: [ , ... ]
+ - index_type: aggregating
+ key_columns: [ , ... ]
aggregation: [ , ... ]
...
```
@@ -77,9 +77,9 @@ models:
primary_index = [ "", ... ],
indexes = [
{
- type = "aggregating"
- key_column = [ "", ... ],
- aggregation = [ "", ... ],
+          "index_type": "aggregating",
+ "key_columns": [ "", ... ],
+ "aggregation": [ "", ... ],
},
...
]
@@ -99,8 +99,8 @@ models:
| `table_type` | Whether the materialized table will be a [fact or dimension](https://docs.firebolt.io/godocs/Overview/working-with-tables/working-with-tables.html#fact-and-dimension-tables) table. |
| `primary_index` | Sets the primary index for the fact table using the inputted list of column names from the model. Required for fact tables. |
| `indexes` | A list of aggregating indexes to create on the fact table. |
-| `type` | Specifies that the index is an [aggregating index](https://docs.firebolt.io/godocs/Guides/working-with-indexes/using-aggregating-indexes.html). Should be set to `aggregating`. |
-| `key_column` | Sets the grouping of the aggregating index using the inputted list of column names from the model. |
+| `index_type` | Specifies that the index is an [aggregating index](https://docs.firebolt.io/godocs/Guides/working-with-indexes/using-aggregating-indexes.html). Should be set to `aggregating`. |
+| `key_columns` | Sets the grouping of the aggregating index using the inputted list of column names from the model. |
| `aggregation` | Sets the aggregations on the aggregating index using the inputted list of SQL agg expressions. |
@@ -113,9 +113,9 @@ models:
primary_index = "id",
indexes = [
{
- type: "aggregating",
- key_column: "order_id",
- aggregation: ["COUNT(DISTINCT status)", "AVG(customer_id)"]
+ "index_type": "aggregating",
+ "key_columns": "order_id",
+ "aggregation": ["COUNT(DISTINCT status)", "AVG(customer_id)"]
}
]
) }}
diff --git a/website/docs/reference/resource-properties/constraints.md b/website/docs/reference/resource-properties/constraints.md
index ed38132c367..63582974040 100644
--- a/website/docs/reference/resource-properties/constraints.md
+++ b/website/docs/reference/resource-properties/constraints.md
@@ -47,7 +47,7 @@ models:
columns: [first_column, second_column, ...]
- type: foreign_key # multi_column
columns: [first_column, second_column, ...]
- to: "{{ ref('other_model_name') }}"
+ to: ref('other_model_name')
to_columns: [other_model_first_column, other_model_second_columns, ...]
- type: check
columns: [first_column, second_column, ...]
@@ -64,7 +64,7 @@ models:
- type: not_null
- type: unique
- type: foreign_key
- to: "{{ ref('other_model_name') }}"
+ to: ref('other_model_name')
to_columns: other_model_column
- type: ...
```
diff --git a/website/docusaurus.config.js b/website/docusaurus.config.js
index 82eb6df54f4..dbd389a2299 100644
--- a/website/docusaurus.config.js
+++ b/website/docusaurus.config.js
@@ -72,19 +72,21 @@ var siteSettings = {
},
announcementBar: {
id: "biweekly-demos",
- content: "Register now for Coalesce 2024 ✨ The Analytics Engineering Conference!",
+ content:
+ "Register now for Coalesce 2024 ✨ The Analytics Engineering Conference!",
backgroundColor: "#7444FD",
textColor: "#fff",
isCloseable: true,
},
announcementBarActive: true,
- announcementBarLink: "https://coalesce.getdbt.com/register/?utm_medium=internal&utm_source=docs&utm_campaign=q3-2025_coalesce-2024_aw&utm_content=coalesce____&utm_term=all_all__",
+ announcementBarLink:
+ "https://coalesce.getdbt.com/register/?utm_medium=internal&utm_source=docs&utm_campaign=q3-2025_coalesce-2024_aw&utm_content=coalesce____&utm_term=all_all__",
// Set community spotlight member on homepage
// This is the ID for a specific file under docs/community/spotlight
communitySpotlightMember: "meagan-palmer",
prism: {
theme: (() => {
- var theme = themes.nightOwl;
+ var theme = themes.nightOwl;
// Add additional rule to nightowl theme in order to change
// the color of YAML keys (to be different than values).
// There weren't many Prism themes that differentiated
@@ -200,6 +202,12 @@ var siteSettings = {
links: [
{
html: `
+
+
diff --git a/website/snippets/_auto-exposures-view.md b/website/snippets/_auto-exposures-view.md
index 95f81782cab..d30b47ae21d 100644
--- a/website/snippets/_auto-exposures-view.md
+++ b/website/snippets/_auto-exposures-view.md
@@ -1,4 +1,4 @@
-## View auto-exposures in dbt Explorer
+## View auto-exposures in dbt Explorer
After setting up auto-exposures in dbt Cloud, you can view them in dbt Explorer for a richer experience:
1. Navigate to dbt Explorer by clicking on the **Explore** link in the navigation.
diff --git a/website/snippets/_sl-course.md b/website/snippets/_sl-course.md
index 6be9ec7e959..1400be91f37 100644
--- a/website/snippets/_sl-course.md
+++ b/website/snippets/_sl-course.md
@@ -3,7 +3,7 @@
Explore our [dbt Semantic Layer on-demand course](https://learn.getdbt.com/courses/semantic-layer) to learn how to define and query metrics in your dbt project.
-Additionally, dive into mini-courses for querying the dbt Semantic Layer in your favorite tools: [Tableau](https://courses.getdbt.com/courses/tableau-querying-the-semantic-layer), [Hex](https://courses.getdbt.com/courses/hex-querying-the-semantic-layer), and [Mode](https://courses.getdbt.com/courses/mode-querying-the-semantic-layer).
+Additionally, dive into mini-courses for querying the dbt Semantic Layer in your favorite tools: [Tableau](https://courses.getdbt.com/courses/tableau-querying-the-semantic-layer), [Excel](https://learn.getdbt.com/courses/querying-the-semantic-layer-with-excel), [Hex](https://courses.getdbt.com/courses/hex-querying-the-semantic-layer), and [Mode](https://courses.getdbt.com/courses/mode-querying-the-semantic-layer).
diff --git a/website/snippets/_sl-partner-links.md b/website/snippets/_sl-partner-links.md
index 28e4dc24b39..aaefcc77747 100644
--- a/website/snippets/_sl-partner-links.md
+++ b/website/snippets/_sl-partner-links.md
@@ -54,9 +54,9 @@ The following tools integrate with the dbt Semantic Layer:
-
@@ -68,9 +68,9 @@ The following tools integrate with the dbt Semantic Layer:
-
@@ -82,9 +82,9 @@ The following tools integrate with the dbt Semantic Layer:
-
diff --git a/website/snippets/_sl-run-prod-job.md b/website/snippets/_sl-run-prod-job.md
index f820b7f3f79..318b8d27cbf 100644
--- a/website/snippets/_sl-run-prod-job.md
+++ b/website/snippets/_sl-run-prod-job.md
@@ -6,7 +6,7 @@ This section explains how you can perform a job run in your deployment environme
3. To create a new environment, navigate to **Deploy** in the navigation menu, select **Environments**, and then select **Create new environment**.
4. Fill in your deployment credentials with your Snowflake username and password. You can name the schema anything you want. Click **Save** to create your new production environment.
5. [Create a new deploy job](/docs/deploy/deploy-jobs#create-and-schedule-jobs) that runs in the environment you just created. Go back to the **Deploy** menu, select **Jobs**, select **Create job**, and click **Deploy job**.
-6. Set the job to run a `dbt parse` job to parse your projects and generate a [`semantic_manifest.json` artifact](/docs/dbt-cloud-apis/sl-manifest) file. Although running `dbt build` isn't required, you can choose to do so if needed.
+6. Set the job to run a `dbt parse` job to parse your projects and generate a [`semantic_manifest.json` artifact](/reference/artifacts/sl-manifest) file. Although running `dbt build` isn't required, you can choose to do so if needed.
7. Run the job by clicking the **Run now** button. Monitor the job's progress in real-time through the **Run summary** tab.
Once the job completes successfully, your dbt project, including the generated documentation, will be fully deployed and available for use in your production environment. If any issues arise, review the logs to diagnose and address any errors.
diff --git a/website/snippets/_snapshot-yaml-spec.md b/website/snippets/_snapshot-yaml-spec.md
index 8bbdc6be72e..cb1675ce5bd 100644
--- a/website/snippets/_snapshot-yaml-spec.md
+++ b/website/snippets/_snapshot-yaml-spec.md
@@ -1,4 +1,6 @@
:::info Use the latest snapshot syntax
-In Versionless and dbt v1.9 and later, snapshots are defined in an updated syntax using a YAML file within your `snapshots/` directory (as defined by the [`snapshot-paths` config](/reference/project-configs/snapshot-paths)). For faster and more efficient management, consider the updated snapshot YAML syntax, [available in Versionless](/docs/dbt-versions/versionless-cloud) or [dbt Core v1.9 and later](/docs/dbt-versions/core).
+In [dbt Cloud Versionless](/docs/dbt-versions/versionless-cloud) or [dbt Core v1.9 and later](/docs/dbt-versions/core), you can configure snapshots in YAML files using the updated syntax within your `snapshots/` directory (as defined by the [`snapshot-paths` config](/reference/project-configs/snapshot-paths)).
+
+This syntax allows for faster, more efficient snapshot management. To use it, upgrade to Versionless or dbt v1.9 or newer.
:::
diff --git a/website/static/img/blog/2024-10-04-iceberg-blog/2024-10-03-iceberg-support.png b/website/static/img/blog/2024-10-04-iceberg-blog/2024-10-03-iceberg-support.png
new file mode 100644
index 00000000000..2b99378fa84
Binary files /dev/null and b/website/static/img/blog/2024-10-04-iceberg-blog/2024-10-03-iceberg-support.png differ
diff --git a/website/static/img/blog/2024-10-04-iceberg-blog/iceberg_materialization.png b/website/static/img/blog/2024-10-04-iceberg-blog/iceberg_materialization.png
new file mode 100644
index 00000000000..c20e7855858
Binary files /dev/null and b/website/static/img/blog/2024-10-04-iceberg-blog/iceberg_materialization.png differ
diff --git a/website/static/img/blog/authors/luis-leon.png b/website/static/img/blog/authors/luis-leon.png
new file mode 100644
index 00000000000..ce3c09784ba
Binary files /dev/null and b/website/static/img/blog/authors/luis-leon.png differ
diff --git a/website/static/img/blog/authors/randy-pettus.png b/website/static/img/blog/authors/randy-pettus.png
new file mode 100644
index 00000000000..e3468d9aca7
Binary files /dev/null and b/website/static/img/blog/authors/randy-pettus.png differ
diff --git a/website/static/img/blog/example-features-produced.png b/website/static/img/blog/example-features-produced.png
new file mode 100644
index 00000000000..4aaa34cf3e9
Binary files /dev/null and b/website/static/img/blog/example-features-produced.png differ
diff --git a/website/static/img/blog/example-snowflake-ui.png b/website/static/img/blog/example-snowflake-ui.png
new file mode 100644
index 00000000000..86c3394bcd0
Binary files /dev/null and b/website/static/img/blog/example-snowflake-ui.png differ
diff --git a/website/static/img/blog/example-training-data-set.png b/website/static/img/blog/example-training-data-set.png
new file mode 100644
index 00000000000..085b2785f06
Binary files /dev/null and b/website/static/img/blog/example-training-data-set.png differ
diff --git a/website/static/img/cloud-cli-guide/finder-vscode-check.png b/website/static/img/cloud-cli-guide/finder-vscode-check.png
new file mode 100644
index 00000000000..ab303c00c3a
Binary files /dev/null and b/website/static/img/cloud-cli-guide/finder-vscode-check.png differ
diff --git a/website/static/img/cloud-cli-guide/setup-poweruser-01.png b/website/static/img/cloud-cli-guide/setup-poweruser-01.png
new file mode 100644
index 00000000000..e750bc34ed7
Binary files /dev/null and b/website/static/img/cloud-cli-guide/setup-poweruser-01.png differ
diff --git a/website/static/img/cloud-cli-guide/setup-poweruser-02.png b/website/static/img/cloud-cli-guide/setup-poweruser-02.png
new file mode 100644
index 00000000000..3ddb52c8407
Binary files /dev/null and b/website/static/img/cloud-cli-guide/setup-poweruser-02.png differ
diff --git a/website/static/img/cloud-cli-guide/setup-poweruser-03.png b/website/static/img/cloud-cli-guide/setup-poweruser-03.png
new file mode 100644
index 00000000000..c7baa1b9984
Binary files /dev/null and b/website/static/img/cloud-cli-guide/setup-poweruser-03.png differ
diff --git a/website/static/img/cloud-cli-guide/terminal-git-check.png b/website/static/img/cloud-cli-guide/terminal-git-check.png
new file mode 100644
index 00000000000..59ab886b47e
Binary files /dev/null and b/website/static/img/cloud-cli-guide/terminal-git-check.png differ
diff --git a/website/static/img/cloud-cli-guide/using-poweruser-01.png b/website/static/img/cloud-cli-guide/using-poweruser-01.png
new file mode 100644
index 00000000000..f24a7ac89d2
Binary files /dev/null and b/website/static/img/cloud-cli-guide/using-poweruser-01.png differ
diff --git a/website/static/img/cloud-cli-guide/using-poweruser-02.png b/website/static/img/cloud-cli-guide/using-poweruser-02.png
new file mode 100644
index 00000000000..4724540de13
Binary files /dev/null and b/website/static/img/cloud-cli-guide/using-poweruser-02.png differ
diff --git a/website/static/img/cloud-cli-guide/using-poweruser-03.png b/website/static/img/cloud-cli-guide/using-poweruser-03.png
new file mode 100644
index 00000000000..ab28a8d72b0
Binary files /dev/null and b/website/static/img/cloud-cli-guide/using-poweruser-03.png differ
diff --git a/website/static/img/cloud-cli-guide/using-poweruser-04.png b/website/static/img/cloud-cli-guide/using-poweruser-04.png
new file mode 100644
index 00000000000..7d72f4a97e7
Binary files /dev/null and b/website/static/img/cloud-cli-guide/using-poweruser-04.png differ
diff --git a/website/static/img/docs/dbt-cloud/cloud-ide/dbt-copilot-doc.gif b/website/static/img/docs/dbt-cloud/cloud-ide/dbt-copilot-doc.gif
new file mode 100644
index 00000000000..cca8db37a0a
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/cloud-ide/dbt-copilot-doc.gif differ
diff --git a/website/static/img/docs/dbt-cloud/defer-toggle.jpg b/website/static/img/docs/dbt-cloud/defer-toggle.jpg
index fdeb27c4b71..3c3abca0fc2 100644
Binary files a/website/static/img/docs/dbt-cloud/defer-toggle.jpg and b/website/static/img/docs/dbt-cloud/defer-toggle.jpg differ
diff --git a/website/static/img/docs/dbt-cloud/teradata-connection.png b/website/static/img/docs/dbt-cloud/teradata-connection.png
new file mode 100644
index 00000000000..fd2837c16ec
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/teradata-connection.png differ
diff --git a/website/static/img/docs/dbt-cloud/teradata-deployment.png b/website/static/img/docs/dbt-cloud/teradata-deployment.png
new file mode 100644
index 00000000000..e5f2b6986e0
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/teradata-deployment.png differ
diff --git a/website/static/img/teradata/dbt_cloud_teradata_account_settings.png b/website/static/img/teradata/dbt_cloud_teradata_account_settings.png
new file mode 100644
index 00000000000..c7de2425023
Binary files /dev/null and b/website/static/img/teradata/dbt_cloud_teradata_account_settings.png differ
diff --git a/website/static/img/teradata/dbt_cloud_teradata_development_credentials.png b/website/static/img/teradata/dbt_cloud_teradata_development_credentials.png
new file mode 100644
index 00000000000..762fac961ac
Binary files /dev/null and b/website/static/img/teradata/dbt_cloud_teradata_development_credentials.png differ
diff --git a/website/static/img/teradata/dbt_cloud_teradata_setup_connection_start.png b/website/static/img/teradata/dbt_cloud_teradata_setup_connection_start.png
new file mode 100644
index 00000000000..bbf4c6db380
Binary files /dev/null and b/website/static/img/teradata/dbt_cloud_teradata_setup_connection_start.png differ
diff --git a/website/vercel.json b/website/vercel.json
index e882b50d2fc..0674313f3f5 100644
--- a/website/vercel.json
+++ b/website/vercel.json
@@ -2,6 +2,31 @@
"cleanUrls": true,
"trailingSlash": false,
"redirects": [
+ {
+ "source": "/docs/dbt-cloud-apis/sl-manifest",
+ "destination": "/reference/artifacts/sl-manifest",
+ "permanent": true
+ },
+ {
+ "source": "/docs/cloud/dbt-assist-data",
+ "destination": "/docs/cloud/dbt-copilot-data",
+ "permanent": true
+ },
+ {
+ "source": "/docs/cloud/use-dbt-assist",
+ "destination": "/docs/cloud/use-dbt-copilot",
+ "permanent": true
+ },
+ {
+ "source": "/docs/cloud/enable-dbt-assist",
+ "destination": "/docs/cloud/enable-dbt-copilot",
+ "permanent": true
+ },
+ {
+ "source": "/docs/cloud/dbt-assist",
+ "destination": "/docs/cloud/dbt-copilot",
+ "permanent": true
+ },
{
"source": "/faqs/Troubleshooting/access_token_error",
"destination": "/faqs/Troubleshooting/auth-expired-error",