diff --git a/website/docs/docs/cloud-integrations/semantic-layer/excel.md b/website/docs/docs/cloud-integrations/semantic-layer/excel.md
index 6a70217e0f7..4f76bfc5c97 100644
--- a/website/docs/docs/cloud-integrations/semantic-layer/excel.md
+++ b/website/docs/docs/cloud-integrations/semantic-layer/excel.md
@@ -37,7 +37,7 @@ import Tools from '/snippets/_sl-excel-gsheets.md';
 
 <Tools 
 type="Microsoft Excel"
-bullet_1="There's no timeout limit."
+bullet_1="There's a timeout of 1 minute for queries."
 bullet_2="If you're using this extension, make sure you're signed into Microsoft with the same Excel profile you used to set up the Add-In. Log in with one profile at a time as using multiple  profiles at once might cause issues."
 />
 
diff --git a/website/docs/docs/cloud/configure-cloud-cli.md b/website/docs/docs/cloud/configure-cloud-cli.md
index 2874e166a8f..854950f5d8c 100644
--- a/website/docs/docs/cloud/configure-cloud-cli.md
+++ b/website/docs/docs/cloud/configure-cloud-cli.md
@@ -189,3 +189,10 @@ move %USERPROFILE%\Downloads\dbt_cloud.yml %USERPROFILE%\.dbt\dbt_cloud.yml
 This command moves the `dbt_cloud.yml` from the `Downloads` folder to the `.dbt` folder. If your `dbt_cloud.yml` file is located elsewhere, adjust the path accordingly.
 
 </Expandable>
+
+<Expandable alt_header="How to skip artifacts from being downloaded">
+
+By default, [all artifacts](/reference/artifacts/dbt-artifacts) are downloaded when you execute dbt commands from the dbt Cloud CLI. To skip downloading these files, add `--download-artifacts=false` to the command you want to run. This can improve run-time performance but might break workflows that depend on assets like the [manifest](/reference/artifacts/manifest-json).
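
For example, appending the flag to a command (as the note above describes) might look like this:

```shell
# Hypothetical invocation: build without downloading run artifacts
dbt build --download-artifacts=false
```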
+
+</Expandable>
\ No newline at end of file
diff --git a/website/docs/docs/cloud/connect-data-platform/about-connections.md b/website/docs/docs/cloud/connect-data-platform/about-connections.md
index 58e6ece30a7..0149dd65a32 100644
--- a/website/docs/docs/cloud/connect-data-platform/about-connections.md
+++ b/website/docs/docs/cloud/connect-data-platform/about-connections.md
@@ -36,7 +36,6 @@ Up until July 2024, connections were nested under projects. One dbt Cloud projec
 We are rolling out an important change that moves connection management to the account level. The following connection management section describes these changes. 
 
 This feature is being rolled out in phases over the coming weeks. 
-
 :::
 
 Warehouse connections are an account-level resource. As such you can find them under **Accounts Settings** > **Connections**:
@@ -53,6 +52,10 @@ As shown in the image, a project with 2 environments can target between 1 and 2
 
 Rolling out account-level connections will not require any interruption of service in your current usage (IDE, CLI, jobs, etc.).
 
+:::info Why am I prompted to configure a development environment?
+If your project did not previously have a development environment, you may be redirected to the project setup page. Your project is still intact. Choose a connection for your new development environment, and you can view all your environments again.
+:::
+
 However, to fully utilize the value of account-level connections, you may have to rethink how you assign and use connections across projects and environments.
 
 <Lightbox src="/img/docs/dbt-cloud/cloud-configuring-dbt-cloud/connections-post-rollout.png" width="60%" title="Typical connection setup post rollout"/>
diff --git a/website/docs/docs/cloud/connect-data-platform/connnect-bigquery.md b/website/docs/docs/cloud/connect-data-platform/connnect-bigquery.md
index 7ea6e380000..0243bc619b1 100644
--- a/website/docs/docs/cloud/connect-data-platform/connnect-bigquery.md
+++ b/website/docs/docs/cloud/connect-data-platform/connnect-bigquery.md
@@ -4,6 +4,9 @@ id: connect-bigquery
 description: "Configure BigQuery connection."
 sidebar_label: "Connect BigQuery"
 ---
+
+## Authentication
+
 ### JSON keyfile
 
 :::info Uploading a service account JSON keyfile
@@ -48,3 +51,99 @@ As an end user, if your organization has set up BigQuery OAuth, you can link a p
 ## Configuration
 
 To learn how to optimize performance with data platform-specific configurations in dbt Cloud, refer to [BigQuery-specific configuration](/reference/resource-configs/bigquery-configs).
+
+### Account-level connections and credential management
+
+You can reuse connections across multiple projects with [global connections](/docs/cloud/connect-data-platform/about-connections#migration-from-project-level-connections-to-account-level-connections). Connections are attached at the environment level (formerly the project level), so you can use multiple connections within a single project (to handle dev, staging, production, and so on).
+
+BigQuery connections in dbt Cloud currently expect credentials to be handled at the connection level (this applies only to BigQuery connections). This design originally made it easy to create a new connection by uploading a service account keyfile. The following steps describe how to override credentials at the environment level, via [extended attributes](/docs/dbt-cloud-environments#extended-attributes), _allowing project administrators to manage credentials independently_ of the account-level connection details used for that environment.
+
+For a project, you will first create an environment variable to store the secret `private_key` value. Then, you will use extended attributes to override the entire service account JSON (you can't override only the secret key, due to a constraint of extended attributes).
+
+1. **New environment variable**
+
+    - Create a new _secret_ [environment variable](https://docs.getdbt.com/docs/build/environment-variables#handling-secrets) to handle the private key: `DBT_ENV_SECRET_PROJECTXXX_PRIVATE_KEY`
+    - Fill in the private key value according to the environment
+
+    To automate your deployment, use the following [admin API request](https://docs.getdbt.com/dbt-cloud/api-v3#/operations/Create%20Projects%20Environment%20Variables%20Bulk), where `XXXXX` is your account number, `YYYYY` is your project number, and `ZZZZZ` is your [API token](/docs/dbt-cloud-apis/authentication):
+
+    ```shell
+    curl --request POST \
+      --url https://cloud.getdbt.com/api/v3/accounts/XXXXX/projects/YYYYY/environment-variables/bulk/ \
+      --header 'Accept: application/json' \
+      --header 'Authorization: Bearer ZZZZZ' \
+      --header 'Content-Type: application/json' \
+      --data '{
+        "env_var": [
+          {
+            "new_name": "DBT_ENV_SECRET_PROJECTXXX_PRIVATE_KEY",
+            "project": "Value by default for the entire project",
+            "ENVIRONMENT_NAME_1": "Optional value for environment name 1",
+            "ENVIRONMENT_NAME_2": "Optional value for environment name 2"
+          }
+        ]
+      }'
+    ```
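
As a lightweight sanity check before calling the API, you can build the same payload programmatically. This is an illustrative sketch (the environment name and default values are placeholders), showing the expected shape of the request body:

```python
import json

def build_env_var_payload(name, project_default, per_env=None):
    """Build the bulk environment-variable payload shown in the curl example."""
    entry = {"new_name": name, "project": project_default}
    entry.update(per_env or {})  # optional per-environment overrides
    return {"env_var": [entry]}

payload = build_env_var_payload(
    "DBT_ENV_SECRET_PROJECTXXX_PRIVATE_KEY",
    "default-private-key",               # placeholder value
    {"Production": "prod-private-key"},  # placeholder environment override
)
print(json.dumps(payload, indent=2))
```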
+
+2. **Extended attributes**
+
+    In the environment details, complete the [extended attributes](/docs/dbt-cloud-environments#extended-attributes) block with the following payload (replacing `XXX` with your corresponding information):
+
+    ```yaml
+    keyfile_json:
+      type: service_account
+      project_id: xxx
+      private_key_id: xxx
+      private_key: '{{ env_var(''DBT_ENV_SECRET_PROJECTXXX_PRIVATE_KEY'') }}'
+      client_email: xxx
+      client_id: xxx
+      auth_uri: xxx
+      token_uri: xxx
+      auth_provider_x509_cert_url: xxx
+      client_x509_cert_url: xxx
+    ```
+
+    If you require [other fields](/docs/core/connect-data-platform/bigquery-setup#service-account-json) to be overridden at the environment level via extended attributes, please respect the [expected indentation](/docs/dbt-cloud-environments#only-the-top-level-keys-are-accepted-in-extended-attributes) (ordering doesn't matter):
+
+    ```yaml
+    priority: interactive
+    keyfile_json:
+      type: xxx
+      project_id: xxx
+      private_key_id: xxx
+      private_key: '{{ env_var(''DBT_ENV_SECRET_PROJECTXXX_PRIVATE_KEY'') }}'
+      client_email: xxx
+      client_id: xxx
+      auth_uri: xxx
+      token_uri: xxx
+      auth_provider_x509_cert_url: xxx
+      client_x509_cert_url: xxx
+    execution_project: buck-stops-here-456
+    ```
+
+    To automate your deployment, you first need to [create the extended attributes payload](https://docs.getdbt.com/dbt-cloud/api-v3#/operations/Create%20Extended%20Attributes) for a given project, and then [assign it](https://docs.getdbt.com/dbt-cloud/api-v3#/operations/Update%20Environment) to a specific environment. With `XXXXX` as your account number, `YYYYY` as your project number, and `ZZZZZ` as your [API token](/docs/dbt-cloud-apis/authentication):
+
+    ```shell
+    curl --request POST \
+      --url https://cloud.getdbt.com/api/v3/accounts/XXXXX/projects/YYYYY/extended-attributes/ \
+      --header 'Accept: application/json' \
+      --header 'Authorization: Bearer ZZZZZ' \
+      --header 'Content-Type: application/json' \
+      --data '{
+        "id": null,
+        "extended_attributes": {"type":"service_account","project_id":"xxx","private_key_id":"xxx","private_key":"{{ env_var('\''DBT_ENV_SECRET_PROJECTXXX_PRIVATE_KEY'\'') }}","client_email":"xxx","client_id":"xxx","auth_uri":"https://accounts.google.com/o/oauth2/auth","token_uri":"https://oauth2.googleapis.com/token","auth_provider_x509_cert_url":"https://www.googleapis.com/oauth2/v1/certs","client_x509_cert_url":"xxx"},
+        "state": 1
+      }'
+    ```
+    _Make a note of the `id` returned in the response._ It will be used in the following call, where `EEEEE` is the environment ID and `FFFFF` is the extended attributes ID:
+
+    ```shell
+    curl --request POST \
+    --url https://cloud.getdbt.com/api/v3/accounts/XXXXX/projects/YYYYY/environments/EEEEE/ \
+    --header 'Accept: application/json' \
+    --header 'Authorization: Bearer ZZZZZ' \
+    --header 'Content-Type: application/json' \
+    --data '{
+      "extended_attributes_id": FFFFF
+    }'
+    ```
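
The two calls above can also be scripted. This is a hypothetical sketch (account, project, environment, and token placeholders are taken from the examples above; the requests are built but never sent here) showing how the `id` from the first call feeds the second:

```python
import json
import urllib.request

BASE = "https://cloud.getdbt.com/api/v3"

def build_request(path, token, body):
    """Build (but do not send) a POST request for a dbt Cloud v3 endpoint."""
    return urllib.request.Request(
        f"{BASE}{path}",
        data=json.dumps(body).encode(),
        headers={
            "Accept": "application/json",
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Step 1: create the extended attributes for the project (payload abbreviated).
create_req = build_request(
    "/accounts/XXXXX/projects/YYYYY/extended-attributes/",
    "ZZZZZ",
    {
        "id": None,
        "extended_attributes": {
            "private_key": "{{ env_var('DBT_ENV_SECRET_PROJECTXXX_PRIVATE_KEY') }}"
        },
        "state": 1,
    },
)

# Step 2: assign the id returned by step 1 (FFFFF) to the environment (EEEEE).
assign_req = build_request(
    "/accounts/XXXXX/projects/YYYYY/environments/EEEEE/",
    "ZZZZZ",
    {"extended_attributes_id": "FFFFF"},
)

print(create_req.full_url)
print(assign_req.full_url)
```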
diff --git a/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md b/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md
index 438cb8c7981..af303d0d9a0 100644
--- a/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md
+++ b/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md
@@ -10,7 +10,6 @@ pagination_prev: null
 
 The dbt Cloud integrated development environment (IDE) is a single web-based interface for building, testing, running, and version-controlling dbt projects. It compiles dbt code into SQL and executes it directly on your database. 
 
-The dbt Cloud IDE offers several [keyboard shortcuts](/docs/cloud/dbt-cloud-ide/keyboard-shortcuts) and [editing features](/docs/cloud/dbt-cloud-ide/ide-user-interface#editing-features) for faster and efficient development and governance:
 The dbt Cloud IDE offers several [keyboard shortcuts](/docs/cloud/dbt-cloud-ide/keyboard-shortcuts) and [editing features](/docs/cloud/dbt-cloud-ide/ide-user-interface#editing-features) for faster and efficient development and governance:
 
 - Syntax highlighting for SQL &mdash; Makes it easy to distinguish different parts of your code, reducing syntax errors and enhancing readability.
diff --git a/website/docs/docs/cloud/manage-access/audit-log.md b/website/docs/docs/cloud/manage-access/audit-log.md
index 0abf54ff991..70ef4d66f8e 100644
--- a/website/docs/docs/cloud/manage-access/audit-log.md
+++ b/website/docs/docs/cloud/manage-access/audit-log.md
@@ -9,12 +9,12 @@ pagination_prev: "docs/cloud/manage-access/about-user-access"
 
 To review actions performed by people in your organization, dbt provides logs of audited user and system events in real time. The audit log appears as events happen and includes details such as who performed the action, what the action was, and when it was performed. You can use these details to troubleshoot access issues, perform security audits, or analyze specific events. 
 
-You must be an **Account Admin** to access the audit log and this feature is only available on Enterprise plans.
+You must be an **Account Admin** or an **Account Viewer** to access the audit log and this feature is only available on Enterprise plans.
 
 The dbt Cloud audit log stores all the events that occurred in your organization in real-time, including:
 
 - For events within 90 days, the dbt Cloud audit log has a selectable date range that lists events triggered.
-- For events beyond 90 days, **Account Admins** can [export all events](#exporting-logs) by using **Export All**.
+- For events beyond 90 days, **Account Admins** and **Account Viewers** can [export all events](#exporting-logs) by using **Export All**.
 
 ## Accessing the audit log
 
@@ -170,6 +170,6 @@ You can use the audit log to export all historical audit results for security, c
 
 - **For events within 90 days** &mdash; dbt Cloud will automatically display the 90-day selectable date range. Select **Export Selection** to download a CSV file of all the events that occurred in your organization within 90 days.
 
-- **For events beyond 90 days** &mdash; Select **Export All**. The Account Admin will receive an email link to download a CSV file of all the events that occurred in your organization.
+- **For events beyond 90 days** &mdash; Select **Export All**. The Account Admin or Account Viewer will receive an email link to download a CSV file of all the events that occurred in your organization.
 
 <Lightbox src="/img/docs/dbt-cloud/dbt-cloud-enterprise/audit-log-section.jpg" width="95%" title="View audit log export options"/>
diff --git a/website/docs/docs/cloud/manage-access/set-up-snowflake-oauth.md b/website/docs/docs/cloud/manage-access/set-up-snowflake-oauth.md
index 1cd24c16481..3b3b9c2d870 100644
--- a/website/docs/docs/cloud/manage-access/set-up-snowflake-oauth.md
+++ b/website/docs/docs/cloud/manage-access/set-up-snowflake-oauth.md
@@ -12,8 +12,26 @@ This guide describes a feature of the dbt Cloud Enterprise plan. If you’re int
 
 dbt Cloud Enterprise supports [OAuth authentication](https://docs.snowflake.net/manuals/user-guide/oauth-intro.html) with Snowflake. When Snowflake OAuth is enabled, users can authorize their Development credentials using Single Sign On (SSO) via Snowflake rather than submitting a username and password to dbt Cloud. If Snowflake is setup with SSO through a third-party identity provider, developers can use this method to log into Snowflake and authorize the dbt Development credentials without any additional setup.
 
-### Configuring a security integration
-To enable Snowflake OAuth, you will need to create a [security integration](https://docs.snowflake.net/manuals/sql-reference/sql/create-security-integration.html) in Snowflake to manage the OAuth connection between dbt Cloud and Snowflake.
+To set up Snowflake OAuth in dbt Cloud, admins from both dbt Cloud and Snowflake are required for the following steps:
+1. [Locate the redirect URI value](#locate-the-redirect-uri-value) in dbt Cloud.
+2. [Create a security integration](#create-a-security-integration) in Snowflake.
+3. [Configure a connection](#configure-a-connection-in-dbt-cloud) in dbt Cloud.
+
+To use Snowflake in the dbt Cloud IDE, all developers must [authenticate with Snowflake](#authorize-developer-credentials) in their profile credentials.
+
+### Locate the redirect URI value
+
+To get started, copy the connection's redirect URI from dbt Cloud:
+1. Navigate to **Account settings**
+1. Select **Projects** and choose a project from the list 
+1. Select the connection to view its details and set the **OAuth method** to "Snowflake SSO"
+1. Copy the **Redirect URI** for use in later steps
+
+<Lightbox
+	src="/img/docs/dbt-cloud/dbt-cloud-enterprise/snowflake-oauth-redirect-uri.png"
+	title="Locate the Snowflake OAuth redirect URI"
+	alt="The OAuth method and Redirect URI inputs for a Snowflake connection in dbt Cloud."
+/>
 
 ### Create a security integration
 
@@ -25,7 +43,7 @@ CREATE OR REPLACE SECURITY INTEGRATION DBT_CLOUD
   ENABLED = TRUE
   OAUTH_CLIENT = CUSTOM
   OAUTH_CLIENT_TYPE = 'CONFIDENTIAL'
-  OAUTH_REDIRECT_URI = 'https://YOUR_ACCESS_URL/complete/snowflake'
+  OAUTH_REDIRECT_URI = '<LOCATED_REDIRECT_URI>'
   OAUTH_ISSUE_REFRESH_TOKENS = TRUE
   OAUTH_REFRESH_TOKEN_VALIDITY = 7776000;
 ```
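
Once the integration exists, Snowflake can return the generated OAuth client ID and secret that dbt Cloud's connection setup asks for. Assuming the integration is named `DBT_CLOUD` as in the example above, the standard Snowflake system function is:

```sql
SELECT SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('DBT_CLOUD');
```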
@@ -42,7 +60,7 @@ CREATE OR REPLACE SECURITY INTEGRATION DBT_CLOUD
 | ENABLED  | Required |
 | OAUTH_CLIENT  | Required |
 | OAUTH_CLIENT_TYPE  | Required |
-| OAUTH_REDIRECT_URI  | Required. Use the access URL that corresponds to your server [region](/docs/cloud/about-cloud/access-regions-ip-addresses). |
+| OAUTH_REDIRECT_URI  | Required. Use the value in the [dbt Cloud account settings](#locate-the-redirect-uri-value). |
 | OAUTH_ISSUE_REFRESH_TOKENS  | Required |
 | OAUTH_REFRESH_TOKEN_VALIDITY  | Required. This configuration dictates the number of seconds that a refresh token is valid for. Use a smaller value to force users to re-authenticate with Snowflake more frequently. |
 
diff --git a/website/docs/docs/core/connect-data-platform/teradata-setup.md b/website/docs/docs/core/connect-data-platform/teradata-setup.md
index 7067104fb94..df32b07bd0e 100644
--- a/website/docs/docs/core/connect-data-platform/teradata-setup.md
+++ b/website/docs/docs/core/connect-data-platform/teradata-setup.md
@@ -67,7 +67,7 @@ To connect to Teradata Vantage from dbt, you'll need to add a [profile](https://
       password: <password>
       schema: <database-name>
       tmode: ANSI
-      threads: 1
+      threads: [optional, 1 or more]
       #optional fields
       <field-name: <field-value>
 ```
diff --git a/website/docs/docs/dbt-cloud-apis/service-tokens.md b/website/docs/docs/dbt-cloud-apis/service-tokens.md
index 1a5920fab8a..a0292f26874 100644
--- a/website/docs/docs/dbt-cloud-apis/service-tokens.md
+++ b/website/docs/docs/dbt-cloud-apis/service-tokens.md
@@ -48,7 +48,10 @@ Metadata-only service tokens authorize requests to the Discovery API.
 Semantic Layer-only service tokens authorize requests to the Semantic Layer APIs.
 
 **Job Admin**<br/>
-Job admin service tokens can authorize requests for viewing, editing, and creating environments, triggering runs, and viewing historical runs.  
+Job admin service tokens can authorize requests for viewing, editing, and creating environments, triggering runs, and viewing historical runs. 
+
+**Job Runner**<br/>
+Job runner service tokens can authorize requests for triggering runs and viewing historical runs.
 
 **Member** <br/>
 Member service tokens can authorize requests for viewing and editing resources, triggering runs, and inviting members to the account. Tokens assigned the Member permission set will have the same permissions as a Member user. For more information about Member users, see "[Self-service permissions](/docs/cloud/manage-access/self-service-permissions)".
diff --git a/website/docs/docs/dbt-versions/upgrade-dbt-version-in-cloud.md b/website/docs/docs/dbt-versions/upgrade-dbt-version-in-cloud.md
index a83ebfaadfb..35758d46afd 100644
--- a/website/docs/docs/dbt-versions/upgrade-dbt-version-in-cloud.md
+++ b/website/docs/docs/dbt-versions/upgrade-dbt-version-in-cloud.md
@@ -7,7 +7,7 @@ In dbt Cloud, both [jobs](/docs/deploy/jobs) and [environments](/docs/dbt-cloud-
 
 ## Environments
 
-Navigate to the settings page of an environment, then click **Edit**. Click the **dbt version** dropdown bar and make your selection. You can select a previous release of dbt Core or go [**Versionless**](#versionless)(recommended). Be sure to save your changes before navigating away.
+Navigate to the settings page of an environment, then click **Edit**. Click the **dbt version** dropdown bar and make your selection. You can select a previous release of dbt Core or go [**Versionless**](#versionless) (recommended). Be sure to save your changes before navigating away.
 
 <Lightbox src="/img/docs/dbt-cloud/cloud-configuring-dbt-cloud/choosing-dbt-version/example-environment-settings.png" width="90%" title="Example environment settings in dbt Cloud"/>
 
diff --git a/website/docs/docs/deploy/ci-jobs.md b/website/docs/docs/deploy/ci-jobs.md
index 12e303c3536..4cd8e4b6cf0 100644
--- a/website/docs/docs/deploy/ci-jobs.md
+++ b/website/docs/docs/deploy/ci-jobs.md
@@ -12,7 +12,8 @@ dbt Labs recommends that you create your CI job in a dedicated dbt Cloud [deploy
 
 ### Prerequisites
 - You have a dbt Cloud account. 
-- For the [Concurrent CI checks](/docs/deploy/continuous-integration#concurrent-ci-checks) and [Smart cancellation of stale builds](/docs/deploy/continuous-integration#smart-cancellation) features, your dbt Cloud account must be on the [Team or Enterprise plan](https://www.getdbt.com/pricing/).
+- For the [concurrent CI checks](/docs/deploy/continuous-integration#concurrent-ci-checks) and [smart cancellation of stale builds](/docs/deploy/continuous-integration#smart-cancellation) features, your dbt Cloud account must be on the [Team or Enterprise plan](https://www.getdbt.com/pricing/).
+- For the [compare changes](/docs/deploy/continuous-integration#compare-changes) feature, your dbt Cloud account must have access to Advanced CI. Please ask your [dbt Cloud administrator to enable](/docs/dbt-cloud-environments#account-access-to-advanced-ci-features) this for you.
 - Set up a [connection with your Git provider](/docs/cloud/git/git-configuration-in-dbt-cloud). This integration lets dbt Cloud run jobs on your behalf for job triggering.
    - If you're using a native [GitLab](/docs/cloud/git/connect-gitlab) integration, you need a paid or self-hosted account that includes support for GitLab webhooks and [project access tokens](https://docs.gitlab.com/ee/user/project/settings/project_access_tokens.html). If you're using GitLab Free, merge requests will trigger CI jobs but CI job status updates (success or failure of the job) will not be reported back to GitLab.
 
@@ -21,40 +22,48 @@ To make CI job creation easier, many options on the **CI job** page are set to d
 
 1. On your deployment environment page, click **Create job** > **Continuous integration job** to create a new CI job. 
 
-2. Options in the **Job settings** section:
+1. Options in the **Job settings** section:
     - **Job name** &mdash; Specify the name for this CI job.
     - **Description** &mdash; Provide a description about the CI job.
-    - **Environment** &mdash; By default, it’s set to the environment you created the CI job from.
+    - **Environment** &mdash; By default, it’s set to the environment you created the CI job from. Use the dropdown to change the default setting. 
+
+1. Options in the **Git trigger** section:
     - **Triggered by pull requests** &mdash; By default, it’s enabled. Every time a developer opens up a pull request or pushes a commit to an existing pull request, this job will get triggered to run.
-      - **Run on Draft Pull Request** &mdash; Enable this option if you want to also trigger the job to run every time a developer opens up a draft pull request or pushes a commit to that draft pull request. 
+      - **Run on draft pull request** &mdash; Enable this option if you want to also trigger the job to run every time a developer opens up a draft pull request or pushes a commit to that draft pull request. 
 
-3. Options in the **Execution settings** section:
+1. Options in the **Execution settings** section:
     - **Commands** &mdash; By default, it includes the `dbt build --select state:modified+` command. This informs dbt Cloud to build only new or changed models and their downstream dependents. Importantly, state comparison can only happen when there is a deferred environment selected to compare state to. Click **Add command** to add more [commands](/docs/deploy/job-commands)  that you want to be invoked when this job runs.
+    - **Run compare changes**<Lifecycle status="beta" /> &mdash; Enable this option to compare the last applied state of the production environment (if one exists) with the latest changes from the pull request, and identify what those differences are. To enable record-level comparison and primary key analysis, you must add a [primary key constraint](/reference/resource-properties/constraints) or [uniqueness test](/reference/resource-properties/data-tests#unique). Otherwise, you'll receive a "Primary key missing" error message in dbt Cloud.
+    
+      To review the comparison report, navigate to the [Compare tab](/docs/deploy/run-visibility#compare-tab) in the job run's details. A summary of the report is also available from the pull request in your Git provider (see the [CI report example](#example-ci-report)). 
     - **Compare changes against an environment (Deferral)** &mdash; By default, it’s set to the **Production** environment if you created one. This option allows dbt Cloud to check the state of the code in the PR against the code running in the deferred environment, so as to only check the modified code, instead of building the full table or the entire DAG.
 
-    :::info
-    Older versions of dbt Cloud only allow you to defer to a specific job instead of an environment. Deferral to a job compares state against the project code that was run in the deferred job's last successful run. While deferral to an environment is more efficient as dbt Cloud will compare against the project representation (which is stored in the `manifest.json`) of the last successful deploy job run that executed in the deferred environment. By considering _all_ [deploy jobs](/docs/deploy/deploy-jobs) that run in the deferred environment, dbt Cloud will get a more accurate, latest project representation state.
-    :::
+      :::info
+      Older versions of dbt Cloud only allow you to defer to a specific job instead of an environment. Deferral to a job compares state against the project code that was run in the deferred job's last successful run. Deferral to an environment is more efficient as dbt Cloud will compare against the project representation (which is stored in the `manifest.json`) of the last successful deploy job run that executed in the deferred environment. By considering _all_ [deploy jobs](/docs/deploy/deploy-jobs) that run in the deferred environment, dbt Cloud will get a more accurate, latest project representation state.
+      :::
+
+    - **Run timeout** &mdash; Cancel the CI job if the run time exceeds the timeout value. You can use this option to help ensure that a CI check doesn't consume too much of your warehouse resources. If you enable the **Run compare changes** option, the timeout value defaults to `3600` (one hour) to prevent long-running comparisons. 
 
-    - **Generate docs on run** &mdash; Enable this option if you want to [generate project docs](/docs/collaborate/build-and-view-your-docs) when this job runs. This option is disabled by default since most teams do not want to test doc generation on every CI check.
 
-4. (optional) Options in the **Advanced settings** section: 
+1. (optional) Options in the **Advanced settings** section: 
     - **Environment variables** &mdash; Define [environment variables](/docs/build/environment-variables) to customize the behavior of your project when this CI job runs. You can specify that a CI job is running in a _Staging_ or _CI_ environment by setting an environment variable and modifying your project code to behave differently, depending on the context. It's common for teams to process only a subset of data for CI runs, using environment variables to branch logic in their dbt project code.
     - **Target name** &mdash; Define the [target name](/docs/build/custom-target-names). Similar to **Environment Variables**, this option lets you customize the behavior of the project. You can use this option to specify that a CI job is running in a _Staging_ or _CI_ environment by setting the target name and modifying your project code to behave differently, depending on the context. 
-    - **Run timeout** &mdash; Cancel this CI job if the run time exceeds the timeout value. You can use this option to help ensure that a CI check doesn't consume too much of your warehouse resources.
     - **dbt version** &mdash; By default, it’s set to inherit the [dbt version](/docs/dbt-versions/core) from the environment. dbt Labs strongly recommends that you don't change the default setting. This option to change the version at the job level is useful only when you upgrade a project to the next dbt version; otherwise, mismatched versions between the environment and job can lead to confusing behavior.
     - **Threads** &mdash; By default, it’s set to 4 [threads](/docs/core/connect-data-platform/connection-profiles#understanding-threads). Increase the thread count to increase model execution concurrency.
+    - **Generate docs on run** &mdash; Enable this if you want to [generate project docs](/docs/collaborate/build-and-view-your-docs) when this job runs. This is disabled by default since testing doc generation on every CI check is not a recommended practice.
     - **Run source freshness** &mdash; Enable this option to invoke the `dbt source freshness` command before running this CI job. Refer to [Source freshness](/docs/deploy/source-freshness) for more details.
 
-### Examples
+   <Lightbox src="/img/docs/dbt-cloud/using-dbt-cloud/create-ci-job.png" width="90%" title="Example of CI Job page in the dbt Cloud UI"/>
 
-- Example of creating a CI job:
-   <Lightbox src="/img/docs/dbt-cloud/using-dbt-cloud/create-ci-job.png" title="Example of CI Job page in dbt Cloud UI"/>
+### Example of CI check in pull request {#example-ci-check}
+The following is an example of a CI check in a GitHub pull request. The green checkmark means the dbt build and tests were successful. Clicking on the dbt Cloud section takes you to the relevant CI run in dbt Cloud.
 
-- Example of GitHub pull request. The green checkmark means the dbt build and tests were successful. Clicking on the dbt Cloud section navigates you to the relevant CI run in dbt Cloud.
+<Lightbox src="/img/docs/dbt-cloud/using-dbt-cloud/example-github-pr.png" width="60%" title="Example of CI check in GitHub pull request"/>
 
-   <Lightbox src="/img/docs/dbt-cloud/using-dbt-cloud/example-github-pr.png" title="GitHub pull request example"/>
+### Example of CI report in pull request <Lifecycle status="beta" /> {#example-ci-report}
+The following is an example of a CI report in a GitHub pull request, which is shown when the **Run compare changes** option is enabled for the CI job. It displays a high-level summary of the models that changed from the pull request.
 
+<Lightbox src="/img/docs/dbt-cloud/using-dbt-cloud/example-github-ci-report.png" width="75%" title="Example of CI report comment in GitHub pull request"/>
 
 ## Trigger a CI job with the API
 
diff --git a/website/docs/docs/deploy/continuous-integration.md b/website/docs/docs/deploy/continuous-integration.md
index fbe93e084b6..e033fc16fb7 100644
--- a/website/docs/docs/deploy/continuous-integration.md
+++ b/website/docs/docs/deploy/continuous-integration.md
@@ -30,11 +30,7 @@ dbt Cloud deletes the temporary schema from your <Term id="data-warehouse" /> w
 
 The [dbt Cloud scheduler](/docs/deploy/job-scheduler) executes CI jobs differently from other deployment jobs in these important ways:
 
-- **Concurrent CI checks** &mdash; CI runs triggered by the same dbt Cloud CI job execute concurrently (in parallel), when appropriate
-- **Smart cancellation of stale builds** &mdash; Automatically cancels stale, in-flight CI runs when there are new commits to the PR
-- **Run slot treatment** &mdash; CI runs don't consume a run slot
-
-### Concurrent CI checks
+<Expandable alt_header="Concurrent CI checks">
 
 When you have teammates collaborating on the same dbt project creating pull requests on the same dbt repository, the same CI job will get triggered. Since each run builds into a dedicated, temporary schema that’s tied to the pull request, dbt Cloud can safely execute CI runs _concurrently_ instead of _sequentially_ (differing from what is done with deployment dbt Cloud jobs). Because no one needs to wait for one CI run to finish before another one can start, with concurrent CI checks, your whole team can test and integrate dbt code faster.
 
@@ -44,12 +40,35 @@ Below describes the conditions when CI checks are run concurrently and when they
 - CI runs with the _same_ PR number and _different_ commit SHAs execute serially because they’re building into the same schema. dbt Cloud will run the latest commit and cancel any older, stale commits. For details, refer to [Smart cancellation of stale builds](#smart-cancellation). 
 - CI runs with the same PR number and same commit SHA, originating from different dbt Cloud projects will execute jobs concurrently. This can happen when two CI jobs are set up in different dbt Cloud projects that share the same dbt repository.
 
-### Smart cancellation of stale builds {#smart-cancellation}
+</Expandable>
+
+<Expandable alt_header="Smart cancellation of stale builds">
 
 When you push a new commit to a PR, dbt Cloud enqueues a new CI run for the latest commit and cancels any CI run that is (now) stale and still in flight. This can happen when you’re pushing new commits while a CI build is still in process and not yet done. By cancelling runs in a safe and deliberate way, dbt Cloud helps improve productivity and reduce data platform spend on wasteful CI runs.
 
 <Lightbox src="/img/docs/dbt-cloud/using-dbt-cloud/example-smart-cancel-job.png" width="70%" title="Example of an automatically canceled run"/>
 
-### Run slot treatment <Lifecycle status="team,enterprise" />
+</Expandable>
+
+<Expandable alt_header="Run slot treatment" lifecycle="team,enterprise">
 
 CI runs don't consume run slots. This guarantees a CI check will never block a production run.
+
+</Expandable>
+
+<Expandable alt_header="Compare changes" lifecycle="beta" >
+
+For CI jobs that have the **Run compare changes** option enabled, dbt Cloud compares the last applied state of the production environment (defaulting to deferral for lower computation costs) against the latest changes from the pull request whenever the pull request is opened or new commits are pushed. By reviewing these comparisons, you can better understand how your code changes affect the data, helping ensure you always ship the correct changes to production and create trusted data products.
+
+:::info Beta feature
+
+The compare changes feature is currently in limited beta for select accounts. If you're interested in getting access or learning more, stay tuned for updates.
+
+:::
+
+dbt reports the comparison differences:
+
+- **In dbt Cloud** &mdash; Shows the changes (if any) to the data's primary keys, rows, and columns. To learn more, refer to the [Compare tab](/docs/deploy/run-visibility#compare-tab) in the [Job run details](/docs/deploy/run-visibility#job-run-details). 
+- **In the pull request from your Git provider** &mdash; Shows a summary of the changes as a comment on the pull request.
+
+</Expandable>
\ No newline at end of file
diff --git a/website/docs/docs/deploy/run-visibility.md b/website/docs/docs/deploy/run-visibility.md
index ad7aa04986d..f169031790e 100644
--- a/website/docs/docs/deploy/run-visibility.md
+++ b/website/docs/docs/deploy/run-visibility.md
@@ -9,13 +9,13 @@ You can view the history of your runs and the model timing dashboard to help ide
 
 ## Run history
 
-The **Run history** dashboard in dbt Cloud helps you monitor the health of your dbt project. It provides a detailed overview of all of your project's job runs and empowers you with a variety of filters to help you focus on specific aspects. You can also use it to review recent runs, find errored runs, and track the progress of runs in progress. You can access it on the top navigation menu by clicking **Deploy** and then **Run history**. 
+The **Run history** dashboard in dbt Cloud helps you monitor the health of your dbt project. It provides a detailed overview of all your project's job runs and empowers you with a variety of filters that enable you to focus on specific aspects. You can also use it to review recent runs, find errored runs, and track the progress of runs in progress. You can access it from the top navigation menu by clicking **Deploy** and then **Run history**. 
 
 The dashboard displays your full run history, including job name, status, associated environment, job trigger, commit SHA, schema, and timing info. 
 
 dbt Cloud developers can access their run history for the last 365 days through the dbt Cloud user interface (UI) and API.
 
-We limit self-service retrieval of run history metadata to 365 days to improve dbt Cloud's performance.
+dbt Labs limits self-service retrieval of run history metadata to 365 days to improve dbt Cloud's performance.
 
 <Lightbox src="/img/docs/dbt-cloud/deployment/run-history.png" width="85%" title="Run history dashboard allows you to monitor the health of your dbt project and displays jobs, job status, environment, timing, and more."/>
 
@@ -29,16 +29,44 @@ An example of a completed run with a configuration for a [job completion trigger
 
 <Lightbox src="/img/docs/dbt-cloud/deployment/example-job-details.png" width="65%" title="Example of run details" />
 
-### Access logs
+### Run summary tab
 
 You can view or download in-progress and historical logs for your dbt runs. This makes it easier for the team to debug errors more efficiently.
 
 <Lightbox src="/img/docs/dbt-cloud/deployment/access-logs.gif" width="85%" title="Access logs for run steps" />
 
-### Model timing <Lifecycle status="team,enterprise" /> 
+### Lineage tab
 
-The **Model timing** dashboard displays the composition, order, and time taken by each model in a job run. The visualization appears for successful jobs and highlights the top 1% of model durations. This helps you identify bottlenecks in your runs, so you can investigate them and potentially make changes to improve their performance. 
+View the lineage graph associated with the job run to better understand the dependencies and relationships of the resources in your project. To view a node's metadata directly in [dbt Explorer](/docs/collaborate/explore-projects), double-click the node in the graph.
+
+<Lightbox src="/img/docs/collaborate/dbt-explorer/explorer-from-lineage.gif" width="85%" title="Example of accessing dbt Explorer from the Lineage tab" />
+
+### Model timing tab <Lifecycle status="team,enterprise" /> 
+
+The **Model timing** tab displays the composition, order, and time each model takes in a job run. The visualization appears for successful jobs and highlights the top 1% of model durations. This helps you identify bottlenecks in your runs so you can investigate them and potentially make changes to improve their performance. 
 
 You can find the dashboard on the [job's run details](#job-run-details). 
 
 <Lightbox src="/img/docs/dbt-cloud/model-timing.png" width="85%" title="The Model timing tab displays the top 1% of model durations and visualizes model bottlenecks" />
+
+### Artifacts tab
+
+This tab lists the artifacts generated by the job run. The files are saved and available for download.
+
+<Lightbox src="/img/docs/dbt-cloud/example-artifacts-tab.png" width="85%" title="Example of the Artifacts tab" />
+
+### Compare tab <Lifecycle status="beta"/>
+
+The **Compare** tab is shown for [CI job runs](/docs/deploy/ci-jobs) with the **Run compare changes** setting enabled. It displays details about [the changes from the comparison dbt performed](/docs/deploy/continuous-integration#compare-changes) between what's in your production environment and the pull request. To help you better visualize the differences, dbt Cloud highlights changes to your models in red (deletions) and green (inserts).
+
+From the **Modified** section, you can view the following:
+
+- **Overview** &mdash; High-level summary about the changes to the models such as the number of primary keys that were added or removed. 
+- **Primary keys** &mdash; Details about the changes to the records.
+- **Modified rows** &mdash; Details about the modified rows. Click **Show full preview** to display all columns.
+- **Columns** &mdash; Details about the changes to the columns. 
+
+To view the dependencies and relationships of the resources in your project more closely, click **View in Explorer** to launch [dbt Explorer](/docs/collaborate/explore-projects). 
+
+<Lightbox src="/img/docs/dbt-cloud/example-ci-compare-changes-tab.png" width="85%" title="Example of the Compare tab" />
+
diff --git a/website/docs/faqs/Troubleshooting/ide-session-unknown-error.md b/website/docs/faqs/Troubleshooting/ide-session-unknown-error.md
new file mode 100644
index 00000000000..4165506993c
--- /dev/null
+++ b/website/docs/faqs/Troubleshooting/ide-session-unknown-error.md
@@ -0,0 +1,19 @@
+---
+title: I'm receiving a 'Your IDE session experienced an unknown error and was terminated. Please contact support' message.
+description: "Add a repository when seeing IDE unknown error"
+sidebar_label: 'Receiving unknown error in the IDE'
+
+---
+
+If you see the following error when you launch the dbt Cloud IDE, it can happen for a few reasons, but it commonly indicates a missing repository:
+
+```shell
+Your IDE session experienced an unknown error and was terminated. Please contact support.
+```
+
+You can try to resolve this by adding a repository like a [managed repository](/docs/collaborate/git/managed-repository) or your preferred Git account. To add your Git account, navigate to **Project** > **Repository** and select your repository.
+
+
+If you're still running into this error, please contact the Support team at support@getdbt.com for help. 
diff --git a/website/docs/reference/artifacts/dbt-artifacts.md b/website/docs/reference/artifacts/dbt-artifacts.md
index 8d3e1ae29e8..c38cc2768e1 100644
--- a/website/docs/reference/artifacts/dbt-artifacts.md
+++ b/website/docs/reference/artifacts/dbt-artifacts.md
@@ -28,6 +28,8 @@ Most dbt commands (and corresponding RPC methods) produce artifacts:
 - [catalog](catalog-json): produced by `docs generate`
 - [sources](/reference/artifacts/sources-json): produced by `source freshness`
 
+When running commands from the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation), all artifacts are downloaded by default. If you want to change this behavior, refer to [How to skip artifacts from being downloaded](/docs/cloud/configure-cloud-cli#how-to-skip-artifacts-from-being-downloaded).
+
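+For example, to skip the artifact download when invoking a command from the dbt Cloud CLI (an illustrative sketch; substitute the command you want to run):
+
+```shell
+# Skip downloading artifacts for this invocation
+dbt run --download-artifacts=false
+```
+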
 ## Where are artifacts produced?
 
 By default, artifacts are written to the `/target` directory of your dbt project. You can configure the location using the [`target-path` flag](/reference/global-configs/json-artifacts).
diff --git a/website/docs/reference/model-configs.md b/website/docs/reference/model-configs.md
index 0746fe92036..3a93c599ea7 100644
--- a/website/docs/reference/model-configs.md
+++ b/website/docs/reference/model-configs.md
@@ -136,8 +136,8 @@ models:
     config:
       [enabled](/reference/resource-configs/enabled): true | false
       [tags](/reference/resource-configs/tags): <string> | [<string>]
-      [pre-hook](/reference/resource-configs/pre-hook-post-hook): <sql-statement> | [<sql-statement>]
-      [post-hook](/reference/resource-configs/pre-hook-post-hook): <sql-statement> | [<sql-statement>]
+      [pre_hook](/reference/resource-configs/pre-hook-post-hook): <sql-statement> | [<sql-statement>]
+      [post_hook](/reference/resource-configs/pre-hook-post-hook): <sql-statement> | [<sql-statement>]
       [database](/reference/resource-configs/database): <string>
       [schema](/reference/resource-properties/schema): <string>
       [alias](/reference/resource-configs/alias): <string>
diff --git a/website/docs/reference/project-configs/on-run-start-on-run-end.md b/website/docs/reference/project-configs/on-run-start-on-run-end.md
index e1a3d7b761a..74557839f11 100644
--- a/website/docs/reference/project-configs/on-run-start-on-run-end.md
+++ b/website/docs/reference/project-configs/on-run-start-on-run-end.md
@@ -20,7 +20,7 @@ on-run-end: sql-statement | [sql-statement]
 
 A SQL statement (or list of SQL statements) to be run at the start or end of the following commands: <OnRunCommands />
 
-`on-run-start` and `on-run-end` hooks can also call macros that return SQL statements
+`on-run-start` and `on-run-end` hooks can also [call macros](#call-a-macro-to-grant-privileges) that return SQL statements.
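+
+For example, a hypothetical `dbt_project.yml` entry that runs a grant at the end of every invocation (the statement and role name are illustrative, not prescriptive):
+
+```yaml
+# dbt_project.yml
+on-run-end:
+  - "grant usage on schema {{ target.schema }} to role reporter"
+```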
 
 ## Usage notes
 * The `on-run-end` hook has additional jinja variables available in the context — check out the [docs](/reference/dbt-jinja-functions/on-run-end-context).
diff --git a/website/docs/terms/model.md b/website/docs/terms/model.md
index c589cc196a7..83871d1339e 100644
--- a/website/docs/terms/model.md
+++ b/website/docs/terms/model.md
@@ -6,4 +6,11 @@ displayText: model
 hoverSnippet: A model is an essential building block of the DAG
 ---
 
-A model is an essential building block of the DAG that lives in a single file and contains logic that transforms data. This logic can be expressed as a SQL `select` statement or a Python dataframe operation. Models can be materialized in the warehouse in different ways &mdash; most of these materializations require models to be built in the warehouse. 
\ No newline at end of file
+A model is an essential building block of the DAG that lives in a single file and contains logic that transforms data. This logic can be expressed as a SQL `select` statement or a Python dataframe operation. Models can be materialized in the warehouse in different ways &mdash; most of these [materializations](/terms/materialization) require models to be built in the warehouse. 
+
+For more information, refer to:
+
+* [About dbt models](/docs/build/models)
+* [Quickstart guides](/guides?tags=Quickstart)
+* [Model properties](/reference/model-properties)
+* [Materializations](/reference/resource-configs/materialized)
diff --git a/website/sidebars.js b/website/sidebars.js
index a3b0cd2d8a4..ae5e05d4aae 100644
--- a/website/sidebars.js
+++ b/website/sidebars.js
@@ -911,6 +911,7 @@ const sidebarSettings = {
           label: "For models",
           items: [
             "reference/model-properties",
+            "reference/resource-properties/model_name",
             "reference/model-configs",
             "reference/resource-configs/materialized",
             "reference/resource-configs/on_configuration_change",
@@ -933,6 +934,7 @@ const sidebarSettings = {
           label: "For snapshots",
           items: [
             "reference/snapshot-properties",
+            "reference/resource-configs/snapshot_name",
             "reference/snapshot-configs",
             "reference/resource-configs/check_cols",
             "reference/resource-configs/strategy",
diff --git a/website/snippets/_adapters-trusted.md b/website/snippets/_adapters-trusted.md
index 2ac5268fc28..3594f050897 100644
--- a/website/snippets/_adapters-trusted.md
+++ b/website/snippets/_adapters-trusted.md
@@ -15,7 +15,7 @@
  <Card
     title="Athena"
     body="<ul><li><a href='/docs/cloud/connect-data-platform/connect-amazon-athena'>Set up in dbt Cloud (beta) </a><br /></li><li><a href='/docs/core/connect-data-platform/athena-setup'>Install with dbt Core </a> </li> </ul><br /><br /><a href=https://badge.fury.io/py/dbt-athena-community><img src=https://badge.fury.io/py/dbt-athena-community.svg/></a>"
-    pills='["dbt Core"]'
+    pills='["dbt Cloud", "dbt Core"]'
     icon="athena"/>
 
 <Card
diff --git a/website/snippets/_cloud-environments-info.md b/website/snippets/_cloud-environments-info.md
index 166165be855..508a7e79d54 100644
--- a/website/snippets/_cloud-environments-info.md
+++ b/website/snippets/_cloud-environments-info.md
@@ -82,7 +82,7 @@ If you're developing in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in
 #### Only the **top-level keys** are accepted in extended attributes
 This means that if you want to change a specific sub-key value, you must provide the entire top-level key as a JSON block in your resulting YAML. For example, if you want to customize a particular field within a [service account JSON](/docs/core/connect-data-platform/bigquery-setup#service-account-json) for your BigQuery connection (like 'project_id' or 'client_email'), you need to provide an override for the entire top-level `keyfile_json` main key/attribute using extended attributes. Include the sub-fields as a nested JSON block.
 
-### Git repository caching
+### Git repository caching <Lifecycle status="enterprise" />
 
 At the start of every job run, dbt Cloud clones the project's Git repository so it has the latest versions of your project's code and runs `dbt deps` to install your dependencies. 
 
@@ -101,12 +101,6 @@ To enable Git repository caching, select **Account settings** from the gear menu
 
 <Lightbox src="/img/docs/deploy/example-account-settings.png" width="85%" title="Example of the Repository caching option" />
 
-:::note
-
-This feature is only available on the dbt Cloud Enterprise plan. 
-
-:::
-
 ### Partial parsing
 
 At the start of every dbt invocation, dbt reads all the files in your project, extracts information, and constructs an internal manifest containing every object (model, source, macro, and so on). Among other things, it uses the `ref()`, `source()`, and `config()` macro calls within models to set properties, infer dependencies, and construct your project's DAG. When dbt finishes parsing your project, it stores the internal manifest in a file called `partial_parse.msgpack`. 
@@ -118,3 +112,14 @@ Partial parsing in dbt Cloud requires dbt version 1.4 or newer. The feature does
 To enable, select **Account settings** from the gear menu and enable the **Partial parsing** option.
 
 <Lightbox src="/img/docs/deploy/example-account-settings.png" width="85%" title="Example of the Partial parsing option" />
+
+### Account access to Advanced CI features <Lifecycle status="beta" />
+
+To help improve data governance and quality, you can set up automation with [CI jobs](/docs/deploy/ci-jobs) that tests code changes before merging them into production. You can also enable Advanced CI features, such as [compare changes](/docs/deploy/continuous-integration#compare-changes), that allow dbt Cloud account members to view details about the changes between what's currently in your production environment and the pull request's latest commit, providing observability into how code changes affect the data.
+
+To use Advanced CI features, your dbt Cloud account must have access to them. Ask your dbt Cloud administrator to enable Advanced CI features on your account, which they can do by selecting **Account settings** from the gear menu and choosing the **Enable account access to Advanced CI** option.
+
+<Lightbox src="/img/docs/deploy/example-account-settings.png" width="85%" title="Example of the Advanced CI option" />
+
diff --git a/website/snippets/_enterprise-permissions-table.md b/website/snippets/_enterprise-permissions-table.md
index cef68e894f5..98255c660e9 100644
--- a/website/snippets/_enterprise-permissions-table.md
+++ b/website/snippets/_enterprise-permissions-table.md
@@ -19,6 +19,7 @@ Account roles enable you to manage the dbt Cloud account and manage the account
 | Audit logs              |     R         |               |                           |                 |       R        |   R    |
 | Auth provider           |     W         |               |                           |                 |       W        |   R    |
 | Billing                 |     W         |       W       |                           |                 |                |   R    |
+| Connections             |     W         |               |                           |        W        |                |        |
 | Groups                  |     W         |               |                           |        R        |       W        |   R    |
 | Invitations             |     W         |               |                           |        W        |       W        |   R    |
 | IP restrictions         |     W         |               |                           |                 |       W        |   R    |
@@ -34,7 +35,6 @@ Account roles enable you to manage the dbt Cloud account and manage the account
  
 |Project-level permission | Account Admin | Billing admin | Project creator | Security admin | Viewer | 
 |:-------------------------|:-------------:|:-------------:|:---------------:|:--------------:|:------:| 
-| Data platform connections             |       W       |               |       W         |                |   R    |
 | Environment credentials (deployment)      |       W       |               |       W         |                |   R    |
 | Custom env. variables   |       W       |               |       W         |                |   R    |
 | Data platform configurations            |       W       |               |       W         |                |   R    |
@@ -61,6 +61,7 @@ The project roles enable you to work within the projects in various capacities.
 | Account settings         |   R   |         |      R         |           |     R     |           |             |             |          |                |             |     R      |         |
 | Auth provider            |       |         |                |           |           |           |             |             |          |                |             |            |         |
 | Billing                  |       |         |                |           |           |           |             |             |          |                |             |            |         |
+| Connections              |   R   |    R    |      R         |     R     |     R     |     R     |             |             |          |                |      R      |     R      |         |
 | Groups                   |   R   |         |      R         |     R     |     R     |           |             |             |          |                |      R      |     R      |         |
 | Invitations              |   W   |    R    |      R         |     R     |     R     |     R     |             |      R      |          |                |      R      |     R      |         |
 | Licenses                 |   W   |    R    |      R         |     R     |     R     |     R     |             |      R      |          |                |             |     R      |         |
@@ -74,7 +75,6 @@ The project roles enable you to work within the projects in various capacities.
  
 |Project-level permission  | Admin | Analyst | Database admin | Developer | Git Admin | Job admin | Job runner  | Job viewer  | Metadata <br></br> (Discovery API only) | Semantic Layer | Stakeholder | Team admin | Webhook |
 |--------------------------|:-----:|:-------:|:--------------:|:---------:|:---------:|:---------:|:-----------:|:-----------:|:--------:|:--------------:|:-----------:|:----------:|:-------:|  
-| Data platform connections              |   W   |    R    |       W        |     R     |     R     |     R     |             |             |          |                |     R       |     R      |         |
 | Environment credentials (deployment)        |   W   |    W    |       W        |     W     |     R     |     W     |             |             |          |                |     R       |     R      |         |
 | Custom env. variables    |   W   |    W    |       W        |     W     |     W     |     W     |             |      R      |          |                |     R       |     W      |         |
 | Data platform configurations            |   W   |    W    |       W        |     W     |     R     |     W     |             |             |          |                |     R       |     R      |         |
diff --git a/website/snippets/_sl-excel-gsheets.md b/website/snippets/_sl-excel-gsheets.md
index 6a356b15e94..5179b4be2dc 100644
--- a/website/snippets/_sl-excel-gsheets.md
+++ b/website/snippets/_sl-excel-gsheets.md
@@ -4,8 +4,8 @@
 <p><span>When querying your data with {props.type}:</span></p>
 
 <ul>
-  <li>It returns the data to the cell you have clicked on, and each cell where data is requested will have a note attached to it, indicating what has been queried and the timestamp.</li>
-  <li> {props.bullet_1}</li>
+  <li>It returns the data to the cell you clicked on. </li>
+  <li> {props.bullet_1}</li> 
   <li>{props.bullet_2}</li>
 </ul>
 
@@ -65,9 +65,10 @@
   <li>For time dimensions, you can use the time range selector to filter on presets or custom options. The time range selector applies only to the primary time dimension (<code>metric_time</code>). For all other time dimensions that aren't <code>metric_time</code>, you can use the "Where" option to apply filters.</li>
 </ul>
 
-#### Querying without headers or columns
+#### Other settings 
 
 <p>If you would like to just query the data values without the headers, you can optionally select the <strong>Exclude Column Names</strong> box.</p>
+<p>To return your results and keep any previously selected data below them intact, clear the <strong>Exclude Column Names</strong> box. By default, all trailing rows are cleared if they contain stale data.</p>
 
 
 
@@ -85,18 +86,11 @@
 
 <Lightbox src={ props.queryBuilder } width="25%" title="Query and save selections in the Query Builder using the arrow next to the Query button." />
 
-<p>You can also make these selections private or public:</p>
-
-<ul>
-  <li><strong>Public selections</strong> mean your inputs are available in the menu to everyone on the sheet.</li>
-  <li><strong>Private selections</strong> mean your inputs are only visible to you. Note that anyone added to the sheet can still see the data from these private selections, but they won't be able to interact with the selection in the menu or benefit from the automatic refresh.</li>
-</ul>
-
 ### Refreshing selections
 
 <p>Set your saved selections to automatically refresh every time you load the addon. You can do this by selecting <strong>Refresh on Load</strong> when creating the saved selection. When you access the addon and have saved selections that should refresh, you'll see "Loading..." in the cells that are refreshing.</p>
 
-<p>Public saved selections will refresh for anyone who edits the sheet while private selections will only update for the user who created it.</p>
+<p>Public saved selections will refresh for anyone who edits the sheet.</p>
 
 :::tip What's the difference between saved selections and saved queries?
 
diff --git a/website/static/img/docs/dbt-cloud/cloud-configuring-dbt-cloud/choosing-dbt-version/example-environment-settings.png b/website/static/img/docs/dbt-cloud/cloud-configuring-dbt-cloud/choosing-dbt-version/example-environment-settings.png
index 86bb59e9b90..02e5073fd16 100644
Binary files a/website/static/img/docs/dbt-cloud/cloud-configuring-dbt-cloud/choosing-dbt-version/example-environment-settings.png and b/website/static/img/docs/dbt-cloud/cloud-configuring-dbt-cloud/choosing-dbt-version/example-environment-settings.png differ
diff --git a/website/static/img/docs/dbt-cloud/dbt-cloud-enterprise/snowflake-oauth-redirect-uri.png b/website/static/img/docs/dbt-cloud/dbt-cloud-enterprise/snowflake-oauth-redirect-uri.png
new file mode 100644
index 00000000000..e9313ddaa48
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/dbt-cloud-enterprise/snowflake-oauth-redirect-uri.png differ
diff --git a/website/static/img/docs/dbt-cloud/example-artifacts-tab.png b/website/static/img/docs/dbt-cloud/example-artifacts-tab.png
new file mode 100644
index 00000000000..f039eea2001
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/example-artifacts-tab.png differ
diff --git a/website/static/img/docs/dbt-cloud/example-ci-compare-changes-tab.png b/website/static/img/docs/dbt-cloud/example-ci-compare-changes-tab.png
new file mode 100644
index 00000000000..2736860df3d
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/example-ci-compare-changes-tab.png differ
diff --git a/website/static/img/docs/dbt-cloud/using-dbt-cloud/create-ci-job.png b/website/static/img/docs/dbt-cloud/using-dbt-cloud/create-ci-job.png
index ba75a855233..23c18953bf1 100644
Binary files a/website/static/img/docs/dbt-cloud/using-dbt-cloud/create-ci-job.png and b/website/static/img/docs/dbt-cloud/using-dbt-cloud/create-ci-job.png differ
diff --git a/website/static/img/docs/dbt-cloud/using-dbt-cloud/example-github-ci-report.png b/website/static/img/docs/dbt-cloud/using-dbt-cloud/example-github-ci-report.png
new file mode 100644
index 00000000000..8dbfd76994d
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/using-dbt-cloud/example-github-ci-report.png differ
diff --git a/website/static/img/docs/deploy/example-account-settings.png b/website/static/img/docs/deploy/example-account-settings.png
index 12b8d9bc49f..d5e6adc2fa6 100644
Binary files a/website/static/img/docs/deploy/example-account-settings.png and b/website/static/img/docs/deploy/example-account-settings.png differ
diff --git a/website/vercel.json b/website/vercel.json
index 07c86328d6d..0d45a879f4a 100644
--- a/website/vercel.json
+++ b/website/vercel.json
@@ -3318,17 +3318,17 @@
     },
     {
       "source": "/dbt-cloud/cloud-ide/viewing-docs-in-the-ide",
-      "destination": "/docs/getting-started/develop-in-the-cloud",
+      "destination": "/docs/cloud/dbt-cloud-ide/develop-in-the-cloud",
       "permanent": true
     },
     {
       "source": "/docs/dbt-cloud/cloud-ide/ide-beta",
-      "destination": "/docs/getting-started/develop-in-the-cloud",
+      "destination": "/docs/cloud/dbt-cloud-ide/develop-in-the-cloud",
       "permanent": true
     },
     {
       "source": "/docs/running-a-dbt-project/using-the-dbt-ide",
-      "destination": "/docs/getting-started/develop-in-the-cloud",
+      "destination": "/docs/cloud/dbt-cloud-ide/develop-in-the-cloud",
       "permanent": true
     },
     {