From 6d1473150d8105dd70e30fdaff3190c01610794b Mon Sep 17 00:00:00 2001
From: Ly Nguyen
Date: Fri, 10 Nov 2023 16:21:25 -0800
Subject: [PATCH] Fix old redirects, remove duplicate steps

---
 .../dbt-databricks-unity-catalog-support.md |  2 +-
 .../docs/docs/deploy/deploy-environments.md |  4 ++--
 website/docs/docs/environments-in-dbt.md    |  2 +-
 website/docs/guides/dbt-python-snowpark.md  | 21 -------------------
 4 files changed, 4 insertions(+), 25 deletions(-)

diff --git a/website/docs/docs/dbt-versions/release-notes/24-Nov-2022/dbt-databricks-unity-catalog-support.md b/website/docs/docs/dbt-versions/release-notes/24-Nov-2022/dbt-databricks-unity-catalog-support.md
index ce702434cf3..012615e1e4e 100644
--- a/website/docs/docs/dbt-versions/release-notes/24-Nov-2022/dbt-databricks-unity-catalog-support.md
+++ b/website/docs/docs/dbt-versions/release-notes/24-Nov-2022/dbt-databricks-unity-catalog-support.md
@@ -8,6 +8,6 @@ tags: [Nov-2022, v1.1.66.15]
 
 dbt Cloud is the easiest and most reliable way to develop and deploy a dbt project. It helps remove complexity while also giving you more features and better performance. A simpler Databricks connection experience with support for Databricks’ Unity Catalog and better modeling defaults is now available for your use.
 
-For all the Databricks customers already using dbt Cloud with the dbt-spark adapter, you can now [migrate](/guides/migrate-from-spark-to-databricks) your connection to the [dbt-databricks adapter](/reference/warehouse-setups/databricks-setup) to get the benefits. [Databricks](https://www.databricks.com/blog/2022/11/17/introducing-native-high-performance-integration-dbt-cloud.html) is committed to maintaining and improving the adapter, so this integrated experience will continue to provide the best of dbt and Databricks.
+For all the Databricks customers already using dbt Cloud with the dbt-spark adapter, you can now [migrate](/guides/migrate-from-spark-to-databricks) your connection to the [dbt-databricks adapter](/docs/core/connect-data-platform/databricks-setup) to get the benefits. [Databricks](https://www.databricks.com/blog/2022/11/17/introducing-native-high-performance-integration-dbt-cloud.html) is committed to maintaining and improving the adapter, so this integrated experience will continue to provide the best of dbt and Databricks.
 
 Check out our [live blog post](https://www.getdbt.com/blog/dbt-cloud-databricks-experience/) to learn more.
diff --git a/website/docs/docs/deploy/deploy-environments.md b/website/docs/docs/deploy/deploy-environments.md
index 21308784434..650fdb1c28a 100644
--- a/website/docs/docs/deploy/deploy-environments.md
+++ b/website/docs/docs/deploy/deploy-environments.md
@@ -13,7 +13,7 @@ Deployment environments in dbt Cloud are crucial for deploying dbt jobs in produ
 A dbt Cloud project can have multiple deployment environments, providing you the flexibility and customization to tailor the execution of dbt jobs. You can use deployment environments to [create and schedule jobs](/docs/deploy/deploy-jobs#create-and-schedule-jobs), [enable continuous integration](/docs/deploy/continuous-integration), or more based on your specific needs or requirements.
 
 :::tip Learn how to manage dbt Cloud environments
-To learn different approaches to managing dbt Cloud environments and recommendations for your organization's unique needs, read [dbt Cloud environment best practices](/best-practices/environment-setup/1-env-guide-overview).
+To learn different approaches to managing dbt Cloud environments and recommendations for your organization's unique needs, read [dbt Cloud environment best practices](/guides/set-up-ci).
 :::
 
 This page reviews the different types of environments and how to configure your deployment environment in dbt Cloud.
@@ -186,7 +186,7 @@ This section allows you to determine the credentials that should be used when co
 
 ## Related docs
 
-- [dbt Cloud environment best practices](/best-practices/environment-setup/1-env-guide-overview)
+- [dbt Cloud environment best practices](/guides/set-up-ci)
 - [Deploy jobs](/docs/deploy/deploy-jobs)
 - [CI jobs](/docs/deploy/continuous-integration)
 - [Delete a job or environment in dbt Cloud](/faqs/Environments/delete-environment-job)
diff --git a/website/docs/docs/environments-in-dbt.md b/website/docs/docs/environments-in-dbt.md
index ab899b09516..f0691761dd6 100644
--- a/website/docs/docs/environments-in-dbt.md
+++ b/website/docs/docs/environments-in-dbt.md
@@ -33,7 +33,7 @@ Configure environments to tell dbt Cloud or dbt Core how to build and execute yo
 
 ## Related docs
 
-- [dbt Cloud environment best practices](/best-practices/environment-setup/1-env-guide-overview)
+- [dbt Cloud environment best practices](/guides/set-up-ci)
 - [Deployment environments](/docs/deploy/deploy-environments)
 - [About dbt Core versions](/docs/dbt-versions/core)
 - [Set Environment variables in dbt Cloud](/docs/build/environment-variables#special-environment-variables)
diff --git a/website/docs/guides/dbt-python-snowpark.md b/website/docs/guides/dbt-python-snowpark.md
index 35842eb8d91..55e6b68c172 100644
--- a/website/docs/guides/dbt-python-snowpark.md
+++ b/website/docs/guides/dbt-python-snowpark.md
@@ -67,27 +67,6 @@ Overall we are going to set up the environments, build scalable pipelines in dbt
 
 6. Finally, create a new Worksheet by selecting **+ Worksheet** in the upper right corner.
 
-
-33 1. Log in to your trial Snowflake account. You can [sign up for a Snowflake Trial Account using this form](https://signup.snowflake.com/) if you don’t have one.
-2. Ensure that your account is set up using **AWS** in the **US East (N. Virginia)**. We will be copying the data from a public AWS S3 bucket hosted by dbt Labs in the us-east-1 region. By ensuring our Snowflake environment setup matches our bucket region, we avoid any multi-region data copy and retrieval latency issues.
-
-
-3. After creating your account and verifying it from your sign-up email, Snowflake will direct you back to the UI called Snowsight.
-
-4. When Snowsight first opens, your window should look like the following, with you logged in as the ACCOUNTADMIN with demo worksheets open:
-
-
-
-5. Navigate to **Admin > Billing & Terms**. Click **Enable > Acknowledge & Continue** to enable Anaconda Python Packages to run in Snowflake.
-
-
-
-
-
-6. Finally, create a new Worksheet by selecting **+ Worksheet** in the upper right corner.
-
 ## Connect to data source
 
 We need to obtain our data source by copying our Formula 1 data into Snowflake tables from a public S3 bucket that dbt Labs hosts.
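Note: the last hunk removes a duplicated copy of the Snowflake setup steps; the surviving copy earlier in `dbt-python-snowpark.md` still walks readers through enabling Anaconda Python Packages under **Admin > Billing & Terms**. As a minimal sketch of what that setting unlocks (not part of this patch; the model name, upstream `ref`, and column are assumptions for illustration), a dbt Python model on Snowpark looks roughly like this:

```python
# models/f1_lap_times.py -- illustrative dbt Python model on Snowpark.
# Imports such as pandas resolve from Snowflake's Anaconda channel, which is
# why the guide has you enable Anaconda Python Packages in Snowsight first.
import pandas as pd

def model(dbt, session):
    # Python models on Snowflake materialize as tables (views are unsupported);
    # `packages` declares which Anaconda packages Snowflake should load.
    dbt.config(materialized="table", packages=["pandas"])

    # Hypothetical upstream staging model; dbt.ref returns a Snowpark
    # DataFrame, and to_pandas() pulls it into a pandas DataFrame.
    laps = dbt.ref("stg_f1_lap_times").to_pandas()

    # Trivial transformation: derive lap time in seconds from milliseconds.
    laps["LAP_TIME_SECONDS"] = pd.to_numeric(laps["MILLISECONDS"]) / 1000

    # dbt writes the returned DataFrame back to Snowflake as the model's table.
    return laps
```

On accounts where that setting is still disabled, Snowflake rejects Python objects that request Anaconda packages, which is why the guide front-loads the step rather than leaving it until the first `dbt run`.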