From 4da2077edf25c0875fefd1a574844d6d5e34d659 Mon Sep 17 00:00:00 2001
From: john-rock
Date: Fri, 27 Oct 2023 11:02:46 -0400
Subject: [PATCH] update broken markdown links

---
 website/docs/docs/build/projects.md                            | 2 +-
 .../release-notes/03-Oct-2023/product-docs-sept-rn.md          | 2 +-
 .../release-notes/09-April-2023/product-docs.md                | 2 +-
 .../10-Mar-2023/public-preview-trino-in-dbt-cloud.md           | 2 +-
 website/docs/docs/introduction.md                              | 2 +-
 .../materializations-guide-6-examining-builds.md               | 2 +-
 website/docs/guides/codespace-qs.md                            | 8 ++++----
 .../how-to-set-up-your-databricks-dbt-project.md               | 8 ++++----
 .../productionizing-your-dbt-databricks-project.md             | 2 +-
 website/docs/guides/manual-install-qs.md                       | 2 +-
 .../migrating-from-stored-procedures/2-mapping-inserts.md      | 2 +-
 .../2-setting-up-airflow-and-dbt-cloud.md                      | 2 +-
 12 files changed, 18 insertions(+), 18 deletions(-)

diff --git a/website/docs/docs/build/projects.md b/website/docs/docs/build/projects.md
index b4b04e3334d..150879da4be 100644
--- a/website/docs/docs/build/projects.md
+++ b/website/docs/docs/build/projects.md
@@ -93,4 +93,4 @@ If you want to see what a mature, production project looks like, check out the [
 ## Related docs
 * [Best practices: How we structure our dbt projects](/guides/best-practices/how-we-structure/1-guide-overview)
 * [Quickstarts for dbt Cloud](/quickstarts)
-* [Quickstart for dbt Core](/quickstarts/manual-install)
+* [Quickstart for dbt Core](/guides/manual-install)

diff --git a/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/product-docs-sept-rn.md b/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/product-docs-sept-rn.md
index e669b037d17..42a2c8daba1 100644
--- a/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/product-docs-sept-rn.md
+++ b/website/docs/docs/dbt-versions/release-notes/03-Oct-2023/product-docs-sept-rn.md
@@ -27,7 +27,7 @@ Here's what's new to [docs.getdbt.com](http://docs.getdbt.com/):

 - Deprecated dbt Core v1.0 and v1.1 from the docs.
 - Added configuration instructions for the [AWS Glue](/docs/core/connect-data-platform/glue-setup) community plugin.
-- Revised the dbt Core quickstart, making it easier to follow. Divided this guide into steps that align with the [other guides](/quickstarts/manual-install?step=1).
+- Revised the dbt Core quickstart, making it easier to follow. Divided this guide into steps that align with the [other guides](/guides/manual-install?step=1).

 ## New 📚 Guides, ✏️ blog posts, and FAQs

diff --git a/website/docs/docs/dbt-versions/release-notes/09-April-2023/product-docs.md b/website/docs/docs/dbt-versions/release-notes/09-April-2023/product-docs.md
index d30bcf85b99..faa8517cfbd 100644
--- a/website/docs/docs/dbt-versions/release-notes/09-April-2023/product-docs.md
+++ b/website/docs/docs/dbt-versions/release-notes/09-April-2023/product-docs.md
@@ -17,7 +17,7 @@ Hello from the dbt Docs team: @mirnawong1, @matthewshaver, @nghi-ly, and @runleo

 ## ☁ Cloud projects
 - Added Starburst/Trino adapter docs, including:
-  * [dbt Cloud quickstart guide](/quickstarts/starburst-galaxy),
+  * [dbt Cloud quickstart guide](/guides/starburst-galaxy),
   * [connection page](/docs/cloud/connect-data-platform/connect-starburst-trino),
   * [set up page](/docs/core/connect-data-platform/trino-setup), and [config page](/reference/resource-configs/trino-configs).
 - Enhanced [dbt Cloud jobs page](/docs/deploy/jobs) and section to include conceptual info on the queue time, improvements made around it, and about failed jobs.
diff --git a/website/docs/docs/dbt-versions/release-notes/10-Mar-2023/public-preview-trino-in-dbt-cloud.md b/website/docs/docs/dbt-versions/release-notes/10-Mar-2023/public-preview-trino-in-dbt-cloud.md
index bf3840a8b02..06abf178b8a 100644
--- a/website/docs/docs/dbt-versions/release-notes/10-Mar-2023/public-preview-trino-in-dbt-cloud.md
+++ b/website/docs/docs/dbt-versions/release-notes/10-Mar-2023/public-preview-trino-in-dbt-cloud.md
@@ -8,7 +8,7 @@ tags: [Mar-2023]

 dbt Labs is introducing the newest connection option in dbt Cloud: the `dbt-trino` adapter is now available in Public Preview. This allows you to connect to Starburst Galaxy, Starburst Enterprise, and self-hosted Trino from dbt Cloud.

-Check out our [Quickstart for dbt Cloud and Starburst Galaxy](/quickstarts/starburst-galaxy) to explore more.
+Check out our [Quickstart for dbt Cloud and Starburst Galaxy](/guides/starburst-galaxy) to explore more.

 ## What’s the reason users should be excited about this?

diff --git a/website/docs/docs/introduction.md b/website/docs/docs/introduction.md
index 0aeef0201cb..efe050d3205 100644
--- a/website/docs/docs/introduction.md
+++ b/website/docs/docs/introduction.md
@@ -43,7 +43,7 @@ Learn more about [dbt Cloud features](/docs/cloud/about-cloud/dbt-cloud-features

 ### dbt Core

-dbt Core is an open-source tool that enables data teams to transform data using analytics engineering best practices. You can install and use dbt Core on the command line. Learn more with the [quickstart for dbt Core](/quickstarts/codespace).
+dbt Core is an open-source tool that enables data teams to transform data using analytics engineering best practices. You can install and use dbt Core on the command line. Learn more with the [quickstart for dbt Core](/guides/codespace).
 ## The power of dbt

diff --git a/website/docs/guides/best-practices/materializations/materializations-guide-6-examining-builds.md b/website/docs/guides/best-practices/materializations/materializations-guide-6-examining-builds.md
index 07811b42594..ee160d2a7ad 100644
--- a/website/docs/guides/best-practices/materializations/materializations-guide-6-examining-builds.md
+++ b/website/docs/guides/best-practices/materializations/materializations-guide-6-examining-builds.md
@@ -16,7 +16,7 @@ hoverSnippet: Read this guide to understand how to examine your builds in dbt.

 ### Model Timing

-That’s where dbt Cloud’s Model Timing visualization comes in extremely handy. If we’ve set up a [Job](/quickstarts/bigquery) in dbt Cloud to run our models, we can use the Model Timing tab to pinpoint our longest-running models.
+That’s where dbt Cloud’s Model Timing visualization comes in extremely handy. If we’ve set up a [Job](/guides/bigquery) in dbt Cloud to run our models, we can use the Model Timing tab to pinpoint our longest-running models.
 ![dbt Cloud's Model Timing diagram](/img/guides/best-practices/materializations/model-timing-diagram.png)

diff --git a/website/docs/guides/codespace-qs.md b/website/docs/guides/codespace-qs.md
index f2671335ddc..f30d82457a8 100644
--- a/website/docs/guides/codespace-qs.md
+++ b/website/docs/guides/codespace-qs.md
@@ -20,10 +20,10 @@ dbt Labs provides a [GitHub Codespace](https://docs.github.com/en/codespaces/ove

 ## Related content

-- [Create a GitHub repository](/quickstarts/manual-install?step=2)
-- [Build your first models](/quickstarts/manual-install?step=3)
-- [Test and document your project](/quickstarts/manual-install?step=4)
-- [Schedule a job](/quickstarts/manual-install?step=5)
+- [Create a GitHub repository](/guides/manual-install?step=2)
+- [Build your first models](/guides/manual-install?step=3)
+- [Test and document your project](/guides/manual-install?step=4)
+- [Schedule a job](/guides/manual-install?step=5)
 - Learn more with [dbt Courses](https://courses.getdbt.com/collections)

 ## Create a codespace

diff --git a/website/docs/guides/dbt-ecosystem/databricks-guides/how-to-set-up-your-databricks-dbt-project.md b/website/docs/guides/dbt-ecosystem/databricks-guides/how-to-set-up-your-databricks-dbt-project.md
index b0be39a4273..ba66bba60d1 100644
--- a/website/docs/guides/dbt-ecosystem/databricks-guides/how-to-set-up-your-databricks-dbt-project.md
+++ b/website/docs/guides/dbt-ecosystem/databricks-guides/how-to-set-up-your-databricks-dbt-project.md
@@ -57,11 +57,11 @@ Now that the Databricks components are in place, we can configure our dbt projec

 If you are migrating an existing dbt project from the dbt-spark adapter to dbt-databricks, follow this [migration guide](https://docs.getdbt.com/guides/migration/tools/migrating-from-spark-to-databricks#migration) to switch adapters without needing to update developer credentials and other existing configs.

-If you’re starting a new dbt project, follow the steps below. For a more detailed setup flow, check out our [quickstart guide.](/quickstarts/databricks)
+If you’re starting a new dbt project, follow the steps below. For a more detailed setup flow, check out our [quickstart guide.](/guides/databricks)

 ### Connect dbt to Databricks

-First, you’ll need to connect your dbt project to Databricks so it can send transformation instructions and build objects in Unity Catalog. Follow the instructions for [dbt Cloud](/quickstarts/databricks?step=4) or [Core](https://docs.getdbt.com/reference/warehouse-setups/databricks-setup) to configure your project’s connection credentials.
+First, you’ll need to connect your dbt project to Databricks so it can send transformation instructions and build objects in Unity Catalog. Follow the instructions for [dbt Cloud](/guides/databricks?step=4) or [Core](https://docs.getdbt.com/reference/warehouse-setups/databricks-setup) to configure your project’s connection credentials.

 Each developer must generate their Databricks PAT and use the token in their development credentials. They will also specify a unique developer schema that will store the tables and views generated by dbt runs executed from their IDE. This provides isolated developer environments and ensures data access is fit for purpose.

@@ -84,7 +84,7 @@ During your first invocation of `dbt run`, dbt will create the developer schema

 Last, we need to give dbt a way to deploy code outside of development environments. To do so, we’ll use dbt [environments](https://docs.getdbt.com/docs/collaborate/environments) to define the production targets that end users will interact with.

-Core projects can use [targets in profiles](https://docs.getdbt.com/docs/core/connection-profiles#understanding-targets-in-profiles) to separate environments. [dbt Cloud environments](https://docs.getdbt.com/docs/cloud/develop-in-the-cloud#set-up-and-access-the-cloud-ide) allow you to define environments via the UI and [schedule jobs](/quickstarts/databricks#create-and-run-a-job) for specific environments.
+Core projects can use [targets in profiles](https://docs.getdbt.com/docs/core/connection-profiles#understanding-targets-in-profiles) to separate environments. [dbt Cloud environments](https://docs.getdbt.com/docs/cloud/develop-in-the-cloud#set-up-and-access-the-cloud-ide) allow you to define environments via the UI and [schedule jobs](/guides/databricks#create-and-run-a-job) for specific environments.

 Let’s set up our deployment environment:

@@ -96,7 +96,7 @@ Let’s set up our deployment environment:

 ### Connect dbt to your git repository

-Next, you’ll need somewhere to store and version control your code that allows you to collaborate with teammates. Connect your dbt project to a git repository with [dbt Cloud](/quickstarts/databricks#set-up-a-dbt-cloud-managed-repository). [Core](/quickstarts/manual-install#create-a-repository) projects will use the git CLI.
+Next, you’ll need somewhere to store and version control your code that allows you to collaborate with teammates. Connect your dbt project to a git repository with [dbt Cloud](/guides/databricks#set-up-a-dbt-cloud-managed-repository). [Core](/guides/manual-install#create-a-repository) projects will use the git CLI.
 ## Next steps

diff --git a/website/docs/guides/dbt-ecosystem/databricks-guides/productionizing-your-dbt-databricks-project.md b/website/docs/guides/dbt-ecosystem/databricks-guides/productionizing-your-dbt-databricks-project.md
index a3b4be5a051..35c5d852d74 100644
--- a/website/docs/guides/dbt-ecosystem/databricks-guides/productionizing-your-dbt-databricks-project.md
+++ b/website/docs/guides/dbt-ecosystem/databricks-guides/productionizing-your-dbt-databricks-project.md
@@ -184,5 +184,5 @@ To get the most out of both tools, you can use the [persist docs config](/refere
 - [Advanced deployments course](https://courses.getdbt.com/courses/advanced-deployment) if you want a deeper dive into these topics
 - [Autoscaling CI: The intelligent Slim CI](https://docs.getdbt.com/blog/intelligent-slim-ci)
 - [Trigger a dbt Cloud Job in your automated workflow with Python](https://discourse.getdbt.com/t/triggering-a-dbt-cloud-job-in-your-automated-workflow-with-python/2573)
-- [Databricks + dbt Cloud Quickstart Guide](/quickstarts/databricks)
+- [Databricks + dbt Cloud Quickstart Guide](/guides/databricks)
 - Reach out to your Databricks account team to get access to preview features on Databricks.

diff --git a/website/docs/guides/manual-install-qs.md b/website/docs/guides/manual-install-qs.md
index 2444cf29d7e..ee07bc846c2 100644
--- a/website/docs/guides/manual-install-qs.md
+++ b/website/docs/guides/manual-install-qs.md
@@ -15,7 +15,7 @@ When you use dbt Core to work with dbt, you will be editing files locally using

 * To use dbt Core, it's important that you know some basics of the Terminal. In particular, you should understand `cd`, `ls` and `pwd` to navigate through the directory structure of your computer easily.
 * Install dbt Core using the [installation instructions](/docs/core/installation) for your operating system.
-* Complete [Setting up (in BigQuery)](/quickstarts/bigquery?step=2) and [Loading data (BigQuery)](/quickstarts/bigquery?step=3).
+* Complete [Setting up (in BigQuery)](/guides/bigquery?step=2) and [Loading data (BigQuery)](/guides/bigquery?step=3).
 * [Create a GitHub account](https://github.com/join) if you don't already have one.

 ### Create a starter project

diff --git a/website/docs/guides/migration/tools/migrating-from-stored-procedures/2-mapping-inserts.md b/website/docs/guides/migration/tools/migrating-from-stored-procedures/2-mapping-inserts.md
index d8f31a0f14a..44e01784f04 100644
--- a/website/docs/guides/migration/tools/migrating-from-stored-procedures/2-mapping-inserts.md
+++ b/website/docs/guides/migration/tools/migrating-from-stored-procedures/2-mapping-inserts.md
@@ -13,7 +13,7 @@ INSERT INTO returned_orders (order_id, order_date, total_return)
 SELECT order_id, order_date, total FROM orders WHERE type = 'return'
 ```

-Converting this with a first pass to a [dbt model](/quickstarts/bigquery?step=8) (in a file called returned_orders.sql) might look something like:
+Converting this with a first pass to a [dbt model](/guides/bigquery?step=8) (in a file called returned_orders.sql) might look something like:

 ```sql
 SELECT

diff --git a/website/docs/guides/orchestration/airflow-and-dbt-cloud/2-setting-up-airflow-and-dbt-cloud.md b/website/docs/guides/orchestration/airflow-and-dbt-cloud/2-setting-up-airflow-and-dbt-cloud.md
index 9c3b8eb7f1b..01b15440920 100644
--- a/website/docs/guides/orchestration/airflow-and-dbt-cloud/2-setting-up-airflow-and-dbt-cloud.md
+++ b/website/docs/guides/orchestration/airflow-and-dbt-cloud/2-setting-up-airflow-and-dbt-cloud.md
@@ -77,7 +77,7 @@ Create a service token from within dbt Cloud using the instructions [found here]

 ## 6. Create a dbt Cloud job

-In your dbt Cloud account create a job, paying special attention to the information in the bullets below. Additional information for creating a dbt Cloud job can be found [here](/quickstarts/bigquery).
+In your dbt Cloud account create a job, paying special attention to the information in the bullets below. Additional information for creating a dbt Cloud job can be found [here](/guides/bigquery).

 - Configure the job with the commands that you want to include when this job kicks off, as Airflow will be referring to the job’s configurations for this rather than being explicitly coded in the Airflow DAG. This job will run a set of commands rather than a single command.
 - Ensure that the schedule is turned **off** since we’ll be using Airflow to kick things off.