update broken markdown links
john-rock committed Oct 27, 2023
1 parent e2c7f48 commit 4da2077
Showing 12 changed files with 18 additions and 18 deletions.
2 changes: 1 addition & 1 deletion website/docs/docs/build/projects.md
@@ -93,4 +93,4 @@ If you want to see what a mature, production project looks like, check out the [
## Related docs
* [Best practices: How we structure our dbt projects](/guides/best-practices/how-we-structure/1-guide-overview)
* [Quickstarts for dbt Cloud](/quickstarts)
-* [Quickstart for dbt Core](/quickstarts/manual-install)
+* [Quickstart for dbt Core](/guides/manual-install)
@@ -27,7 +27,7 @@ Here's what's new to [docs.getdbt.com](http://docs.getdbt.com/):

- Deprecated dbt Core v1.0 and v1.1 from the docs.
- Added configuration instructions for the [AWS Glue](/docs/core/connect-data-platform/glue-setup) community plugin.
-- Revised the dbt Core quickstart, making it easier to follow. Divided this guide into steps that align with the [other guides](/quickstarts/manual-install?step=1).
+- Revised the dbt Core quickstart, making it easier to follow. Divided this guide into steps that align with the [other guides](/guides/manual-install?step=1).

## New 📚 Guides, ✏️ blog posts, and FAQs

@@ -17,7 +17,7 @@ Hello from the dbt Docs team: @mirnawong1, @matthewshaver, @nghi-ly, and @runleo
## ☁ Cloud projects

- Added Starburst/Trino adapter docs, including:
-  * [dbt Cloud quickstart guide](/quickstarts/starburst-galaxy)
+  * [dbt Cloud quickstart guide](/guides/starburst-galaxy)
* [connection page](/docs/cloud/connect-data-platform/connect-starburst-trino)
* [set up page](/docs/core/connect-data-platform/trino-setup), and [config page](/reference/resource-configs/trino-configs).
- Enhanced [dbt Cloud jobs page](/docs/deploy/jobs) and section to include conceptual info on the queue time, improvements made around it, and about failed jobs.
@@ -8,7 +8,7 @@ tags: [Mar-2023]

dbt Labs is introducing the newest connection option in dbt Cloud: the `dbt-trino` adapter is now available in Public Preview. This allows you to connect to Starburst Galaxy, Starburst Enterprise, and self-hosted Trino from dbt Cloud.

-Check out our [Quickstart for dbt Cloud and Starburst Galaxy](/quickstarts/starburst-galaxy) to explore more.
+Check out our [Quickstart for dbt Cloud and Starburst Galaxy](/guides/starburst-galaxy) to explore more.
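For reference, dbt Core users connect with the same `dbt-trino` adapter through `profiles.yml` rather than the dbt Cloud UI. A minimal sketch with placeholder values throughout (host, catalog, schema, and auth method all depend on your deployment; treat the field list as illustrative and confirm it against the Trino setup page):

```yaml
# profiles.yml — every value below is a placeholder
my_dbt_project:
  target: dev
  outputs:
    dev:
      type: trino
      method: ldap                                  # auth method varies by deployment
      user: dbt_user
      password: "{{ env_var('TRINO_PASSWORD') }}"   # keep secrets in env vars
      host: example.trino.galaxy.starburst.io
      port: 443
      database: analytics                           # the Trino catalog to build into
      schema: dbt_dev
      threads: 4
```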

## What’s the reason users should be excited about this?

2 changes: 1 addition & 1 deletion website/docs/docs/introduction.md
@@ -43,7 +43,7 @@ Learn more about [dbt Cloud features](/docs/cloud/about-cloud/dbt-cloud-features

### dbt Core

-dbt Core is an open-source tool that enables data teams to transform data using analytics engineering best practices. You can install and use dbt Core on the command line. Learn more with the [quickstart for dbt Core](/quickstarts/codespace).
+dbt Core is an open-source tool that enables data teams to transform data using analytics engineering best practices. You can install and use dbt Core on the command line. Learn more with the [quickstart for dbt Core](/guides/codespace).
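The install-and-run flow mentioned here can be sketched as follows (the adapter and project name are placeholders — pick the adapter for your warehouse):

```shell
# Create an isolated environment and install dbt Core plus one adapter
python -m venv dbt-env
source dbt-env/bin/activate
python -m pip install dbt-bigquery   # installs dbt-core as a dependency
dbt --version                        # confirm the installation
dbt init my_project                  # scaffold a new project interactively
```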

## The power of dbt

@@ -16,7 +16,7 @@ hoverSnippet: Read this guide to understand how to examine your builds in dbt.

### Model Timing

-That’s where dbt Cloud’s Model Timing visualization comes in extremely handy. If we’ve set up a [Job](/quickstarts/bigquery) in dbt Cloud to run our models, we can use the Model Timing tab to pinpoint our longest-running models.
+That’s where dbt Cloud’s Model Timing visualization comes in extremely handy. If we’ve set up a [Job](/guides/bigquery) in dbt Cloud to run our models, we can use the Model Timing tab to pinpoint our longest-running models.

![dbt Cloud's Model Timing diagram](/img/guides/best-practices/materializations/model-timing-diagram.png)

8 changes: 4 additions & 4 deletions website/docs/guides/codespace-qs.md
@@ -20,10 +20,10 @@ dbt Labs provides a [GitHub Codespace](https://docs.github.com/en/codespaces/ove

## Related content

-- [Create a GitHub repository](/quickstarts/manual-install?step=2)
-- [Build your first models](/quickstarts/manual-install?step=3)
-- [Test and document your project](/quickstarts/manual-install?step=4)
-- [Schedule a job](/quickstarts/manual-install?step=5)
+- [Create a GitHub repository](/guides/manual-install?step=2)
+- [Build your first models](/guides/manual-install?step=3)
+- [Test and document your project](/guides/manual-install?step=4)
+- [Schedule a job](/guides/manual-install?step=5)
- Learn more with [dbt Courses](https://courses.getdbt.com/collections)

## Create a codespace
@@ -57,11 +57,11 @@ Now that the Databricks components are in place, we can configure our dbt projec

If you are migrating an existing dbt project from the dbt-spark adapter to dbt-databricks, follow this [migration guide](https://docs.getdbt.com/guides/migration/tools/migrating-from-spark-to-databricks#migration) to switch adapters without needing to update developer credentials and other existing configs.

-If you’re starting a new dbt project, follow the steps below. For a more detailed setup flow, check out our [quickstart guide.](/quickstarts/databricks)
+If you’re starting a new dbt project, follow the steps below. For a more detailed setup flow, check out our [quickstart guide.](/guides/databricks)

### Connect dbt to Databricks

-First, you’ll need to connect your dbt project to Databricks so it can send transformation instructions and build objects in Unity Catalog. Follow the instructions for [dbt Cloud](/quickstarts/databricks?step=4) or [Core](https://docs.getdbt.com/reference/warehouse-setups/databricks-setup) to configure your project’s connection credentials.
+First, you’ll need to connect your dbt project to Databricks so it can send transformation instructions and build objects in Unity Catalog. Follow the instructions for [dbt Cloud](/guides/databricks?step=4) or [Core](https://docs.getdbt.com/reference/warehouse-setups/databricks-setup) to configure your project’s connection credentials.

Each developer must generate their Databricks PAT and use the token in their development credentials. They will also specify a unique developer schema that will store the tables and views generated by dbt runs executed from their IDE. This provides isolated developer environments and ensures data access is fit for purpose.
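For Core developers, those credentials live in `profiles.yml`. A minimal sketch under assumed values (workspace host, warehouse path, catalog, and schema are all placeholders; check the Databricks setup reference for the authoritative field list):

```yaml
# profiles.yml — placeholder values throughout
my_dbt_project:
  target: dev
  outputs:
    dev:
      type: databricks
      catalog: main                                 # Unity Catalog to build into
      schema: dbt_jsmith                            # this developer's isolated schema
      host: dbc-a1b2c3d4-e5f6.cloud.databricks.com
      http_path: /sql/1.0/warehouses/abc123def456
      token: "{{ env_var('DATABRICKS_TOKEN') }}"    # the developer's PAT
      threads: 4
```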

@@ -84,7 +84,7 @@ During your first invocation of `dbt run`, dbt will create the developer schema

Last, we need to give dbt a way to deploy code outside of development environments. To do so, we’ll use dbt [environments](https://docs.getdbt.com/docs/collaborate/environments) to define the production targets that end users will interact with.

-Core projects can use [targets in profiles](https://docs.getdbt.com/docs/core/connection-profiles#understanding-targets-in-profiles) to separate environments. [dbt Cloud environments](https://docs.getdbt.com/docs/cloud/develop-in-the-cloud#set-up-and-access-the-cloud-ide) allow you to define environments via the UI and [schedule jobs](/quickstarts/databricks#create-and-run-a-job) for specific environments.
+Core projects can use [targets in profiles](https://docs.getdbt.com/docs/core/connection-profiles#understanding-targets-in-profiles) to separate environments. [dbt Cloud environments](https://docs.getdbt.com/docs/cloud/develop-in-the-cloud#set-up-and-access-the-cloud-ide) allow you to define environments via the UI and [schedule jobs](/guides/databricks#create-and-run-a-job) for specific environments.
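To make the Core side concrete, here is a hedged sketch extending the developer profile above with a production target (schemas and the token variable are invented for illustration):

```yaml
my_dbt_project:
  target: dev                # default for local development
  outputs:
    dev:
      type: databricks
      catalog: main
      schema: dbt_jsmith     # per-developer schema
      host: dbc-a1b2c3d4-e5f6.cloud.databricks.com
      http_path: /sql/1.0/warehouses/abc123def456
      token: "{{ env_var('DATABRICKS_TOKEN') }}"
    prod:
      type: databricks
      catalog: main
      schema: analytics      # shared production schema
      host: dbc-a1b2c3d4-e5f6.cloud.databricks.com
      http_path: /sql/1.0/warehouses/abc123def456
      token: "{{ env_var('DATABRICKS_PROD_TOKEN') }}"
```

Deployment runs then pick the production target explicitly, for example `dbt run --target prod`.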

Let’s set up our deployment environment:

@@ -96,7 +96,7 @@

### Connect dbt to your git repository

-Next, you’ll need somewhere to store and version control your code that allows you to collaborate with teammates. Connect your dbt project to a git repository with [dbt Cloud](/quickstarts/databricks#set-up-a-dbt-cloud-managed-repository). [Core](/quickstarts/manual-install#create-a-repository) projects will use the git CLI.
+Next, you’ll need somewhere to store and version control your code that allows you to collaborate with teammates. Connect your dbt project to a git repository with [dbt Cloud](/guides/databricks#set-up-a-dbt-cloud-managed-repository). [Core](/guides/manual-install#create-a-repository) projects will use the git CLI.
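For the Core path, the git CLI flow amounts to something like this sketch (the remote URL is a placeholder):

```shell
# From the root of the dbt project: initialize, commit, and push to a new remote
git init
git add .
git commit -m "Initial dbt project"
git remote add origin git@github.com:acme-analytics/dbt-project.git   # placeholder URL
git push -u origin main
```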

## Next steps

@@ -184,5 +184,5 @@ To get the most out of both tools, you can use the [persist docs config](/refere
- [Advanced deployments course](https://courses.getdbt.com/courses/advanced-deployment) if you want a deeper dive into these topics
- [Autoscaling CI: The intelligent Slim CI](https://docs.getdbt.com/blog/intelligent-slim-ci)
- [Trigger a dbt Cloud Job in your automated workflow with Python](https://discourse.getdbt.com/t/triggering-a-dbt-cloud-job-in-your-automated-workflow-with-python/2573)
-- [Databricks + dbt Cloud Quickstart Guide](/quickstarts/databricks)
+- [Databricks + dbt Cloud Quickstart Guide](/guides/databricks)
- Reach out to your Databricks account team to get access to preview features on Databricks.
2 changes: 1 addition & 1 deletion website/docs/guides/manual-install-qs.md
@@ -15,7 +15,7 @@ When you use dbt Core to work with dbt, you will be editing files locally using

* To use dbt Core, it's important that you know some basics of the Terminal. In particular, you should understand `cd`, `ls` and `pwd` to navigate through the directory structure of your computer easily.
* Install dbt Core using the [installation instructions](/docs/core/installation) for your operating system.
-* Complete [Setting up (in BigQuery)](/quickstarts/bigquery?step=2) and [Loading data (BigQuery)](/quickstarts/bigquery?step=3).
+* Complete [Setting up (in BigQuery)](/guides/bigquery?step=2) and [Loading data (BigQuery)](/guides/bigquery?step=3).
* [Create a GitHub account](https://github.com/join) if you don't already have one.
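As a quick refresher on the three navigation commands named above:

```shell
pwd              # print the directory you are currently in
ls               # list the files and folders it contains
cd my_project    # move into a subdirectory (placeholder name)
cd ..            # move back up one level
```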

### Create a starter project
@@ -13,7 +13,7 @@ INSERT INTO returned_orders (order_id, order_date, total_return)
SELECT order_id, order_date, total FROM orders WHERE type = 'return'
```

-Converting this with a first pass to a [dbt model](/quickstarts/bigquery?step=8) (in a file called returned_orders.sql) might look something like:
+Converting this with a first pass to a [dbt model](/guides/bigquery?step=8) (in a file called returned_orders.sql) might look something like:

```sql
SELECT
    order_id,
    order_date,
    total AS total_return
FROM orders
WHERE type = 'return'
```
@@ -77,7 +77,7 @@ Create a service token from within dbt Cloud using the instructions [found here]

## 6. Create a dbt Cloud job

-In your dbt Cloud account create a job, paying special attention to the information in the bullets below. Additional information for creating a dbt Cloud job can be found [here](/quickstarts/bigquery).
+In your dbt Cloud account create a job, paying special attention to the information in the bullets below. Additional information for creating a dbt Cloud job can be found [here](/guides/bigquery).

- Configure the job with the commands that you want to include when this job kicks off, as Airflow will be referring to the job’s configurations for this rather than being explicitly coded in the Airflow DAG. This job will run a set of commands rather than a single command.
- Ensure that the schedule is turned **off** since we’ll be using Airflow to kick things off.
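Once the job exists, Airflow triggers it through the dbt Cloud API. As a hedged sketch of the underlying call (the account ID, job ID, and token variable are placeholders; endpoint shape per dbt Cloud's v2 API as we understand it):

```shell
# Trigger the job and return the run metadata as JSON
curl -s -X POST \
  -H "Authorization: Token ${DBT_CLOUD_SERVICE_TOKEN}" \
  -H "Content-Type: application/json" \
  -d '{"cause": "Triggered from Airflow"}' \
  "https://cloud.getdbt.com/api/v2/accounts/${ACCOUNT_ID}/jobs/${JOB_ID}/run/"
```

In practice, the `apache-airflow-providers-dbt-cloud` package wraps this endpoint in a `DbtCloudRunJobOperator`, so most DAGs won't call it by hand.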
