diff --git a/website/blog/2023-12-15-serverless-free-tier-data-stack-with-dlt-and-dbt-core.md b/website/blog/2023-12-15-serverless-free-tier-data-stack-with-dlt-and-dbt-core.md new file mode 100644 index 00000000000..7e63b6e1c6d --- /dev/null +++ b/website/blog/2023-12-15-serverless-free-tier-data-stack-with-dlt-and-dbt-core.md @@ -0,0 +1,160 @@ +--- +title: Serverless, free-tier data stack with dlt + dbt Core +description: "In this article, Euan shares his personal project to fetch property price data during his and his partner's house-hunting process, and how he created a serverless free-tier data stack by using Google Cloud Functions to run data ingestion tool dlt alongside dbt for transformation." +slug: serverless-dlt-dbt-stack + +authors: [euan_johnston] + +hide_table_of_contents: false + +date: 2023-12-15 +is_featured: false +--- + + + +## The problem, the builder and tooling + +**The problem:** My partner and I are considering buying a property in Portugal. There is no reference data for the real estate market here - how many houses are being sold, for what price? Nobody knows except the property office and maybe the banks, and they don’t readily divulge this information. The only data source we have is Idealista, a portal where real estate agencies post ads. + +Unfortunately, there are significantly fewer properties than ads - it seems many real estate companies re-post the same ads as others, with intentionally different data and often misleading bits of info. The real estate agencies do this so that interested parties reach out to them for clarification, and from there they can start a sales process. At the same time, the website with the ads is incentivised to allow this to continue, as it gets paid per ad, not per property. + +**The builder:** I’m a data freelancer who deploys end-to-end solutions, so when I have a data problem, I cannot just let it go.
+ +**The tools:** I want to be able to run my project on [Google Cloud Functions](https://cloud.google.com/functions) due to the generous free tier. [dlt](https://dlthub.com/) is a new Python library for declarative data ingestion which I have wanted to test for some time. Finally, I will use dbt Core for transformation. + +## The starting point + +If I want reliable information on the state of the market, I will need to: + +- Grab the messy data from Idealista and historize it. +- Deduplicate existing listings. +- Try to infer which listings sold, and for how much. + +Once I have deduplicated listings with some online history, I can get an idea of: + +- How expensive different properties are. +- How fast they sell, hopefully a signal of whether they are “worth it” or not. + +## Towards a solution + +The solution has pretty standard components: + +- An EtL pipeline. The little t stands for normalisation, such as transforming strings to dates or unpacking nested structures. This is handled by dlt functions written in Python. +- A transformation layer taking the source data loaded by my dlt functions and creating the necessary tables, handled by dbt. +- Due to the complexity of deduplication, I added a human review step in Google Sheets to confirm the deduplication. + +These elements are reflected in the diagram below and clarified in greater detail later in the article: + + + +### Ingesting the data + +For ingestion, I use a couple of sources: + +First, I ingest home listings from the Idealista API, accessed through [API Dojo's freemium wrapper](https://rapidapi.com/apidojo/api/idealista2). The dlt pipeline I created for ingestion is in [this repo](https://github.com/euanjohnston-dev/Idealista_pipeline). + +After an initial round of transformation (described in the next section), the deduplicated data is loaded into BigQuery, where I can query it from the Google Sheets client and manually review the deduplication.
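Generating the candidate duplicates for that manual review is a standard windowing pattern in the dbt layer. A rough sketch of the idea, with illustrative model and column names rather than the actual package:

```sql
-- Illustrative dbt model: rank listings that share key attributes, so the
-- human reviewer only has to confirm groupings rather than hunt for them.
with ranked as (

    select
        *,
        row_number() over (
            partition by location, rooms, surface_area
            order by first_seen_at
        ) as duplicate_rank

    from {{ ref('stg_idealista__listings') }}

)

select * from ranked
```

Anything with `duplicate_rank > 1` becomes a candidate row for the Google Sheets review.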
+ +When I'm happy with the results, I use the [ready-made dlt Sheets source connector](https://dlthub.com/docs/dlt-ecosystem/verified-sources/google_sheets) to pull the data back into BigQuery, [as defined here](https://github.com/euanjohnston-dev/gsheets_check_pipeline). + +### Transforming the data + +For transformation I use my favorite solution, dbt Core. For running and orchestrating dbt on Cloud Functions, I use dlt’s dbt Core runner. The benefit of the runner in this context is that I can re-use the same credential setup, instead of creating a separate profiles.yml file. + +This is the package I created: + +### Production-readying the pipeline + +To make the pipeline more “production ready”, I made some improvements: + +- Using a credential store instead of hard-coding passwords, in this case Google Secret Manager. +- Getting notified when the pipeline runs and what the outcome is. For this, I send data to Slack via a dlt decorator that posts the error on failure and the metadata on success. + +```python
import functools

from dlt.common.runtime.slack import send_slack_message

def notify_on_completion(hook):
    def decorator(func):
        # preserve func.__name__ so the Slack messages name the real function
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                load_info = func(*args, **kwargs)
                message = f"Function {func.__name__} completed successfully. Load info: {load_info}"
                send_slack_message(hook, message)
                return load_info
            except Exception as e:
                message = f"Function {func.__name__} failed. Error: {str(e)}"
                send_slack_message(hook, message)
                raise
        return wrapper
    return decorator
``` + +## The outcome + +The outcome was first and foremost a visualisation highlighting the unique properties available in my specific area of search. The map shown on the left of the page gives a live overview of location, number of duplicates (bubble size) and price (bubble colour), which, amongst other features, can be filtered using the sliders on the right. This is a far less cluttered view from which to observe the actual inventory available.
+ + +Further charts highlight additional metrics which, now that deduplication is complete, can be accurately measured. Most importantly, these include the development over time of “average price/square metre” and the properties inferred to have been sold. + +### Next steps + +This version was very much about getting a base from which to analyze the properties for my own personal use case. + +In terms of further development, I have had interest from people in running the solution on their own specific target area. + +For this to work at scale I would need a more robust method to deal with duplicate attribution, which is a difficult problem as real estate agencies intentionally change details like number of rooms or surface area. + +Perhaps this is a problem ML or GPT could solve as well as a human, given the limited options available. + +## Learnings and conclusion + +The data problem itself was an eye-opener into the real-estate market. It’s a messy market full of unknowns and noise, which adds a significant purchase risk to first-time buyers. + +Tooling-wise, it was surprising how quick it was to set everything up. dlt integrates well with dbt and enables fast and simple data ingestion, making this project simpler than I thought it would be. + +### dlt + +Good: + +- As a big fan of dbt I love how seamlessly the two solutions complement one another. dlt handles the data cleaning and normalisation automatically so I can focus on curating and modelling it in dbt. While the automatic unpacking leaves some small adjustments for the analytics engineer, it’s much better than cleaning and typing JSON in the database or in custom Python code. +- When creating my first dummy pipeline I used DuckDB. It felt like a great introduction to how simple it is to get started, and it provided a solid starting block before developing something for the cloud.
+ +Bad: + +- I did have a small hiccup with the Google Sheets connector assuming OAuth authentication rather than my desired service account SDK, but this was relatively easy to rectify by explicitly stating `GcpServiceAccountCredentials` in the `__init__.py` file for the source. +- Using both a verified source in the Google Sheets connector and building my own from RapidAPI endpoints seemed equally intuitive. However, I would have wanted more documentation on how to run these two pipelines in the same script alongside the dbt pipeline. + +### dbt + +No surprises there. I developed the project locally, and to deploy to Cloud Functions I injected credentials to dbt via the dlt runner. This meant I could re-use the setup I did for the other dlt pipelines. + +```python
import dlt

def dbt_run():
    # make an authenticated connection with dlt to the dwh
    pipeline = dlt.pipeline(
        pipeline_name='dbt_pipeline',
        destination='bigquery',  # credentials read from env
        dataset_name='dbt'
    )
    # make a venv in case we have lib conflicts between dlt and current env
    venv = dlt.dbt.get_venv(pipeline)
    # package the pipeline, dbt package and env
    dbt = dlt.dbt.package(pipeline, "dbt/property_analytics", venv=venv)
    # and run it
    models = dbt.run_all()
    # show outcome
    for m in models:
        print(f"Model {m.model_name} materialized in {m.time} with status {m.status} and message {m.message}")
``` + +### Cloud Functions + +While I had used Cloud Functions before, I had never previously set them up for dbt, and I was able to easily follow dlt’s docs to run the pipelines there. Cloud Functions is a great way to run small-scale pipelines cheaply, and the project's running cost is a few cents a month. If the insights drawn from the project help us save even 1% of a house price, the project will have been a success. + +### To sum up + +dlt feels like the perfect solution for anyone who has scratched the surface of Python development.
To be able to have schemas ready for transformation in such a short space of time is truly… transformational. As a freelancer, being able to accelerate the development of pipelines is a huge benefit for companies, which are often frustrated with the amount of time it takes to start ‘showing value’. + +I’d welcome the chance to discuss what’s been built to date or collaborate on any potential further development in the comments below. diff --git a/website/blog/2024-01-09-defer-in-development.md b/website/blog/2024-01-09-defer-in-development.md index 634fd1100c9..96e2ed53f85 100644 --- a/website/blog/2024-01-09-defer-in-development.md +++ b/website/blog/2024-01-09-defer-in-development.md @@ -12,7 +12,7 @@ date: 2024-01-09 is_featured: true --- -Picture this — you’ve got a massive dbt project, thousands of models chugging along, creating actionable insights for your stakeholders. A ticket comes your way — a model needs to be refactored! "No problem," you think to yourself, "I will simply make that change and test it locally!" You look at you lineage, and realize this model is many layers deep, buried underneath a long chain of tables and views. +Picture this — you’ve got a massive dbt project, thousands of models chugging along, creating actionable insights for your stakeholders. A ticket comes your way — a model needs to be refactored! "No problem," you think to yourself, "I will simply make that change and test it locally!" You look at your lineage, and realize this model is many layers deep, buried underneath a long chain of tables and views.
diff --git a/website/blog/authors.yml b/website/blog/authors.yml index a3548575b6e..4aa33773988 100644 --- a/website/blog/authors.yml +++ b/website/blog/authors.yml @@ -187,6 +187,16 @@ emily_riederer: - icon: fa-readme url: https://emilyriederer.com +euan_johnston: + image_url: /img/blog/authors/ejohnston.png + job_title: Freelance Business Intelligence manager + name: Euan Johnston + links: + - icon: fa-linkedin + url: https://www.linkedin.com/in/euan-johnston-610a05a8/ + - icon: fa-github + url: https://github.com/euanjohnston-dev + grace_goheen: image_url: /img/blog/authors/grace-goheen.jpeg job_title: Analytics Engineer diff --git a/website/docs/best-practices/best-practice-workflows.md b/website/docs/best-practices/best-practice-workflows.md index 9b79c244901..4381906361e 100644 --- a/website/docs/best-practices/best-practice-workflows.md +++ b/website/docs/best-practices/best-practice-workflows.md @@ -39,7 +39,7 @@ Your dbt project will depend on raw data stored in your database. Since this dat :::info Using sources for raw data references -As of v0.13.0, we recommend defining your raw data as [sources](/docs/build/sources), and selecting from the source rather than using the direct relation reference. Our dbt projects no longer contain any direct relation references in any models. +We recommend defining your raw data as [sources](/docs/build/sources), and selecting from the source rather than using the direct relation reference. Our dbt projects don't contain any direct relation references in any models. ::: diff --git a/website/docs/docs/build/groups.md b/website/docs/docs/build/groups.md index d4fda045277..62c4e4493d3 100644 --- a/website/docs/docs/build/groups.md +++ b/website/docs/docs/build/groups.md @@ -7,18 +7,6 @@ keywords: - groups access mesh --- -:::info New functionality -This functionality is new in v1.5. 
-::: - -## Related docs - -* [Model Access](/docs/collaborate/govern/model-access#groups) -* [Group configuration](/reference/resource-configs/group) -* [Group selection](/reference/node-selection/methods#the-group-method) - -## About groups - A group is a collection of nodes within a dbt DAG. Groups are named, and every group has an `owner`. They enable intentional collaboration within and across teams by restricting [access to private](/reference/resource-configs/access) models. Group members may include models, tests, seeds, snapshots, analyses, and metrics. (Not included: sources and exposures.) Each node may belong to only one group. @@ -126,3 +114,9 @@ dbt.exceptions.DbtReferenceError: Parsing Error Node model.jaffle_shop.marketing_model attempted to reference node model.jaffle_shop.finance_model, which is not allowed because the referenced node is private to the finance group. ``` + +## Related docs + +* [Model Access](/docs/collaborate/govern/model-access#groups) +* [Group configuration](/reference/resource-configs/group) +* [Group selection](/reference/node-selection/methods#the-group-method) \ No newline at end of file diff --git a/website/docs/docs/build/materializations.md b/website/docs/docs/build/materializations.md index 67796afdbdb..9ae6021cc71 100644 --- a/website/docs/docs/build/materializations.md +++ b/website/docs/docs/build/materializations.md @@ -120,7 +120,7 @@ required with incremental materializations * `dbt run` on materialized views corresponds to a code deployment, just like views * **Cons:** * Due to the fact that materialized views are more complex database objects, database platforms tend to have -less configuration options available, see your database platform's docs for more details +fewer configuration options available; see your database platform's docs for more details * Materialized views may not be supported by every database platform * **Advice:** * Consider materialized views for use cases where incremental models are 
sufficient, but you would like the data platform to manage the incremental logic and refresh. diff --git a/website/docs/docs/build/project-variables.md b/website/docs/docs/build/project-variables.md index 59d6be49b17..a328731c7d4 100644 --- a/website/docs/docs/build/project-variables.md +++ b/website/docs/docs/build/project-variables.md @@ -25,13 +25,6 @@ Jinja is not supported within the `vars` config, and all values will be interpre ::: -:::info New in v0.17.0 - -The syntax for specifying vars in the `dbt_project.yml` file has changed in -dbt v0.17.0. See the [migration guide](/docs/dbt-versions/core-upgrade) -for more information on these changes. - -::: To define variables in a dbt project, add a `vars` config to your `dbt_project.yml` file. These `vars` can be scoped globally, or to a specific package imported in your diff --git a/website/docs/docs/cloud/configure-cloud-cli.md b/website/docs/docs/cloud/configure-cloud-cli.md index d6fca00cf25..a442a6e6ad1 100644 --- a/website/docs/docs/cloud/configure-cloud-cli.md +++ b/website/docs/docs/cloud/configure-cloud-cli.md @@ -66,9 +66,8 @@ Once you install the dbt Cloud CLI, you need to configure it to connect to a dbt ```yaml # dbt_project.yml name: - version: - ... + # Your project configs... dbt-cloud: project-id: PROJECT_ID @@ -86,6 +85,7 @@ To set environment variables in the dbt Cloud CLI for your dbt project: 2. Then select **Profile Settings**, then **Credentials**. 3. Click on your project and scroll to the **Environment Variables** section. 4. Click **Edit** on the lower right and then set the user-level environment variables. + - Note, when setting up the [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl), using [environment variables](/docs/build/environment-variables) like `{{env_var('DBT_WAREHOUSE')}}` is not supported. You should use the actual credentials instead. 
## Use the dbt Cloud CLI diff --git a/website/docs/docs/cloud/manage-access/set-up-bigquery-oauth.md b/website/docs/docs/cloud/manage-access/set-up-bigquery-oauth.md index 87018b14d56..f717bf3a5b1 100644 --- a/website/docs/docs/cloud/manage-access/set-up-bigquery-oauth.md +++ b/website/docs/docs/cloud/manage-access/set-up-bigquery-oauth.md @@ -77,4 +77,5 @@ Select **Allow**. This redirects you back to dbt Cloud. You should now be an aut ## FAQs - + + diff --git a/website/docs/docs/core/connect-data-platform/connection-profiles.md b/website/docs/docs/core/connect-data-platform/connection-profiles.md index 8088ff1dfa7..32e60c8cc18 100644 --- a/website/docs/docs/core/connect-data-platform/connection-profiles.md +++ b/website/docs/docs/core/connect-data-platform/connection-profiles.md @@ -83,11 +83,8 @@ To set up your profile, copy the correct sample profile for your warehouse into You can find more information on which values to use in your targets below. -:::info Validating your warehouse credentials +Use the [debug](/reference/dbt-jinja-functions/debug-method) command to validate your warehouse connection. Run `dbt debug` from within a dbt project to test your connection. -Use the [debug](/reference/dbt-jinja-functions/debug-method) command to check whether you can successfully connect to your warehouse. Simply run `dbt debug` from within a dbt project to test your connection. - -::: ## Understanding targets in profiles diff --git a/website/docs/docs/dbt-cloud-apis/schema-discovery-environment.mdx b/website/docs/docs/dbt-cloud-apis/schema-discovery-environment.mdx index a82bba6576d..a89d8f31962 100644 --- a/website/docs/docs/dbt-cloud-apis/schema-discovery-environment.mdx +++ b/website/docs/docs/dbt-cloud-apis/schema-discovery-environment.mdx @@ -18,13 +18,6 @@ When querying for `environment`, you can use the following arguments. -:::caution - -dbt Labs is making changes to the Discovery API. These changes will take effect on August 15, 2023. 
- -The data type `Int` for `id` is being deprecated and will be replaced with `BigInt`. When the time comes, you will need to update your API call accordingly to avoid errors. -::: - ### Example queries You can use your production environment's `id`: diff --git a/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md b/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md index af098860e6f..1f40aaa9f40 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md +++ b/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md @@ -5,10 +5,6 @@ description: New features and changes in dbt Core v1.7 displayed_sidebar: "docs" --- -import UpgradeMove from '/snippets/_upgrade-move.md'; - - - ## Resources - [Changelog](https://github.com/dbt-labs/dbt-core/blob/8aaed0e29f9560bc53d9d3e88325a9597318e375/CHANGELOG.md) diff --git a/website/docs/docs/dbt-versions/core-upgrade/01-upgrading-to-v1.6.md b/website/docs/docs/dbt-versions/core-upgrade/01-upgrading-to-v1.6.md index f1f7a77e1e1..a70f220edc8 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/01-upgrading-to-v1.6.md +++ b/website/docs/docs/dbt-versions/core-upgrade/01-upgrading-to-v1.6.md @@ -5,10 +5,6 @@ id: "upgrading-to-v1.6" displayed_sidebar: "docs" --- -import UpgradeMove from '/snippets/_upgrade-move.md'; - - - dbt Core v1.6 has three significant areas of focus: 1. Next milestone of [multi-project deployments](https://github.com/dbt-labs/dbt-core/discussions/6725): improvements to contracts, groups/access, versions; and building blocks for cross-project `ref` 1. 
Semantic layer re-launch: dbt Core and [MetricFlow](https://docs.getdbt.com/docs/build/about-metricflow) integration diff --git a/website/docs/docs/dbt-versions/core-upgrade/02-upgrading-to-v1.5.md b/website/docs/docs/dbt-versions/core-upgrade/02-upgrading-to-v1.5.md index e739caa477a..589ac162088 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/02-upgrading-to-v1.5.md +++ b/website/docs/docs/dbt-versions/core-upgrade/02-upgrading-to-v1.5.md @@ -5,10 +5,6 @@ id: "upgrading-to-v1.5" displayed_sidebar: "docs" --- -import UpgradeMove from '/snippets/_upgrade-move.md'; - - - dbt Core v1.5 is a feature release, with two significant additions: 1. [**Model governance**](/docs/collaborate/govern/about-model-governance) — access, contracts, versions — the first phase of [multi-project deployments](https://github.com/dbt-labs/dbt-core/discussions/6725) 2. A Python entry point for [**programmatic invocations**](/reference/programmatic-invocations), at parity with the CLI diff --git a/website/docs/docs/dbt-versions/core-upgrade/03-upgrading-to-dbt-utils-v1.0.md b/website/docs/docs/dbt-versions/core-upgrade/03-upgrading-to-dbt-utils-v1.0.md index 229a54627fc..a8bb960c37d 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/03-upgrading-to-dbt-utils-v1.0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/03-upgrading-to-dbt-utils-v1.0.md @@ -3,10 +3,6 @@ title: "Upgrading to dbt utils v1.0" description: New features and breaking changes to consider as you upgrade to dbt utils v1.0. --- -import UpgradeMove from '/snippets/_upgrade-move.md'; - - - # Upgrading to dbt utils v1.0 For the first time, [dbt utils](https://hub.getdbt.com/dbt-labs/dbt_utils/latest/) is crossing the major version boundary. 
From [last month’s blog post](https://www.getdbt.com/blog/announcing-dbt-v1.3-and-utils/): diff --git a/website/docs/docs/dbt-versions/core-upgrade/04-upgrading-to-v1.4.md b/website/docs/docs/dbt-versions/core-upgrade/04-upgrading-to-v1.4.md index 240f0b86de3..41e19956690 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/04-upgrading-to-v1.4.md +++ b/website/docs/docs/dbt-versions/core-upgrade/04-upgrading-to-v1.4.md @@ -5,10 +5,6 @@ id: "upgrading-to-v1.4" displayed_sidebar: "docs" --- -import UpgradeMove from '/snippets/_upgrade-move.md'; - - - ### Resources - [Changelog](https://github.com/dbt-labs/dbt-core/blob/1.4.latest/CHANGELOG.md) diff --git a/website/docs/docs/dbt-versions/core-upgrade/05-upgrading-to-v1.3.md b/website/docs/docs/dbt-versions/core-upgrade/05-upgrading-to-v1.3.md index 5a381b16928..7febb0bade9 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/05-upgrading-to-v1.3.md +++ b/website/docs/docs/dbt-versions/core-upgrade/05-upgrading-to-v1.3.md @@ -5,10 +5,6 @@ id: "upgrading-to-v1.3" displayed_sidebar: "docs" --- -import UpgradeMove from '/snippets/_upgrade-move.md'; - - - ### Resources - [Changelog](https://github.com/dbt-labs/dbt-core/blob/1.3.latest/CHANGELOG.md) diff --git a/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.2.md b/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.2.md index cd75e7f411b..17e62c90b43 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.2.md +++ b/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.2.md @@ -5,10 +5,6 @@ id: "upgrading-to-v1.2" displayed_sidebar: "docs" --- -import UpgradeMove from '/snippets/_upgrade-move.md'; - - - ### Resources - [Changelog](https://github.com/dbt-labs/dbt-core/blob/1.2.latest/CHANGELOG.md) diff --git a/website/docs/docs/dbt-versions/core-upgrade/07-upgrading-to-v1.1.md b/website/docs/docs/dbt-versions/core-upgrade/07-upgrading-to-v1.1.md index 868f3c7ed04..aee3413e1ad 100644 --- 
a/website/docs/docs/dbt-versions/core-upgrade/07-upgrading-to-v1.1.md +++ b/website/docs/docs/dbt-versions/core-upgrade/07-upgrading-to-v1.1.md @@ -5,10 +5,6 @@ id: "upgrading-to-v1.1" displayed_sidebar: "docs" --- -import UpgradeMove from '/snippets/_upgrade-move.md'; - - - ### Resources - [Changelog](https://github.com/dbt-labs/dbt-core/blob/1.1.latest/CHANGELOG.md) diff --git a/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md b/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md index 0ea66980874..9cbfae50831 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md @@ -5,9 +5,6 @@ id: "upgrading-to-v1.0" displayed_sidebar: "docs" --- -import UpgradeMove from '/snippets/_upgrade-move.md'; - - ### Resources diff --git a/website/docs/docs/dbt-versions/core-upgrade/09-upgrading-to-v0.21.md b/website/docs/docs/dbt-versions/core-upgrade/09-upgrading-to-v0.21.md index d5b429132cd..5575b0cc2af 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/09-upgrading-to-v0.21.md +++ b/website/docs/docs/dbt-versions/core-upgrade/09-upgrading-to-v0.21.md @@ -5,10 +5,6 @@ displayed_sidebar: "docs" --- -import UpgradeMove from '/snippets/_upgrade-move.md'; - - - :::caution Unsupported version dbt Core v0.21 has reached the end of critical support. No new patch versions will be released, and it will stop running in dbt Cloud on June 30, 2022. Read ["About dbt Core versions"](/docs/dbt-versions/core) for more details. 
diff --git a/website/docs/docs/dbt-versions/core-upgrade/10-upgrading-to-v0.20.md b/website/docs/docs/dbt-versions/core-upgrade/10-upgrading-to-v0.20.md index be6054087b3..d95b8d8bacd 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/10-upgrading-to-v0.20.md +++ b/website/docs/docs/dbt-versions/core-upgrade/10-upgrading-to-v0.20.md @@ -4,10 +4,6 @@ id: "upgrading-to-v0.20" displayed_sidebar: "docs" --- -import UpgradeMove from '/snippets/_upgrade-move.md'; - - - :::caution Unsupported version dbt Core v0.20 has reached the end of critical support. No new patch versions will be released, and it will stop running in dbt Cloud on June 30, 2022. Read ["About dbt Core versions"](/docs/dbt-versions/core) for more details. ::: diff --git a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-11-0.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-11-0.md index e91dde4c923..27c0456660f 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-11-0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-11-0.md @@ -4,10 +4,6 @@ id: "upgrading-to-0-11-0" displayed_sidebar: "docs" --- -import UpgradeMove from '/snippets/_upgrade-move.md'; - - - ## Schema.yml v2 syntax dbt v0.11.0 adds an auto-generated docs site to your dbt project. To make effective use of the documentation site, you'll need to use the new "version 2" schema.yml syntax. For a full explanation of the version 2 syntax, check out the [schema.yml Files](/reference/configs-and-properties) section of the documentation. 
diff --git a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-12-0.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-12-0.md index b3d4e9d9bcb..a95ec3b11bd 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-12-0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-12-0.md @@ -4,10 +4,6 @@ id: "upgrading-to-0-12-0" displayed_sidebar: "docs" --- -import UpgradeMove from '/snippets/_upgrade-move.md'; - - - ## End of support Support for the `repositories:` block in `dbt_project.yml` (deprecated in 0.10.0) was removed. diff --git a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-13-0.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-13-0.md index bb15d1a73b0..9875eb3c346 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-13-0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-13-0.md @@ -4,10 +4,6 @@ id: "upgrading-to-0-13-0" displayed_sidebar: "docs" --- -import UpgradeMove from '/snippets/_upgrade-move.md'; - - - ## Breaking changes ### on-run-start and on-run-end diff --git a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-14-0.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-14-0.md index 48aa14a42e5..21cfbe8d3b5 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-14-0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-14-0.md @@ -4,10 +4,6 @@ id: "upgrading-to-0-14-0" displayed_sidebar: "docs" --- -import UpgradeMove from '/snippets/_upgrade-move.md'; - - - This guide outlines migration instructions for: 1. 
[Upgrading archives to snapshots](#upgrading-to-snapshot-blocks) diff --git a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-14-1.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-14-1.md index 215385acf0f..559775644cd 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-14-1.md +++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-14-1.md @@ -4,10 +4,6 @@ id: "upgrading-to-0-14-1" displayed_sidebar: "docs" --- -import UpgradeMove from '/snippets/_upgrade-move.md'; - - - The dbt v0.14.1 release _does not_ contain any breaking code changes for users upgrading from v0.14.0. If you are upgrading from a version less than 0.14.0, consult the [Upgrading to 0.14.0](upgrading-to-0-14-0) migration guide. The following section contains important information for users of the `check` strategy on Snowflake and BigQuery. Action may be required in your database. ## Changes to the Snapshot "check" algorithm diff --git a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-15-0.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-15-0.md index 5eba212590f..7db64f5940f 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-15-0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-15-0.md @@ -4,10 +4,6 @@ id: "upgrading-to-0-15-0" displayed_sidebar: "docs" --- -import UpgradeMove from '/snippets/_upgrade-move.md'; - - - The dbt v0.15.0 release contains a handful of breaking code changes for users upgrading from v0.14.0. 
diff --git a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-16-0.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-16-0.md index 076e6fc4e88..d6fc6f9f49a 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-16-0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-16-0.md @@ -4,10 +4,6 @@ id: "upgrading-to-0-16-0" displayed_sidebar: "docs" --- -import UpgradeMove from '/snippets/_upgrade-move.md'; - - - dbt v0.16.0 contains many new features, bug fixes, and improvements. This guide covers all of the important information to consider when upgrading from an earlier version of dbt to 0.16.0. diff --git a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-17-0.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-17-0.md index 5b863777df9..b99466e7c9a 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-17-0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-17-0.md @@ -5,10 +5,6 @@ displayed_sidebar: "docs" --- -import UpgradeMove from '/snippets/_upgrade-move.md'; - - - dbt v0.17.0 makes compilation more consistent, improves performance, and fixes a number of bugs. 
## Articles: @@ -252,8 +248,8 @@ BigQuery: **Core** - [`path:` selectors](/reference/node-selection/methods#the-path-method) -- [`--fail-fast`](/reference/commands/run#failing-fast) -- [as_text Jinja filter](/reference/dbt-jinja-functions/as_text) +- [`--fail-fast` command](/reference/commands/run#failing-fast) +- `as_text` Jinja filter: removed this defunct filter - [accessing nodes in the `graph` object](/reference/dbt-jinja-functions/graph) - [persist_docs](/reference/resource-configs/persist_docs) - [source properties](reference/source-properties) diff --git a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-18-0.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-18-0.md index 545bfd41ac6..f14fd03a534 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-18-0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-18-0.md @@ -4,10 +4,6 @@ displayed_sidebar: "docs" --- -import UpgradeMove from '/snippets/_upgrade-move.md'; - - - ### Resources - [Changelog](https://github.com/dbt-labs/dbt-core/blob/dev/marian-anderson/CHANGELOG.md) diff --git a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-19-0.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-19-0.md index db825d8af9c..af978f9c6a9 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-19-0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-19-0.md @@ -4,10 +4,6 @@ displayed_sidebar: "docs" --- -import UpgradeMove from '/snippets/_upgrade-move.md'; - - - ### Resources - [Discourse](https://discourse.getdbt.com/t/1951) diff --git a/website/docs/docs/dbt-versions/release-notes/76-Oct-2023/sl-ga.md b/website/docs/docs/dbt-versions/release-notes/76-Oct-2023/sl-ga.md index a81abec5d42..a1b59aa6ec1 100644 --- 
a/website/docs/docs/dbt-versions/release-notes/76-Oct-2023/sl-ga.md +++ b/website/docs/docs/dbt-versions/release-notes/76-Oct-2023/sl-ga.md @@ -8,7 +8,7 @@ tags: [Oct-2023] --- :::important -If you're using the legacy Semantic Layer, we **highly** recommend you [upgrade your dbt version](/docs/dbt-versions/upgrade-core-in-cloud) to dbt v1.6 or higher and [migrate](/guides/sl-migration) to the latest Semantic Layer. +If you're using the legacy Semantic Layer, we _highly_ recommend you [upgrade your dbt version](/docs/dbt-versions/upgrade-core-in-cloud) to dbt v1.6 or higher and [migrate](/guides/sl-migration) to the latest Semantic Layer. ::: dbt Labs is thrilled to announce that the [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) is now generally available. It offers consistent data organization, improved governance, reduced costs, enhanced efficiency, and accessible data for better decision-making and collaboration across organizations. diff --git a/website/docs/docs/dbt-versions/release-notes/78-Aug-2023/sl-revamp-beta.md b/website/docs/docs/dbt-versions/release-notes/78-Aug-2023/sl-revamp-beta.md index f44fd57aa4a..ac8e286c783 100644 --- a/website/docs/docs/dbt-versions/release-notes/78-Aug-2023/sl-revamp-beta.md +++ b/website/docs/docs/dbt-versions/release-notes/78-Aug-2023/sl-revamp-beta.md @@ -8,7 +8,7 @@ sidebar_position: 7 --- :::important -If you're using the legacy Semantic Layer, we **highly** recommend you [upgrade your dbt version](/docs/dbt-versions/upgrade-core-in-cloud) to dbt v1.6 or higher to use the new dbt Semantic Layer. To migrate to the new Semantic Layer, refer to the dedicated [migration guide](/guides/sl-migration) for more info. +If you're using the legacy Semantic Layer, we _highly_ recommend you [upgrade your dbt version](/docs/dbt-versions/upgrade-core-in-cloud) to dbt v1.6 or higher to use the new dbt Semantic Layer. 
To migrate to the new Semantic Layer, refer to the dedicated [migration guide](/guides/sl-migration) for more info. ::: dbt Labs are thrilled to announce the re-release of the [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl), now available in [public beta](#public-beta). It aims to bring the best of modeling and semantics to downstream applications by introducing: diff --git a/website/docs/docs/dbt-versions/upgrade-core-in-cloud.md b/website/docs/docs/dbt-versions/upgrade-core-in-cloud.md index e46294029ec..052611f66e6 100644 --- a/website/docs/docs/dbt-versions/upgrade-core-in-cloud.md +++ b/website/docs/docs/dbt-versions/upgrade-core-in-cloud.md @@ -134,12 +134,6 @@ If you believe your project might be affected, read more details in the migratio

-:::info Important
-
-If you have not already, you must add `config-version: 2` to your dbt_project.yml file.
-See **Upgrading to v0.17.latest from v0.16** below for more details.
-
-:::
diff --git a/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md b/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md index 665260ed9f4..11a610805a9 100644 --- a/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md +++ b/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md @@ -34,7 +34,7 @@ Use this guide to fully experience the power of the universal dbt Semantic Layer - [Define metrics](#define-metrics) in dbt using MetricFlow - [Test and query metrics](#test-and-query-metrics) with MetricFlow - [Run a production job](#run-a-production-job) in dbt Cloud -- [Set up dbt Semantic Layer](#setup) in dbt Cloud +- [Set up dbt Semantic Layer](#set-up-dbt-semantic-layer) in dbt Cloud - [Connect and query API](#connect-and-query-api) with dbt Cloud MetricFlow allows you to define metrics in your dbt project and query them whether in dbt Cloud or dbt Core with [MetricFlow commands](/docs/build/metricflow-commands). diff --git a/website/docs/faqs/API/rotate-token.md b/website/docs/faqs/API/rotate-token.md index 144c834ea8a..4470de72d5a 100644 --- a/website/docs/faqs/API/rotate-token.md +++ b/website/docs/faqs/API/rotate-token.md @@ -36,7 +36,7 @@ curl --location --request POST 'https://YOUR_ACCESS_URL/api/v2/users/YOUR_USER_I * Find your `YOUR_CURRENT_TOKEN` by going to **Profile Settings** -> **API Access** and copying the API key. * Find [`YOUR_ACCESS_URL`](/docs/cloud/about-cloud/regions-ip-addresses) for your region and plan. -:::info Example +Example: If `YOUR_USER_ID` = `123`, `YOUR_CURRENT_TOKEN` = `abcf9g`, and your `ACCESS_URL` = `cloud.getdbt.com`, then your curl request will be: @@ -44,7 +44,7 @@ If `YOUR_USER_ID` = `123`, `YOUR_CURRENT_TOKEN` = `abcf9g`, and your `ACCESS_URL curl --location --request POST 'https://cloud.getdbt.com/api/v2/users/123/apikey/' \ --header 'Authorization: Token abcf9g' ``` -::: + 2. Find the new key in the API response or in dbt Cloud. 
diff --git a/website/docs/faqs/Accounts/cloud-upgrade-instructions.md b/website/docs/faqs/Accounts/cloud-upgrade-instructions.md index f8daf393f9b..d16651a944c 100644 --- a/website/docs/faqs/Accounts/cloud-upgrade-instructions.md +++ b/website/docs/faqs/Accounts/cloud-upgrade-instructions.md @@ -6,11 +6,13 @@ description: "Instructions for upgrading a dbt Cloud account after the trial end dbt Cloud offers [several plans](https://www.getdbt.com/pricing/) with different features that meet your needs. This document is for dbt Cloud admins and explains how to select a plan in order to continue using dbt Cloud. -:::tip Before you begin -- You **_must_** be part of the [Owner](/docs/cloud/manage-access/self-service-permissions) user group to make billing changes. Users not included in this group will not see these options. +## Prerequisites + +Before you begin: +- You _must_ be part of the [Owner](/docs/cloud/manage-access/self-service-permissions) user group to make billing changes. Users not included in this group will not see these options. - All amounts shown in dbt Cloud are in U.S. Dollars (USD) - When your trial expires, your account's default plan enrollment will be a Team plan. -::: + ## Select a plan diff --git a/website/docs/guides/adapter-creation.md b/website/docs/guides/adapter-creation.md index 8bf082b04a0..28e0e8253ad 100644 --- a/website/docs/guides/adapter-creation.md +++ b/website/docs/guides/adapter-creation.md @@ -566,12 +566,6 @@ It should be noted that both of these files are included in the bootstrapped out ## Test your adapter -:::info - -Previously, we offered a packaged suite of tests for dbt adapter functionality: [`pytest-dbt-adapter`](https://github.com/dbt-labs/dbt-adapter-tests). We are deprecating that suite, in favor of the newer testing framework outlined in this document. - -::: - This document has two sections: 1. 
Refer to "About the testing framework" for a description of the standard framework that we maintain for using pytest together with dbt. It includes an example that shows the anatomy of a simple test case. diff --git a/website/docs/guides/bigquery-qs.md b/website/docs/guides/bigquery-qs.md index 9cf2447fa52..4f461a3cf3a 100644 --- a/website/docs/guides/bigquery-qs.md +++ b/website/docs/guides/bigquery-qs.md @@ -23,7 +23,6 @@ In this quickstart guide, you'll learn how to use dbt Cloud with BigQuery. It wi :::tip Videos for you You can check out [dbt Fundamentals](https://courses.getdbt.com/courses/fundamentals) for free if you're interested in course learning with videos. - ::: ### Prerequisites​ diff --git a/website/docs/guides/databricks-qs.md b/website/docs/guides/databricks-qs.md index 5a0c5536e7f..cb01daec394 100644 --- a/website/docs/guides/databricks-qs.md +++ b/website/docs/guides/databricks-qs.md @@ -21,7 +21,6 @@ In this quickstart guide, you'll learn how to use dbt Cloud with Databricks. It :::tip Videos for you You can check out [dbt Fundamentals](https://courses.getdbt.com/courses/fundamentals) for free if you're interested in course learning with videos. - ::: ### Prerequisites​ diff --git a/website/docs/guides/debug-schema-names.md b/website/docs/guides/debug-schema-names.md index c7bf1a195b1..24b7984adf5 100644 --- a/website/docs/guides/debug-schema-names.md +++ b/website/docs/guides/debug-schema-names.md @@ -14,11 +14,8 @@ recently_updated: true ## Introduction -If a model uses the [`schema` config](/reference/resource-properties/schema) but builds under an unexpected schema, here are some steps for debugging the issue. +If a model uses the [`schema` config](/reference/resource-properties/schema) but builds under an unexpected schema, here are some steps for debugging the issue. The full explanation on custom schemas can be found [here](/docs/build/custom-schemas). 
-:::info -The full explanation on custom schemas can be found [here](/docs/build/custom-schemas). -::: You can also follow along via this video: @@ -94,9 +91,7 @@ Now, re-read through the logic of your `generate_schema_name` macro, and mentall You should find that the schema dbt is constructing for your model matches the output of your `generate_schema_name` macro. -:::info -Note that snapshots do not follow this behavior, check out the docs on [target_schema](/reference/resource-configs/target_schema) instead. -::: +Be careful: snapshots do not follow this behavior. Check out the docs on [target_schema](/reference/resource-configs/target_schema) instead. ## Adjust as necessary diff --git a/website/docs/guides/how-to-use-databricks-workflows-to-run-dbt-cloud-jobs.md b/website/docs/guides/how-to-use-databricks-workflows-to-run-dbt-cloud-jobs.md index cb3a6804247..a2967ccbe15 100644 --- a/website/docs/guides/how-to-use-databricks-workflows-to-run-dbt-cloud-jobs.md +++ b/website/docs/guides/how-to-use-databricks-workflows-to-run-dbt-cloud-jobs.md @@ -128,15 +128,14 @@ if __name__ == '__main__': 4. Replace **``** and **``** with the correct values of your environment and [Access URL](/docs/cloud/about-cloud/regions-ip-addresses) for your region and plan. -:::tip - To find these values, navigate to **dbt Cloud**, select **Deploy -> Jobs**. Select the Job you want to run and copy the URL. For example: `https://cloud.getdbt.com/deploy/000000/projects/111111/jobs/222222` -and therefore valid code would be: + * To find these values, navigate to **dbt Cloud**, select **Deploy -> Jobs**. Select the Job you want to run and copy the URL. For example: `https://cloud.getdbt.com/deploy/000000/projects/111111/jobs/222222` + and therefore valid code would be: - # Your URL is structured https:///deploy//projects//jobs/ +Your URL is structured `https:///deploy//projects//jobs/` account_id = 000000 job_id = 222222 base_url = "cloud.getdbt.com" -::: + 5. Run the Notebook.
It will fail, but you should see **a `job_id` widget** at the top of your notebook. @@ -161,9 +160,7 @@ DbtJobRunStatus.RUNNING DbtJobRunStatus.SUCCESS ``` -:::note You can cancel the job from dbt Cloud if necessary. -::: ## Configure the workflows to run the dbt Cloud jobs diff --git a/website/docs/guides/redshift-qs.md b/website/docs/guides/redshift-qs.md index 890be27e50a..c81a4d247a5 100644 --- a/website/docs/guides/redshift-qs.md +++ b/website/docs/guides/redshift-qs.md @@ -18,10 +18,8 @@ In this quickstart guide, you'll learn how to use dbt Cloud with Redshift. It wi - Document your models - Schedule a job to run - -:::tip Videos for you +:::tip Videos for you You can check out [dbt Fundamentals](https://courses.getdbt.com/courses/fundamentals) for free if you're interested in course learning with videos. - ::: ### Prerequisites diff --git a/website/docs/guides/sl-partner-integration-guide.md b/website/docs/guides/sl-partner-integration-guide.md index 61d558f504d..7eb158a2c85 100644 --- a/website/docs/guides/sl-partner-integration-guide.md +++ b/website/docs/guides/sl-partner-integration-guide.md @@ -15,10 +15,7 @@ recently_updated: true To fit your tool within the world of the Semantic Layer, dbt Labs offers some best practice recommendations for how to expose metrics and allow users to interact with them seamlessly. -:::note This is an evolving guide that is meant to provide recommendations based on our experience. If you have any feedback, we'd love to hear it! -::: - ### Prerequisites diff --git a/website/docs/guides/snowflake-qs.md b/website/docs/guides/snowflake-qs.md index 5b4f9e3e2be..0401c37871f 100644 --- a/website/docs/guides/snowflake-qs.md +++ b/website/docs/guides/snowflake-qs.md @@ -26,7 +26,7 @@ You can check out [dbt Fundamentals](https://courses.getdbt.com/courses/fundamen You can also watch the [YouTube video on dbt and Snowflake](https://www.youtube.com/watch?v=kbCkwhySV_I&list=PL0QYlrC86xQm7CoOH6RS7hcgLnd3OQioG).
::: - + ### Prerequisites​ - You have a [dbt Cloud account](https://www.getdbt.com/signup/). diff --git a/website/docs/reference/analysis-properties.md b/website/docs/reference/analysis-properties.md index 880aeddbb0d..1601c817830 100644 --- a/website/docs/reference/analysis-properties.md +++ b/website/docs/reference/analysis-properties.md @@ -18,6 +18,7 @@ analyses: [description](/reference/resource-properties/description): [docs](/reference/resource-configs/docs): show: true | false + node_color: # Use name (such as node_color: purple) or hex code with quotes (such as node_color: "#cd7f32") config: [tags](/reference/resource-configs/tags): | [] columns: diff --git a/website/docs/reference/commands/debug.md b/website/docs/reference/commands/debug.md index 4ae5a1d2dd9..e1865ff1b67 100644 --- a/website/docs/reference/commands/debug.md +++ b/website/docs/reference/commands/debug.md @@ -7,7 +7,7 @@ id: "debug" `dbt debug` is a utility function to test the database connection and display information for debugging purposes, such as the validity of your project file and your installation of any requisite dependencies (like `git` when you run `dbt deps`). -*Note: Not to be confused with [debug-level logging](/reference/global-configs/about-global-configs#debug-level-logging) via the `--debug` option which increases verbosity. +*Note: Not to be confused with [debug-level logging](/reference/global-configs/logs#debug-level-logging) via the `--debug` option which increases verbosity. ### Example usage diff --git a/website/docs/reference/dbt-jinja-functions/as_text.md b/website/docs/reference/dbt-jinja-functions/as_text.md deleted file mode 100644 index 6b26cfa327d..00000000000 --- a/website/docs/reference/dbt-jinja-functions/as_text.md +++ /dev/null @@ -1,58 +0,0 @@ ---- -title: "About as_text filter" -sidebar_label: "as_text" -id: "as_text" -description: "Use this filter to convert Jinja-compiled output back to text." 
---- - -The `as_text` Jinja filter will coerce Jinja-compiled output back to text. It -can be used in YAML rendering contexts where values _must_ be provided as -strings, rather than as the datatype that they look like. - -:::info Heads up -In dbt v0.17.1, native rendering is not enabled by default. As such, -the `as_text` filter has no functional effect. - -It is still possible to natively render specific values using the [`as_bool`](/reference/dbt-jinja-functions/as_bool), -[`as_number`](/reference/dbt-jinja-functions/as_number), and [`as_native`](/reference/dbt-jinja-functions/as_native) filters. - -::: - -### Usage - -In the example below, the `as_text` filter is used to assert that `''` is an -empty string. In a native rendering, `''` would be coerced to the Python -keyword `None`. This specification is necessary in `v0.17.0`, but it is not -useful or necessary in later versions of dbt. - - - -```yml -models: - - name: orders - columns: - - name: order_status - tests: - - accepted_values: - values: ['pending', 'shipped', "{{ '' | as_text }}"] - -``` - - - -As of `v0.17.1`, native rendering does not occur by default, and the `as_text` -specification is superfluous. 
- - - -```yml -models: - - name: orders - columns: - - name: order_status - tests: - - accepted_values: - values: ['pending', 'shipped', ''] -``` - - diff --git a/website/docs/reference/dbt-jinja-functions/builtins.md b/website/docs/reference/dbt-jinja-functions/builtins.md index edc5f34ffda..7d970b9d5e1 100644 --- a/website/docs/reference/dbt-jinja-functions/builtins.md +++ b/website/docs/reference/dbt-jinja-functions/builtins.md @@ -42,9 +42,9 @@ From dbt v1.5 and higher, use the following macro to extract user-provided argum -- call builtins.ref based on provided positional arguments {% set rel = None %} {% if packagename is not none %} - {% set rel = return(builtins.ref(packagename, modelname, version=version)) %} + {% set rel = builtins.ref(packagename, modelname, version=version) %} {% else %} - {% set rel = return(builtins.ref(modelname, version=version)) %} + {% set rel = builtins.ref(modelname, version=version) %} {% endif %} -- finally, override the database name with "dev" diff --git a/website/docs/reference/dbt-jinja-functions/env_var.md b/website/docs/reference/dbt-jinja-functions/env_var.md index f4cc05cec0f..a8f2a94fbd2 100644 --- a/website/docs/reference/dbt-jinja-functions/env_var.md +++ b/website/docs/reference/dbt-jinja-functions/env_var.md @@ -100,6 +100,7 @@ select 1 as id -:::info dbt Cloud Usage +### dbt Cloud usage + If you are using dbt Cloud, you must adhere to the naming conventions for environment variables. Environment variables in dbt Cloud must be prefixed with `DBT_` (including `DBT_ENV_CUSTOM_ENV_` or `DBT_ENV_SECRET_`). Environment variables keys are uppercased and case sensitive. When referencing `{{env_var('DBT_KEY')}}` in your project's code, the key must match exactly the variable defined in dbt Cloud's UI. 
-::: + diff --git a/website/docs/reference/dbt_project.yml.md b/website/docs/reference/dbt_project.yml.md index a5ad601f78b..ae911200b40 100644 --- a/website/docs/reference/dbt_project.yml.md +++ b/website/docs/reference/dbt_project.yml.md @@ -1,6 +1,8 @@ Every [dbt project](/docs/build/projects) needs a `dbt_project.yml` file — this is how dbt knows a directory is a dbt project. It also contains important information that tells dbt how to operate your project. +dbt uses [YAML](https://yaml.org/) in a few different places. If you're new to YAML, it would be worth learning how arrays, dictionaries, and strings are represented. + By default, dbt will look for `dbt_project.yml` in your current working directory and its parents, but you can set a different directory using the `--project-dir` flag. @@ -15,11 +17,6 @@ Starting from dbt v1.5 and higher, you can specify your dbt Cloud project ID in -:::info YAML syntax -dbt uses YAML in a few different places. If you're new to YAML, it would be worth taking the time to learn how arrays, dictionaries, and strings are represented. -::: - - Something to note, you can't set up a "property" in the `dbt_project.yml` file if it's not a config (an example is [macros](/reference/macro-properties)). This applies to all types of resources. Refer to [Configs and properties](/reference/configs-and-properties) for more detail. 
The following example is a list of all available configurations in the `dbt_project.yml` file: diff --git a/website/docs/reference/model-properties.md b/website/docs/reference/model-properties.md index 65f9307b5b3..46fb0ca3bad 100644 --- a/website/docs/reference/model-properties.md +++ b/website/docs/reference/model-properties.md @@ -16,6 +16,7 @@ models: [description](/reference/resource-properties/description): [docs](/reference/resource-configs/docs): show: true | false + node_color: # Use name (such as node_color: purple) or hex code with quotes (such as node_color: "#cd7f32") [latest_version](/reference/resource-properties/latest_version): [deprecation_date](/reference/resource-properties/deprecation_date): [access](/reference/resource-configs/access): private | protected | public diff --git a/website/docs/reference/node-selection/methods.md b/website/docs/reference/node-selection/methods.md index 61fd380e11b..549bc5d45e1 100644 --- a/website/docs/reference/node-selection/methods.md +++ b/website/docs/reference/node-selection/methods.md @@ -8,9 +8,6 @@ you can omit it (the default value will be one of `path`, `file` or `fqn`). -:::info New functionality -New in v1.5! -::: Many of the methods below support Unix-style wildcards: diff --git a/website/docs/reference/project-configs/clean-targets.md b/website/docs/reference/project-configs/clean-targets.md index 9b464840723..8ca4065ed75 100644 --- a/website/docs/reference/project-configs/clean-targets.md +++ b/website/docs/reference/project-configs/clean-targets.md @@ -19,10 +19,10 @@ Optionally specify a custom list of directories to be removed by the `dbt clean` If this configuration is not included in your `dbt_project.yml` file, the `clean` command will remove files in your [target-path](/reference/project-configs/target-path). ## Examples -### Remove packages and compiled files as part of `dbt clean` -:::info -This is our preferred configuration, but is not the default. 
-::: + +### Remove packages and compiled files as part of `dbt clean` (preferred) {#remove-packages-and-compiled-files-as-part-of-dbt-clean} + + To remove packages as well as compiled files, include the value of your [packages-install-path](/reference/project-configs/packages-install-path) configuration in your `clean-targets` configuration. diff --git a/website/docs/reference/project-configs/docs-paths.md b/website/docs/reference/project-configs/docs-paths.md index 2aee7b31ee7..910cfbb0cce 100644 --- a/website/docs/reference/project-configs/docs-paths.md +++ b/website/docs/reference/project-configs/docs-paths.md @@ -20,12 +20,9 @@ Optionally specify a custom list of directories where [docs blocks](/docs/collab By default, dbt will search in all resource paths for docs blocks (i.e. the combined list of [model-paths](/reference/project-configs/model-paths), [seed-paths](/reference/project-configs/seed-paths), [analysis-paths](/reference/project-configs/analysis-paths), [macro-paths](/reference/project-configs/macro-paths) and [snapshot-paths](/reference/project-configs/snapshot-paths)). If this option is configured, dbt will _only_ look in the specified directory for docs blocks. -## Examples -:::info -We typically omit this configuration as we prefer dbt's default behavior. -::: +## Example -### Use a subdirectory named `docs` for docs blocks +Use a subdirectory named `docs` for docs blocks: @@ -34,3 +31,5 @@ docs-paths: ["docs"] ``` + +**Note:** We typically omit this configuration as we prefer dbt's default behavior. 
diff --git a/website/docs/reference/project-configs/require-dbt-version.md b/website/docs/reference/project-configs/require-dbt-version.md index 85a502bff60..6b17bb46120 100644 --- a/website/docs/reference/project-configs/require-dbt-version.md +++ b/website/docs/reference/project-configs/require-dbt-version.md @@ -19,7 +19,7 @@ When you set this configuration, dbt sends a helpful error message for any user If this configuration is not specified, no version check will occur. -:::info YAML Quoting +### YAML quoting This configuration needs to be interpolated by the YAML parser as a string. As such, you should quote the value of the configuration, taking care to avoid whitespace. For example: ```yml @@ -32,8 +32,6 @@ require-dbt-version: >=1.0.0 # No quotes? No good require-dbt-version: ">= 1.0.0" # Don't put whitespace after the equality signs ``` -::: - ## Examples @@ -73,18 +71,18 @@ require-dbt-version: ">=1.0.0,<2.0.0" ### Require a specific dbt version -:::caution Not recommended -With the release of major version 1.0 of dbt Core, pinning to a specific patch is discouraged. -::: + +:::info Not recommended +Pinning to a specific dbt version is discouraged because it limits project flexibility and can cause compatibility issues, especially with dbt packages. It's recommended to [pin to a major release](#pin-to-a-range), using a version range (for example, `">=1.0.0", "<2.0.0"`) for broader compatibility and to benefit from updates. While you can restrict your project to run only with an exact version of dbt Core, we do not recommend this for dbt Core v1.0.0 and higher. -In the following example, the project will only run with dbt v0.21.1. 
+In the following example, the project will only run with dbt v1.5: ```yml -require-dbt-version: 0.21.1 +require-dbt-version: 1.5 ``` diff --git a/website/docs/reference/resource-configs/contract.md b/website/docs/reference/resource-configs/contract.md index ccc10099a12..6c11b08dd62 100644 --- a/website/docs/reference/resource-configs/contract.md +++ b/website/docs/reference/resource-configs/contract.md @@ -6,16 +6,7 @@ default_value: {contract: false} id: "contract" --- -:::info New functionality -This functionality is new in v1.5. -::: - -## Related documentation -- [What is a model contract?](/docs/collaborate/govern/model-contracts) -- [Defining `columns`](/reference/resource-properties/columns) -- [Defining `constraints`](/reference/resource-properties/constraints) - -# Definition +Supported in dbt v1.5 and higher. When the `contract` configuration is enforced, dbt will ensure that your model's returned dataset exactly matches the attributes you have defined in yaml: - `name` and `data_type` for every column @@ -120,3 +111,8 @@ Imagine: - The result is a delta between the yaml-defined contract, and the actual table in the database - which means the contract is now incorrect! Why `append_new_columns`, rather than `sync_all_columns`? Because removing existing columns is a breaking change for contracted models! + +## Related documentation +- [What is a model contract?](/docs/collaborate/govern/model-contracts) +- [Defining `columns`](/reference/resource-properties/columns) +- [Defining `constraints`](/reference/resource-properties/constraints) \ No newline at end of file diff --git a/website/docs/reference/resource-configs/delimiter.md b/website/docs/reference/resource-configs/delimiter.md index 58d6ba8344a..5cc5ddaf44b 100644 --- a/website/docs/reference/resource-configs/delimiter.md +++ b/website/docs/reference/resource-configs/delimiter.md @@ -4,19 +4,14 @@ datatype: default_value: "," --- +Supported in v1.7 and higher. 
+ ## Definition You can use this optional seed configuration to customize how you separate values in a [seed](/docs/build/seeds) by providing the one-character string. * The delimiter defaults to a comma when not specified. * Explicitly set the `delimiter` configuration value if you want seed files to use a different delimiter, such as "|" or ";". - -:::info New in 1.7! - -Delimiter is new functionality available beginning with dbt Core v1.7. - -::: - ## Usage diff --git a/website/docs/reference/resource-configs/docs.md b/website/docs/reference/resource-configs/docs.md index d5f7b6499d8..bb0f3714dd4 100644 --- a/website/docs/reference/resource-configs/docs.md +++ b/website/docs/reference/resource-configs/docs.md @@ -30,6 +30,7 @@ models: [](/reference/resource-configs/resource-path): +docs: show: true | false + node_color: color_id # Use name (such as node_color: purple) or hex code with quotes (such as node_color: "#cd7f32") ``` @@ -44,7 +45,7 @@ models: - name: model_name docs: show: true | false - node_color: "black" + node_color: color_id # Use name (such as node_color: purple) or hex code with quotes (such as node_color: "#cd7f32") ``` @@ -67,7 +68,7 @@ seeds: [](/reference/resource-configs/resource-path): +docs: show: true | false - + node_color: color_id # Use name (such as node_color: purple) or hex code with quotes (such as node_color: "#cd7f32") ``` @@ -81,6 +82,7 @@ seeds: - name: seed_name docs: show: true | false + node_color: color_id # Use name (such as node_color: purple) or hex code with quotes (such as node_color: "#cd7f32") ``` @@ -97,6 +99,7 @@ snapshots: [](/reference/resource-configs/resource-path): +docs: show: true | false + node_color: color_id # Use name (such as node_color: purple) or hex code with quotes (such as node_color: "#cd7f32") ``` @@ -111,6 +114,7 @@ snapshots: - name: snapshot_name docs: show: true | false + node_color: color_id # Use name (such as node_color: purple) or hex code with quotes (such as node_color: "#cd7f32") ``` 
@@ -130,6 +134,7 @@ analyses: - name: analysis_name docs: show: true | false + node_color: color_id # Use name (such as node_color: purple) or hex code with quotes (such as node_color: "#cd7f32") ``` @@ -156,7 +161,7 @@ macros: ## Definition -The docs field can be used to provide documentation-specific configuration to models. It supports the doc attribute `show`, which controls whether or not models are shown in the auto-generated documentation website. It also supports `node_color` for some node types. +The docs field can be used to provide documentation-specific configuration to models. It supports the doc attribute `show`, which controls whether or not models are shown in the auto-generated documentation website. It also supports `node_color` for models, seeds, snapshots, and analyses. Other node types are not supported. **Note:** Hidden models will still appear in the dbt DAG visualization but will be identified as "hidden.” @@ -204,9 +209,9 @@ models: ## Custom node colors -The `docs` attribute now supports `node_color` to customize the display color of some node types in the DAG within dbt docs. You can define node colors in the files below and apply overrides where needed. +The `docs` attribute now supports `node_color` to customize the display color of some node types in the DAG within dbt docs. You can define node colors in the following files and apply overrides where needed. Note, you need to run or re-run the command `dbt docs generate`. 
-`node_color` hiearchy: +`node_color` hierarchy: `` overrides `schema.yml` overrides `dbt_project.yml` diff --git a/website/docs/reference/resource-configs/group.md b/website/docs/reference/resource-configs/group.md index a71935013c4..e8370d18638 100644 --- a/website/docs/reference/resource-configs/group.md +++ b/website/docs/reference/resource-configs/group.md @@ -3,10 +3,6 @@ resource_types: [models, seeds, snapshots, tests, analyses, metrics] id: "group" --- -:::info New functionality -This functionality is new in v1.5. -::: - -BigQuery allows defining `not null` constraints. However, it does _not_ support or enforce the definition of unenforced constraints, such as `primary key`. +BigQuery allows defining and enforcing `not null` constraints, and defining (but _not_ enforcing) `primary key` and `foreign key` constraints (which can be used for query optimization). BigQuery does not support defining or enforcing other constraints. For more information, refer to [Platform constraint support](/docs/collaborate/govern/model-contracts#platform-constraint-support) Documentation: https://cloud.google.com/bigquery/docs/reference/standard-sql/data-definition-language diff --git a/website/docs/reference/resource-properties/database.md b/website/docs/reference/resource-properties/database.md index c2f6ba76dd8..59159495435 100644 --- a/website/docs/reference/resource-properties/database.md +++ b/website/docs/reference/resource-properties/database.md @@ -26,12 +26,9 @@ The database that your source is stored in. Note that to use this parameter, your warehouse must allow cross-database queries. -:::info - #### BigQuery terminology -If you're using BigQuery, use the _project_ name as the `database:` property. -::: +If you're using BigQuery, use the _project_ name as the `database:` property. ## Default By default, dbt will search in your target database (i.e. the database that you are creating tables and views). 
diff --git a/website/docs/reference/resource-properties/schema.md b/website/docs/reference/resource-properties/schema.md index 9e6a09b8569..157a9ffc0a2 100644 --- a/website/docs/reference/resource-properties/schema.md +++ b/website/docs/reference/resource-properties/schema.md @@ -27,12 +27,10 @@ The schema name as stored in the database. This parameter is useful if you want to use a source name that differs from the schema name. -:::info #### BigQuery terminology -If you're using BigQuery, use the _dataset_ name as the `schema:` property. -::: +If you're using BigQuery, use the _dataset_ name as the `schema:` property. ## Default By default, dbt will use the source's `name:` parameter as the schema name. diff --git a/website/docs/reference/seed-properties.md b/website/docs/reference/seed-properties.md index 9201df65f4c..ebe222dd11c 100644 --- a/website/docs/reference/seed-properties.md +++ b/website/docs/reference/seed-properties.md @@ -16,6 +16,7 @@ seeds: [description](/reference/resource-properties/description): [docs](/reference/resource-configs/docs): show: true | false + node_color: # Use name (such as node_color: purple) or hex code with quotes (such as node_color: "#cd7f32") [config](/reference/resource-properties/config): [](/reference/seed-configs): [tests](/reference/resource-properties/data-tests): diff --git a/website/docs/reference/snapshot-properties.md b/website/docs/reference/snapshot-properties.md index 8f01fd8e988..49769af8f6d 100644 --- a/website/docs/reference/snapshot-properties.md +++ b/website/docs/reference/snapshot-properties.md @@ -20,6 +20,7 @@ snapshots: [meta](/reference/resource-configs/meta): {} [docs](/reference/resource-configs/docs): show: true | false + node_color: # Use name (such as node_color: purple) or hex code with quotes (such as node_color: "#cd7f32") [config](/reference/resource-properties/config): [](/reference/snapshot-configs): [tests](/reference/resource-properties/data-tests): diff --git 
a/website/docs/sql-reference/aggregate-functions/sql-avg.md b/website/docs/sql-reference/aggregate-functions/sql-avg.md index d1dba119292..1512cee7763 100644 --- a/website/docs/sql-reference/aggregate-functions/sql-avg.md +++ b/website/docs/sql-reference/aggregate-functions/sql-avg.md @@ -17,6 +17,8 @@ The AVG function is a part of the group of mathematical or aggregate functions ( ### AVG function example +The following example is querying from a sample dataset created by dbt Labs called [jaffle_shop](https://github.com/dbt-labs/jaffle_shop): + ```sql select date_trunc('month', order_date) as order_month, @@ -26,10 +28,6 @@ where status not in ('returned', 'return_pending') group by 1 ``` -:::note What dataset is this? -This example is querying from a sample dataset created by dbt Labs called [jaffle_shop](https://github.com/dbt-labs/jaffle_shop). -::: - This query using the Jaffle Shop’s `orders` table will return the rounded order amount per each order month: | order_month | avg_order_amount | diff --git a/website/docs/sql-reference/aggregate-functions/sql-count.md b/website/docs/sql-reference/aggregate-functions/sql-count.md index d65c670df90..1438b7c11d5 100644 --- a/website/docs/sql-reference/aggregate-functions/sql-count.md +++ b/website/docs/sql-reference/aggregate-functions/sql-count.md @@ -25,6 +25,8 @@ Let’s take a look at a practical example using COUNT, DISTINCT, and GROUP BY b ### COUNT example +The following example is querying from a sample dataset created by dbt Labs called [jaffle_shop](https://github.com/dbt-labs/jaffle_shop): + ```sql select date_part('month', order_date) as order_month, @@ -34,9 +36,6 @@ from {{ ref('orders') }} group by 1 ``` -:::note What dataset is this? -This example is querying from a sample dataset created by dbt Labs called [jaffle_shop](https://github.com/dbt-labs/jaffle_shop). 
-::: This simple query is something you may do while doing initial exploration of your data; it will return the count of `order_ids` and count of distinct `customer_ids` per order month that appear in the Jaffle Shop’s `orders` table: diff --git a/website/docs/sql-reference/aggregate-functions/sql-max.md b/website/docs/sql-reference/aggregate-functions/sql-max.md index 0b5dc5521ea..fab72770af5 100644 --- a/website/docs/sql-reference/aggregate-functions/sql-max.md +++ b/website/docs/sql-reference/aggregate-functions/sql-max.md @@ -25,6 +25,8 @@ Let’s take a look at a practical example using MAX and GROUP BY below. ### MAX example +The following example is querying from a sample dataset created by dbt Labs called [jaffle_shop](https://github.com/dbt-labs/jaffle_shop): + ```sql select date_part('month', order_date) as order_month, @@ -33,10 +35,6 @@ from {{ ref('orders') }} group by 1 ``` -:::note What dataset is this? -This example is querying from a sample dataset created by dbt Labs called [jaffle_shop](https://github.com/dbt-labs/jaffle_shop). -::: - This simple query is something you may do while doing initial exploration of your data; it will return the maximum order `amount` per order month that appear in the Jaffle Shop’s `orders` table: | order_month | max_amount | diff --git a/website/docs/sql-reference/aggregate-functions/sql-min.md b/website/docs/sql-reference/aggregate-functions/sql-min.md index 6080bb20c0b..95de0af8df3 100644 --- a/website/docs/sql-reference/aggregate-functions/sql-min.md +++ b/website/docs/sql-reference/aggregate-functions/sql-min.md @@ -27,6 +27,8 @@ Let’s take a look at a practical example using MIN below. ### MIN example +The following example is querying from a sample dataset created by dbt Labs called [jaffle_shop](https://github.com/dbt-labs/jaffle_shop): + ```sql select customer_id, @@ -37,10 +39,6 @@ group by 1 limit 3 ``` -:::note What dataset is this? 
-This example is querying from a sample dataset created by dbt Labs called [jaffle_shop](https://github.com/dbt-labs/jaffle_shop). -::: - This simple query is returning the first and last order date for a customer in the Jaffle Shop’s `orders` table: | customer_id | first_order_date | last_order_date | diff --git a/website/docs/sql-reference/aggregate-functions/sql-round.md b/website/docs/sql-reference/aggregate-functions/sql-round.md index bc9669e22cb..a080f5a63e5 100644 --- a/website/docs/sql-reference/aggregate-functions/sql-round.md +++ b/website/docs/sql-reference/aggregate-functions/sql-round.md @@ -24,11 +24,8 @@ In this function, you’ll need to input the *numeric* field or data you want ro ### SQL ROUND function example -:::note What dataset is this? -This example is querying from a sample dataset created by dbt Labs called [jaffle_shop](https://github.com/dbt-labs/jaffle_shop). -::: -You can round some of the numeric fields of the Jaffle Shop’s `orders` model using the following code: +You can round some of the numeric fields of the [Jaffle Shop’s](https://github.com/dbt-labs/jaffle_shop) `orders` model using the following code: ```sql select diff --git a/website/docs/sql-reference/aggregate-functions/sql-sum.md b/website/docs/sql-reference/aggregate-functions/sql-sum.md index d6ca00c2daa..494a3863ad3 100644 --- a/website/docs/sql-reference/aggregate-functions/sql-sum.md +++ b/website/docs/sql-reference/aggregate-functions/sql-sum.md @@ -27,6 +27,8 @@ Let’s take a look at a practical example using the SUM function below. ### SUM example +The following example is querying from a sample dataset created by dbt Labs called [jaffle_shop](https://github.com/dbt-labs/jaffle_shop): + ```sql select customer_id, @@ -36,10 +38,6 @@ group by 1 limit 3 ``` -:::note What dataset is this? -This example is querying from a sample dataset created by dbt Labs called [jaffle_shop](https://github.com/dbt-labs/jaffle_shop). 
-::: - This simple query is returning the summed amount of all orders for a customer in the Jaffle Shop’s `orders` table: | customer_id | all_orders_amount | diff --git a/website/docs/terms/data-extraction.md b/website/docs/terms/data-extraction.md index bc37b68cf66..52148a35421 100644 --- a/website/docs/terms/data-extraction.md +++ b/website/docs/terms/data-extraction.md @@ -37,7 +37,7 @@ Obviously, the type of business you work for and the systems your team uses will The data that is typically extracted and loaded in your data warehouse is data that business users will need for baseline reporting, OKR measurement, or other analytics. :::tip Don’t fix what’s not broken -As we just said, there are usually common data sources that data teams will extract from, regardless of business. Instead of writing transformations for these tables and data sources, leverage [dbt packages](https://hub.getdbt.com/) to save yourself some carpal tunnel and use the work someone else has already done for you :) +As we just said, there are usually common data sources that data teams will extract from, regardless of business. Instead of writing transformations for these tables and data sources, leverage [dbt packages](https://hub.getdbt.com/) to save yourself some carpal tunnel and use the work someone else has already done for you. ::: ## Data extraction tools diff --git a/website/docs/terms/table.md b/website/docs/terms/table.md index cbe36ec1315..bfc4e680660 100644 --- a/website/docs/terms/table.md +++ b/website/docs/terms/table.md @@ -5,9 +5,6 @@ description: "Read this guide to understand how tables work in dbt." displayText: table hoverSnippet: In simplest terms, a table is the direct storage of data in rows and columns. Think excel sheet with raw values in each of the cells. --- -:::important This page could use some love -This term would benefit from additional depth and examples. Have knowledge to contribute? 
[Create an issue in the docs.getdbt.com repository](https://github.com/dbt-labs/docs.getdbt.com/issues/new/choose) to begin the process of becoming a glossary contributor! -::: In simplest terms, a table is the direct storage of data in rows and columns. Think excel sheet with raw values in each of the cells. diff --git a/website/snippets/_cloud-environments-info.md b/website/snippets/_cloud-environments-info.md index 4a882589f77..2083d8f07ec 100644 --- a/website/snippets/_cloud-environments-info.md +++ b/website/snippets/_cloud-environments-info.md @@ -1,4 +1,3 @@ - ## Types of environments In dbt Cloud, there are two types of environments: @@ -47,7 +46,7 @@ For more info, check out this [FAQ page on this topic](/faqs/Environments/custom ### Extended attributes :::note -Extended attributes are retrieved and applied only at runtime when `profiles.yml` is requested for a specific Cloud run. Extended attributes are currently _not_ taken into consideration for Cloud-specific features such as PrivateLink or SSH Tunneling that do not rely on `profiles.yml` values. +Extended attributes are retrieved and applied only at runtime when `profiles.yml` is requested for a specific Cloud run. Extended attributes are currently _not_ taken into consideration for SSH Tunneling, which does not rely on `profiles.yml` values. ::: Extended Attributes is a feature that allows users to set a flexible [profiles.yml](/docs/core/connect-data-platform/profiles.yml) snippet in their dbt Cloud Environment settings. It provides users with more control over environments (both deployment and development) and extends how dbt Cloud connects to the data platform within a given environment. @@ -109,7 +108,4 @@ Partial parsing in dbt Cloud requires dbt version 1.4 or newer. The feature does To enable, select **Account settings** from the gear menu and enable the **Partial parsing** option. 
- - - - + \ No newline at end of file diff --git a/website/snippets/_new-sl-setup.md b/website/snippets/_new-sl-setup.md index a02481db33d..a93f233d09c 100644 --- a/website/snippets/_new-sl-setup.md +++ b/website/snippets/_new-sl-setup.md @@ -1,14 +1,12 @@ You can set up the dbt Semantic Layer in dbt Cloud at the environment and project level. Before you begin: -- You must have a dbt Cloud Team or Enterprise account. Suitable for both Multi-tenant and Single-tenant deployment. - - Single-tenant accounts should contact their account representative for necessary setup and enablement. - You must be part of the Owner group, and have the correct [license](/docs/cloud/manage-access/seats-and-users) and [permissions](/docs/cloud/manage-access/self-service-permissions) to configure the Semantic Layer: * Enterprise plan — Developer license with Account Admin permissions. Or Owner with a Developer license, assigned Project Creator, Database Admin, or Admin permissions. * Team plan — Owner with a Developer license. - You must have a successful run in your new environment. :::tip -If you've configured the legacy Semantic Layer, it has been deprecated, and dbt Labs strongly recommends that you [upgrade your dbt version](/docs/dbt-versions/upgrade-core-in-cloud) to dbt version 1.6 or higher to use the latest dbt Semantic Layer. Refer to the dedicated [migration guide](/guides/sl-migration) for details. +If you've configured the legacy Semantic Layer, it has been deprecated. dbt Labs strongly recommends that you [upgrade your dbt version](/docs/dbt-versions/upgrade-core-in-cloud) to dbt version 1.6 or higher to use the latest dbt Semantic Layer. Refer to the dedicated [migration guide](/guides/sl-migration) for details. ::: 1. In dbt Cloud, create a new [deployment environment](/docs/deploy/deploy-environments#create-a-deployment-environment) or use an existing environment on dbt 1.6 or higher. 
@@ -20,7 +18,10 @@ If you've configured the legacy Semantic Layer, it has been deprecated, and dbt -4. In the **Set Up Semantic Layer Configuration** page, enter the credentials you want the Semantic Layer to use specific to your data platform. We recommend credentials have the least privileges required because your Semantic Layer users will be querying it in downstream applications. At a minimum, the Semantic Layer needs to have read access to the schema(s) that contains the dbt models that you used to build your semantic models. +4. In the **Set Up Semantic Layer Configuration** page, enter the credentials you want the Semantic Layer to use specific to your data platform. + + - Use credentials with the least privileges required, because your Semantic Layer users will query through them in downstream applications. At a minimum, the Semantic Layer needs read access to the schema(s) containing the dbt models used to build your semantic models. + - Note: [Environment variables](/docs/build/environment-variables), such as `{{env_var('DBT_WAREHOUSE')}}`, aren't supported in the dbt Semantic Layer yet. You must use the actual credentials. @@ -28,13 +29,10 @@ If you've configured the legacy Semantic Layer, it has been deprecated, and dbt 6. After saving it, you'll be provided with the connection information that allows you to connect to downstream tools. If your tool supports JDBC, save the JDBC URL or individual components (like environment id and host). If it uses the GraphQL API, save the GraphQL API host information instead. - + 7. Save and copy your environment ID, service token, and host, which you'll need to use downstream tools. For more info on how to integrate with partner integrations, refer to [Available integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations). 8. Return to the **Project Details** page, then select **Generate Service Token**. You will need Semantic Layer Only and Metadata Only [service token](/docs/dbt-cloud-apis/service-tokens) permissions. - - -Great job, you've configured the Semantic Layer 🎉! 
- +Great job, you've configured the Semantic Layer 🎉! diff --git a/website/snippets/_upgrade-move.md b/website/snippets/_upgrade-move.md deleted file mode 100644 index 7572077fd1b..00000000000 --- a/website/snippets/_upgrade-move.md +++ /dev/null @@ -1,5 +0,0 @@ -:::important Upgrade Guides Are Moving - -The location of the dbt Core upgrade guides has changed, and they will soon be removed from `Guides`. The new location is in the `Docs` tab under `Available dbt versions`. You have been redirected to the new URL, so please update any saved links and bookmarks. - -::: \ No newline at end of file diff --git a/website/snippets/available-beta-banner.md b/website/snippets/available-beta-banner.md deleted file mode 100644 index 15d365a84b1..00000000000 --- a/website/snippets/available-beta-banner.md +++ /dev/null @@ -1,3 +0,0 @@ -:::info Beta feature -This feature is currently in beta and subject to change. If you are interested in getting access to the beta, please [contact us](mailto:support@getdbt.com). -::: diff --git a/website/snippets/available-prerelease-banner.md b/website/snippets/available-prerelease-banner.md deleted file mode 100644 index 3531a2f646f..00000000000 --- a/website/snippets/available-prerelease-banner.md +++ /dev/null @@ -1,7 +0,0 @@ -:::info Release candidate -dbt Core v1.2 is now available as a **release candidate**. - -For more information on prereleases, see ["About Core versions: Trying prereleases"](core-versions#trying-prereleases). - -Join the [#dbt-prereleases](https://getdbt.slack.com/archives/C016X6ABVUK) channel in the Community Slack so you can be the first to read about prereleases as soon as they're available! 
-::: diff --git a/website/snippets/quickstarts/schedule-a-job.md b/website/snippets/quickstarts/schedule-a-job.md index ab8f4350dbf..70848388f35 100644 --- a/website/snippets/quickstarts/schedule-a-job.md +++ b/website/snippets/quickstarts/schedule-a-job.md @@ -35,9 +35,9 @@ As the `jaffle_shop` business gains more customers, and those customers create m 8. Click the run and watch its progress under "Run history." 9. Once the run is complete, click **View Documentation** to see the docs for your project. -:::tip + Congratulations 🎉! You've just deployed your first dbt project! -::: + #### FAQs diff --git a/website/snippets/sl-considerations-banner.md b/website/snippets/sl-considerations-banner.md deleted file mode 100644 index 33cfb5edac5..00000000000 --- a/website/snippets/sl-considerations-banner.md +++ /dev/null @@ -1,8 +0,0 @@ -:::caution Considerations - -Some important considerations to know about using the dbt Semantic Layer during the Public Preview: - -- Support for Snowflake data platform only (_additional data platforms coming soon_) -- Support for the deployment environment only (_development experience coming soon_) - -::: diff --git a/website/snippets/test-snippet.md b/website/snippets/test-snippet.md deleted file mode 100644 index c1de326aa7a..00000000000 --- a/website/snippets/test-snippet.md +++ /dev/null @@ -1,8 +0,0 @@ ---- ---- - -### Header 2 - -Lorem ipsum dolor sit amet, consectetur adipiscing elit. Nullam fermentum porttitor dui, id scelerisque enim scelerisque at. Proin imperdiet sem sed magna ornare, sit amet rutrum ligula vehicula. Aenean eget magna placerat, dictum velit sed, dapibus quam. Maecenas lectus tellus, dictum semper gravida vel, feugiat vitae nibh. Vestibulum nec lorem nibh. Fusce nisi felis, tincidunt ac scelerisque ut, aliquam in eros. Praesent euismod dolor ac lacinia laoreet. Phasellus orci orci, scelerisque vitae mollis id, consectetur ut libero. Aenean diam leo, tempor ut vulputate in, laoreet id ipsum. 
Quisque gravida et ex id eleifend. Etiam ultricies erat diam. Morbi hendrerit, ligula non aliquam tempus, erat elit suscipit quam, eu cursus quam nisi sit amet dui. Cras iaculis risus vel enim tempor molestie. - -Curabitur a porttitor odio. Curabitur sit amet tristique ante. Ut eleifend, erat eget imperdiet accumsan, quam diam sodales dolor, vulputate consequat lacus felis non sapien. Nam et nunc sed diam congue rutrum nec non massa. Nam eget fermentum sem. Nam ac imperdiet massa. Phasellus a elementum dui. diff --git a/website/snippets/tutorial-create-new-dbt-cloud-account.md b/website/snippets/tutorial-create-new-dbt-cloud-account.md deleted file mode 100644 index bdde874d0c9..00000000000 --- a/website/snippets/tutorial-create-new-dbt-cloud-account.md +++ /dev/null @@ -1,10 +0,0 @@ -Let's start this section by creating a dbt Cloud account if you haven't already. - -1. Navigate to [dbt Cloud](https://cloud.getdbt.com). -2. If you don't have a dbt Cloud account, create a new one, and verify your account via email. -3. If you already have a dbt Cloud account, you can create a new project from your existing account: - 1. Click the gear icon in the top-right, then click **Projects**. - 2. Click **+ New Project**. -4. You've arrived at the "Setup a New Project" page. -5. Type "Analytics" in the dbt Project Name field. You will be able to rename this project later. -6. Click **Continue**. \ No newline at end of file diff --git a/website/snippets/tutorial-initiate-project.md b/website/snippets/tutorial-initiate-project.md deleted file mode 100644 index 008b6bdf487..00000000000 --- a/website/snippets/tutorial-initiate-project.md +++ /dev/null @@ -1,44 +0,0 @@ -Now that you have a repository configured, you can initialize your project and start development in dbt Cloud: - -1. Click **Develop** from the upper left. 
It might take a few minutes for your project to spin up for the first time as it establishes your git connection, clones your repo, and tests the connection to the warehouse. -2. Above the file tree to the left, click **Initialize your project**. This builds out your folder structure with example models. -3. Make your initial commit by clicking **Commit**. Use the commit message `initial commit`. This creates the first commit to your managed repo and allows you to open a branch where you can add new dbt code. -4. Now you should be able to **directly query data from your warehouse** and **execute dbt run**. Paste your following warehouse-specific code in the IDE: - - - -
- -```sql -select * from `dbt-tutorial.jaffle_shop.customers` -``` - -
- -
- -```sql -select * from default.jaffle_shop_customers -``` - -
- -
- -```sql -select * from jaffle_shop.customers -``` - -
- -
- -```sql -select * from raw.jaffle_shop.customers -``` - -
- -
- -- In the command line bar at the bottom, type `dbt run` and click **Enter**. We will explore what happens in the next section of the tutorial. diff --git a/website/static/img/blog/authors/ejohnston.png b/website/static/img/blog/authors/ejohnston.png new file mode 100644 index 00000000000..09fc4ed7ba3 Binary files /dev/null and b/website/static/img/blog/authors/ejohnston.png differ diff --git a/website/static/img/blog/serverless-free-tier-data-stack-with-dlt-and-dbt-core/architecture_diagram.png b/website/static/img/blog/serverless-free-tier-data-stack-with-dlt-and-dbt-core/architecture_diagram.png new file mode 100644 index 00000000000..ad10d32c2e7 Binary files /dev/null and b/website/static/img/blog/serverless-free-tier-data-stack-with-dlt-and-dbt-core/architecture_diagram.png differ diff --git a/website/static/img/blog/serverless-free-tier-data-stack-with-dlt-and-dbt-core/map_screenshot.png b/website/static/img/blog/serverless-free-tier-data-stack-with-dlt-and-dbt-core/map_screenshot.png new file mode 100644 index 00000000000..da8309c2510 Binary files /dev/null and b/website/static/img/blog/serverless-free-tier-data-stack-with-dlt-and-dbt-core/map_screenshot.png differ diff --git a/website/vercel.json b/website/vercel.json index f9dd018357b..b662e1c2144 100644 --- a/website/vercel.json +++ b/website/vercel.json @@ -3633,8 +3633,8 @@ "permanent": true }, { - "source": "/docs/writing-code-in-dbt/jinja-context/as_text", - "destination": "/reference/dbt-jinja-functions/as_text", + "source": "/reference/dbt-jinja-functions/as_text", + "destination": "/reference/dbt-jinja-functions", "permanent": true }, {