Merge branch 'current' into mwong-mega-sl-clou-cli
mirnawong1 authored Oct 10, 2023
2 parents 84907e5 + d28b0ee commit 9102625
Showing 13 changed files with 185 additions and 48 deletions.
4 changes: 2 additions & 2 deletions .github/workflows/labeler.yml
@@ -5,8 +5,8 @@

name: "Pull Request Labeler"
on:
- pull_request_target

  pull_request_target:
    types: [opened]
jobs:
  triage:
    permissions:
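The hunk above changes the workflow trigger from a bare list item (`- pull_request_target`) to a mapping so the `types` filter can be applied. A minimal sketch of the full workflow under that trigger — the job body beyond `permissions:` is cut off in the diff, so the permissions, runner, and `actions/labeler` step below are assumptions:

```yaml
name: "Pull Request Labeler"
on:
  pull_request_target:
    types: [opened]    # label only when the PR is first opened
jobs:
  triage:
    permissions:
      contents: read        # assumed: read the labeler config from the repo
      pull-requests: write  # assumed: apply labels to the PR
    runs-on: ubuntu-latest
    steps:
      - uses: actions/labeler@v4   # assumed action and version
```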
15 changes: 9 additions & 6 deletions website/docs/docs/build/semantic-models.md
@@ -8,17 +8,20 @@ sidebar_label: Semantic models
tags: [Metrics, Semantic Layer]
---

Semantic models serve as the foundation for defining data in MetricFlow, which powers the dbt Semantic Layer. You can think of semantic models as nodes in your semantic graph, connected via entities as edges. MetricFlow takes semantic models defined in YAML configuration files as inputs and creates a semantic graph that can be used to query metrics.
Semantic models are the foundation for data definition in MetricFlow, which powers the dbt Semantic Layer:

Each semantic model corresponds to a dbt model in your DAG. Therefore you will have one YAML config for each semantic model in your dbt project. You can create multiple semantic models out of a single dbt model, as long as you give each semantic model a unique name.

You can configure semantic models in your dbt project directory in a `YAML` file. Depending on your project structure, you can nest semantic models under a `metrics:` folder or organize them under project sources.
- Think of semantic models as nodes connected by entities in a semantic graph.
- MetricFlow uses YAML configuration files to create this graph for querying metrics.
- Each semantic model corresponds to a dbt model in your DAG, requiring a unique YAML configuration for each semantic model.
- You can create multiple semantic models from a single dbt model, as long as you give each semantic model a unique name.
- Configure semantic models in a YAML file within your dbt project directory.
- Organize them under a `metrics:` folder or within project sources as needed.
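Taken together, the bullets above describe one YAML entry per semantic model. A minimal sketch — all model, entity, dimension, and measure names here are hypothetical:

```yaml
semantic_models:
  - name: orders                       # must be unique; avoid double underscores (__)
    description: Order fact table at the order grain.
    model: ref('fct_orders')           # the dbt model this semantic model is built on
    defaults:
      agg_time_dimension: ordered_at   # default time dimension for aggregations
    entities:
      - name: order_id
        type: primary                  # entities are the edges of the semantic graph
    dimensions:
      - name: ordered_at
        type: time
        type_params:
          time_granularity: day
    measures:
      - name: order_total
        agg: sum
```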

Semantic models have 6 components and this page explains the definitions with some examples:

| Component | Description | Type |
| --------- | ----------- | ---- |
| [Name](#name) | Unique name for the semantic model | Required |
| [Name](#name) | Choose a unique name for the semantic model. Avoid using double underscores (__) in the name as they're not supported. | Required |
| [Description](#description) | Includes important details in the description | Optional |
| [Model](#model) | Specifies the dbt model for the semantic model using the `ref` function | Required |
| [Defaults](#defaults) | The defaults for the model, currently only `agg_time_dimension` is supported. | Required |
@@ -107,7 +110,7 @@
### Name
Define the name of the semantic model. You must define a unique name for the semantic model. The semantic graph will use this name to identify the model, and you can update it at any time.
Define the name of the semantic model. You must define a unique name for the semantic model. The semantic graph will use this name to identify the model, and you can update it at any time. Avoid using double underscores (__) in the name as they're not supported.
### Description
1 change: 1 addition & 0 deletions website/docs/docs/collaborate/git-version-control.md
@@ -22,3 +22,4 @@ When you develop in the command line interface (CLI) or Cloud integrated develo
- [Merge conflicts](/docs/collaborate/git/merge-conflicts)
- [Connect to GitHub](/docs/cloud/git/connect-github)
- [Connect to GitLab](/docs/cloud/git/connect-gitlab)
- [Connect to Azure DevOps](/docs/cloud/git/connect-azure-devops)
@@ -0,0 +1,15 @@
---
title: "Enhancement: Native support for the dbt retry command"
description: "October 2023: Rerun errored jobs from start or from the failure point"
sidebar_label: "Enhancement: Support for dbt retry"
tags: [Oct-2023, Scheduler]
date: 2023-10-06
sidebar_position: 10
---

Previously in dbt Cloud, you could only rerun an errored job from start but now you can also rerun it from its point of failure.

You can view which job failed to complete successfully, which command failed in the run step, and choose how to rerun it. To learn more, refer to [Retry jobs](/docs/deploy/retry-jobs).


<Lightbox src="/img/docs/deploy/native-retry.gif" width="70%" title="Example of the Rerun options in dbt Cloud"/>
@@ -0,0 +1,38 @@
---
title: "September 2023 product docs updates"
id: "product-docs-sept"
description: "September 2023: The Product docs team merged 107 PRs, made various updates to dbt Cloud and Core, such as GAing continuous integration jobs, Semantic Layer GraphQL API doc, a new community plugin, and more"
sidebar_label: "Update: Product docs changes"
tags: [Sept-2023, product-docs]
date: 2023-10-10
sidebar_position: 09
---

Hello from the dbt Docs team: @mirnawong1, @matthewshaver, @nghi-ly, and @runleonarun! First, we’d like to thank the 15 new community contributors to docs.getdbt.com. We merged [107 PRs](https://github.com/dbt-labs/docs.getdbt.com/pulls?q=is%3Apr+merged%3A2023-09-01..2023-09-31) in September.

Here's what's new to [docs.getdbt.com](http://docs.getdbt.com/):

* Migrated docs.getdbt.com from Netlify to Vercel.

## ☁ Cloud projects
- Continuous integration jobs are now generally available and no longer in beta!
- Added [Postgres PrivateLink set up page](/docs/cloud/secure/postgres-privatelink)
- Published beta docs for [dbt Explorer](/docs/collaborate/explore-projects).
- Added a new Semantic Layer [GraphQL API doc](/docs/dbt-cloud-apis/sl-graphql) and updated the [integration docs](/docs/use-dbt-semantic-layer/avail-sl-integrations) to include Hex. Responded to dbt community feedback and clarified Metricflow use cases for dbt Core and dbt Cloud.
- Added an [FAQ](/faqs/Git/git-migration) describing how to migrate from one git provider to another in dbt Cloud.
- Clarified an example and added a [troubleshooting section](/docs/cloud/connect-data-platform/connect-snowflake#troubleshooting) to Snowflake connection docs to address common errors and provide solutions.


## 🎯 Core projects

- Deprecated dbt Core v1.0 and v1.1 from the docs.
- Added configuration instructions for the [AWS Glue](/docs/core/connect-data-platform/glue-setup) community plugin.
- Revised the dbt Core quickstart, making it easier to follow. Divided this guide into steps that align with the [other guides](/quickstarts/manual-install?step=1).

## New 📚 Guides, ✏️ blog posts, and FAQs

Added a [style guide template](/guides/best-practices/how-we-style/6-how-we-style-conclusion#style-guide-template) that you can copy & paste to make sure you adhere to best practices when styling dbt projects!

## Upcoming changes

Stay tuned for a flurry of releases in October and a filterable guides section that will make guides easier to find!
32 changes: 32 additions & 0 deletions website/docs/docs/deploy/retry-jobs.md
@@ -0,0 +1,32 @@
---
title: "Retry your dbt jobs"
sidebar_label: "Retry jobs"
description: "Rerun your errored jobs from start or the failure point."
---

If your dbt job run completed with a status of `result:error`, you can rerun it from start or from the point of failure in dbt Cloud.

## Prerequisites

- You have a [dbt Cloud account](https://www.getdbt.com/signup).
- You must be using [dbt version](/docs/dbt-versions/upgrade-core-in-cloud) 1.6 or newer.
- The most recent run of the job hasn't completed successfully. The latest status of the run is `error`.
- The job command that failed in the run must be one that supports the [retry command](/reference/commands/retry).

## Rerun an errored job

1. Select **Deploy** from the top navigation bar and choose **Run History**.
2. Choose the job run that has errored.
3. In the **Run Summary** tab on the job’s **Run** page, expand the run step that failed. An :x: denotes the failed step.
4. Examine the error message and determine how to fix it. After you have made your changes, save and commit them to your [Git repo](/docs/collaborate/git-version-control).
5. Return to your job’s **Run** page. In the upper right corner, click **Rerun** and choose **Rerun from start** or **Rerun from failure**.

If you chose to rerun from the failure point, a **Rerun failed steps** modal opens. The modal lists the run steps that will be invoked: the failed step and any skipped steps. To confirm these run steps, click **Rerun from failure**. The job reruns from the failed command in the previously failed run. A banner at the top of the **Run Summary** tab captures this with the message, "This run resumed execution from last failed step".
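dbt Cloud's **Rerun from failure** is built on dbt Core's [`retry` command](/reference/commands/retry). A rough local equivalent — assuming dbt 1.6 or newer and that the previous failed invocation ran in the same project directory, so its state is still in `target/` — looks like:

```shell
dbt build    # initial invocation; some nodes fail and their downstream nodes are skipped
dbt retry    # reruns only the failed nodes and everything that was skipped after them
```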

<Lightbox src="/img/docs/deploy/native-retry.gif" width="70%" title="Example of the Rerun options in dbt Cloud"/>

## Related content

- [Run visibility](/docs/deploy/run-visibility)
- [Jobs](/docs/deploy/jobs)
- [Job commands](/docs/deploy/job-commands)
11 changes: 0 additions & 11 deletions website/docs/faqs/Models/reference-models-in-another-project.md

This file was deleted.

20 changes: 10 additions & 10 deletions website/docs/quickstarts/manual-install-qs.md
@@ -103,16 +103,16 @@ When developing locally, dbt connects to your <Term id="data-warehouse" /> using
jaffle_shop: # this needs to match the profile in your dbt_project.yml file
target: dev
outputs:
dev:
type: bigquery
method: service-account
keyfile: /Users/BBaggins/.dbt/dbt-tutorial-project-331118.json # replace this with the full path to your keyfile
project: grand-highway-265418 # Replace this with your project id
dataset: dbt_bbagins # Replace this with dbt_your_name, e.g. dbt_bilbo
threads: 1
timeout_seconds: 300
location: US
priority: interactive
dev:
type: bigquery
method: service-account
keyfile: /Users/BBaggins/.dbt/dbt-tutorial-project-331118.json # replace this with the full path to your keyfile
project: grand-highway-265418 # Replace this with your project id
dataset: dbt_bbagins # Replace this with dbt_your_name, e.g. dbt_bilbo
threads: 1
timeout_seconds: 300
location: US
priority: interactive
```
</File>
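With a profile like the one above in place, a standard sanity check before running anything is `dbt debug`, which validates the profile and the warehouse connection (shown here as a suggested step, not part of the diff):

```shell
dbt debug    # checks profiles.yml, dbt_project.yml, and connectivity to the warehouse
```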
1 change: 1 addition & 0 deletions website/sidebars.js
@@ -352,6 +352,7 @@ const sidebarSettings = {
link: { type: "doc", id: "docs/deploy/monitor-jobs" },
items: [
"docs/deploy/run-visibility",
"docs/deploy/retry-jobs",
"docs/deploy/job-notifications",
"docs/deploy/webhooks",
"docs/deploy/artifacts",
19 changes: 10 additions & 9 deletions website/snippets/quickstarts/schedule-a-job.md
@@ -24,15 +24,16 @@ Jobs are a set of dbt commands that you want to run on a schedule. For example,

As the `jaffle_shop` business gains more customers, and those customers create more orders, you will see more records added to your source data. Because you materialized the `customers` model as a table, you'll need to periodically rebuild your table to ensure that the data stays up-to-date. This update will happen when you run a job.

1. After creating your deployment environment, you should be directed to the page for new environment. If not, select **Deploy** in the upper left, then click **Jobs**.
2. Click **Create one** and provide a name, for example "Production run", and link to the Environment you just created.
3. Scroll down to "Execution Settings" and select **Generate docs on run**.
4. Under "Commands," add this command as part of your job if you don't see them:
* `dbt build`
5. For this exercise, do _not_ set a schedule for your project to run &mdash; while your organization's project should run regularly, there's no need to run this example project on a schedule. Scheduling a job is sometimes referred to as _deploying a project_.
6. Select **Save**, then click **Run now** to run your job.
7. Click the run and watch its progress under "Run history."
8. Once the run is complete, click **View Documentation** to see the docs for your project.
1. After creating your deployment environment, you should be directed to the page for a new environment. If not, select **Deploy** in the upper left, then click **Jobs**.
2. Click **Create one** and provide a name, for example, "Production run", and link to the Environment you just created.
3. Scroll down to the **Execution Settings** section.
4. Under **Commands**, add this command as part of your job if you don't see it:
* `dbt build`
5. Select the **Generate docs on run** checkbox to automatically [generate updated project docs](/docs/collaborate/build-and-view-your-docs) each time your job runs.
6. For this exercise, do _not_ set a schedule for your project to run &mdash; while your organization's project should run regularly, there's no need to run this example project on a schedule. Scheduling a job is sometimes referred to as _deploying a project_.
7. Select **Save**, then click **Run now** to run your job.
8. Click the run and watch its progress under "Run history."
9. Once the run is complete, click **View Documentation** to see the docs for your project.

:::tip
Congratulations 🎉! You've just deployed your first dbt project!
67 changes: 57 additions & 10 deletions website/src/css/custom.css
@@ -58,7 +58,7 @@
--pagination-icon-prev: "\2190";
--filter-brightness-low: 1.1;
--filter-brightness-high: 1.5;

--darkmode-link-color: #1FA4A3;
--light-dark-toggle: "data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMTYiIGhlaWdodD0iMTYiIGZpbGw9Im5vbmUiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+PHBhdGggZD0iTTQuMzA4IDMuMzg1YzAtMS4xNzguMTczLTIuMzcuNjE1LTMuMzg1QzEuOTgzIDEuMjggMCA0LjI4MiAwIDcuNjkyQTguMzA4IDguMzA4IDAgMCAwIDguMzA4IDE2YzMuNDEgMCA2LjQxMi0xLjk4MyA3LjY5Mi00LjkyMy0xLjAxNS40NDItMi4yMDcuNjE1LTMuMzg1LjYxNWE4LjMwOCA4LjMwOCAwIDAgMS04LjMwNy04LjMwN1oiIGZpbGw9IiM5MkEwQjMiLz48L3N2Zz4=";

/* search overrides */
@@ -104,10 +104,10 @@ html[data-theme="dark"] {

/* Linked `code` tags visibility adjustment */
html[data-theme=dark] a code {
color: var(--ifm-link-color);
color: var(--darkmode-link-color);
}
html[data-theme=dark] a code:hover {
color: var(--ifm-link-hover-color);;
color: var(--darkmode-link-color);
}

/* For /dbt-cloud/api REDOC Page */
@@ -122,11 +122,11 @@ html[data-theme="dark"] .api-content h1 {

html[data-theme="dark"] .api-content button,
html[data-theme="dark"] .api-content a {
filter: brightness(1.25);
filter: brightness(var(--filter-brightness-low));
}

html[data-theme="dark"] .api-content a:hover {
filter: brightness(1.25);
filter: brightness(var(--filter-brightness-low));
}

.redoc-wrap .api-content a,
@@ -165,8 +165,19 @@ table td {
vertical-align: top;
}

html[data-theme=dark] main .row .col:first-of-type a:not(.button) {
color: var(--darkmode-link-color);
}

html[data-theme="dark"] main .row .col:first-of-type a:hover {
filter: brightness(var(--filter-brightness-low));
}

html[data-theme="dark"] main .row .col:first-of-type a article * {
color: white;
}

html[data-theme="dark"] table td {
filter: brightness(1.5);
color: white;
}

@@ -668,6 +679,14 @@ i.theme-doc-sidebar-item-category.theme-doc-sidebar-item-category-level-2.menu__
color: var(--ifm-color-gray-900);
}

.alert--secondary,
.alert--secondary a,
.alert--secondary svg {
--ifm-alert-background-color: #474748;
color: white !important;
fill: white !important;
}

html[data-theme="dark"] .alert * {
--ifm-alert-foreground-color: var(--ifm-color-gray-900);
}
@@ -683,7 +702,7 @@ html[data-theme="dark"] .alert table {
.alert--success a,
.alert--danger a,
.alert--warning a {
color: var(--ifm-color-gray-900);
color: var(--ifm-color-gray-900) !important;
}

.linkout {
@@ -842,6 +861,14 @@ div .toggle_src-components-faqs-styles-module {
gap: 1em;
}

html[data-theme="dark"] .pagination-nav a {
color: var(--darkmode-link-color);
}

html[data-theme="dark"] .pagination-nav a:hover {
filter: brightness(var(--filter-brightness-low));
}

.pagination-nav__link {
padding: 1rem 0;
transition: 100ms all ease-in-out;
@@ -948,8 +975,13 @@ html[data-theme="dark"] .blog-breadcrumbs a[href="#"] {
filter: brightness(var(--filter-brightness-low));
}

html[data-theme="dark"] .blog-breadcrumbs a:not(:last-of-type):after {
color: var(--ifm-link-color);
html[data-theme="dark"] .blog-breadcrumbs a:hover {
filter: brightness(var(--filter-brightness-low));
}

html[data-theme="dark"] .blog-breadcrumbs a:not(:last-of-type):after,
html[data-theme="dark"] .blog-breadcrumbs a {
color: var(--darkmode-link-color);
}

html[data-theme="dark"] .breadcrumbs__item--active .breadcrumbs__link {
@@ -993,6 +1025,21 @@ article[itemprop="blogPost"] h2 {
font-size: 2rem;
}

html[data-theme="dark"] article[itemprop="blogPost"] a {
color: var(--darkmode-link-color);
}

html[data-theme="dark"] article[itemprop="blogPost"] a:hover {
filter: brightness(var(--filter-brightness-low));
}

/* Sidebar Nav */
html[data-theme="dark"] .main-wrapper nav a:hover,
html[data-theme="dark"] .main-wrapper nav a:active {
color: var(--darkmode-link-color) !important;
filter: brightness(var(--filter-brightness-low));
}

/* footer styles */
.footer {
font-weight: var(--ifm-font-weight-narrow);
@@ -1053,7 +1100,7 @@ article[itemprop="blogPost"] h2 {
/* copyright */
.footer__bottom {
text-align: left;
color: var(--color-footer-accent);
color: white;
font-size: 0.875rem;
}

Binary file added website/static/img/docs/deploy/native-retry.gif
10 changes: 10 additions & 0 deletions website/vercel.json
@@ -2,6 +2,16 @@
"cleanUrls": true,
"trailingSlash": false,
"redirects": [
{
"source": "/faqs/models/reference-models-in-another-project",
"destination": "/docs/collaborate/govern/project-dependencies",
"permanent": true
},
{
"source": "/faqs/Models/reference-models-in-another-project",
"destination": "/docs/collaborate/govern/project-dependencies",
"permanent": true
},
{
"source": "/docs/deploy/job-triggers",
"destination": "/docs/deploy/deploy-jobs",
