Merge branch 'current' into quickstarts-q32023-update
john-rock authored Oct 27, 2023
2 parents 8002ac1 + d2a313b commit 1e69534
Showing 35 changed files with 1,667 additions and 136 deletions.
7 changes: 6 additions & 1 deletion website/blog/ctas.yml
@@ -14,4 +14,9 @@
header: Join data practitioners worldwide at Coalesce 2023
subheader: Kicking off on October 16th, both online and in-person (Sydney, London, and San Diego)
button_text: Register now
url: https://coalesce.getdbt.com/?utm_medium=internal&utm_source=docs&utm_campaign=q3-2024_coalesce-2023_aw&utm_content=coalesce____&utm_term=all___
url: https://coalesce.getdbt.com/?utm_medium=internal&utm_source=docs&utm_campaign=q3-2024_coalesce-2023_aw&utm_content=coalesce____&utm_term=all___
- name: coalesce_2023_catchup
header: Missed Coalesce 2023?
subheader: Watch highlights and full sessions from Coalesce 2023, dbt Labs' annual analytics engineering conference.
button_text: Watch the talks
url: https://www.youtube.com/playlist?list=PL0QYlrC86xQnT3HLh-XgvoTf9F3lbsADf
2 changes: 1 addition & 1 deletion website/blog/metadata.yml
@@ -2,7 +2,7 @@
featured_image: ""

# This CTA lives in right sidebar on blog index
featured_cta: "coalesce_2023_signup"
featured_cta: "coalesce_2023_catchup"

# Show or hide hero title, description, cta from blog index
show_title: true
20 changes: 2 additions & 18 deletions website/docs/docs/build/about-metricflow.md
@@ -37,24 +37,6 @@ MetricFlow abides by these principles:
- **Simplicity with gradual complexity:** Approach MetricFlow using familiar data modeling concepts.
- **Performance and efficiency**: Optimize performance while supporting centralized data engineering and distributed logic ownership.

<!--MetricFlow is a SQL query generation engine that helps you create metrics by constructing appropriate queries for different granularities and dimensions that are useful for various business applications.
- It uses YAML files to define a semantic graph, which maps language to data. This graph consists of , which serve as data entry points, and [, which are functions used to create new quantitative indicators.
- MetricFlow is a () and available on dbt versions 1.6 and higher.
- MetricFlow, as a part of the dbt Semantic Layer, allows organizations to define company metrics logic through YAML abstractions, as described in the following sections.
- To query metrics dimensions, dimension values, and validate your configurations; install MetricFlow in
MetricFlow has the following principles:
- **Flexible, but complete** &mdash; Ability to create any metric on any data model by defining logic in flexible abstractions.
- **Don't Repeat Yourself (DRY)** &mdash; Avoid repetition by allowing metric definitions to be enabled whenever possible.
- **Simple with progressive complexity** &mdash; Make MetricFlow approachable by relying on known concepts and structures in data modeling.
- **Performant and efficient** &mdash; Allow for performance optimizations in centralized data engineering while still enabling distributed definition and ownership of logic.
-->

### Semantic graph

We're introducing a new concept: a "semantic graph". It's the relationship between semantic models and YAML configurations that creates a data landscape for building metrics. You can think of it like a map, where tables are like locations, and the connections between them (edges) are like roads. Although it's under the hood, the semantic graph is a subset of the <Term id="dag" />, and you can see the semantic models as nodes on the DAG.
@@ -73,6 +55,8 @@ For a semantic model, there are three main pieces of metadata:
* [Dimensions](/docs/build/dimensions) &mdash; These are the ways you want to group or slice/dice your metrics.
* [Measures](/docs/build/measures) &mdash; The aggregation functions that give you a numeric result and can be used to create your metrics.

<Lightbox src="/img/docs/dbt-cloud/semantic-layer/semantic_foundation.jpg" width="70%" title="A semantic model is made up of different components: Entities, Measures, and Dimensions."/>
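
To illustrate how these pieces fit together, here is a minimal sketch of a semantic model in YAML; the `orders` model and its field names are hypothetical examples, not prescriptions:

```yml
semantic_models:
  - name: orders
    description: Hypothetical order fact table, used only for illustration.
    model: ref('orders')
    defaults:
      agg_time_dimension: ordered_at
    entities:
      # Entities serve as data entry points and join keys
      - name: order_id
        type: primary
    dimensions:
      # Dimensions are the ways you group or slice your metrics
      - name: ordered_at
        type: time
        type_params:
          time_granularity: day
    measures:
      # Measures are the aggregations your metrics are built from
      - name: order_total
        agg: sum
```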

### Metrics

Metrics, a key concept, are functions that combine measures, constraints, or other mathematical functions to define new quantitative indicators. MetricFlow uses measures and various aggregation types, such as average, sum, and count distinct, to create metrics. Dimensions add context to metrics; without them, a metric is simply a number for all time. You can define metrics in the same YAML files as your semantic models, or create a new file.
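
Building on the sketch above, a simple metric defined from the hypothetical `order_total` measure might look like the following; as noted, it can live in the same YAML file as the semantic model or in a new one:

```yml
metrics:
  - name: order_total
    label: Order total
    description: Sum of all order totals (illustrative example).
    type: simple
    type_params:
      # Reference the measure defined in the semantic model sketch
      measure: order_total
```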
2 changes: 1 addition & 1 deletion website/docs/docs/build/analyses.md
@@ -7,7 +7,7 @@ pagination_next: null

## Overview

dbt's notion of `models` makes it easy for data teams to version control and collaborate on data transformations. Sometimes though, a certain sql statement doesn't quite fit into the mold of a dbt model. These more "analytical" sql files can be versioned inside of your dbt project using the `analysis` functionality of dbt.
dbt's notion of `models` makes it easy for data teams to version control and collaborate on data transformations. Sometimes though, a certain SQL statement doesn't quite fit into the mold of a dbt model. These more "analytical" SQL files can be versioned inside of your dbt project using the `analysis` functionality of dbt.

Any `.sql` files found in the `analyses/` directory of a dbt project will be compiled, but not executed. This means that analysts can use dbt functionality like `{{ ref(...) }}` to select from models in an environment-agnostic way.

18 changes: 11 additions & 7 deletions website/docs/docs/build/metricflow-commands.md
@@ -6,11 +6,11 @@ sidebar_label: "MetricFlow commands"
tags: [Metrics, Semantic Layer]
---

Once you define metrics in your dbt project, you can query metrics, dimensions, dimension values, and validate your configs using the MetricFlow commands.
Once you define metrics in your dbt project, you can query metrics, dimensions, and dimension values, and validate your configs using the MetricFlow commands.

MetricFlow allows you to define and query metrics in your dbt project in the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation), [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud), or [dbt Core](/docs/core/installation). To experience the power of the universal [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) and dynamically query those metrics in downstream tools, you'll need a dbt Cloud [Team or Enterprise](https://www.getdbt.com/pricing/) account.

MetricFlow is compatible with Python versions 3.8, 3.9, 3.10 and 3.11.
MetricFlow is compatible with Python versions 3.8, 3.9, 3.10, and 3.11.


## MetricFlow
@@ -25,7 +25,7 @@ MetricFlow is a dbt package that allows you to define and query metrics in your

MetricFlow commands are embedded in the dbt Cloud CLI, which means you can immediately run them once you install the dbt Cloud CLI.

A benefit to using the dbt Cloud CLI or dbt Cloud IDE is that you won't need to manage versioning &mdash; your dbt Cloud account will automatically manage the versioning for you.
A benefit of using dbt Cloud is that you won't need to manage versioning &mdash; your dbt Cloud account will automatically manage the versioning for you.

</TabItem>

@@ -35,6 +35,8 @@ A benefit to using the dbt Cloud CLI or dbt Cloud IDE is that you won't need to
You can create metrics using MetricFlow in the dbt Cloud IDE. However, support for running MetricFlow commands in the IDE will be available soon.
:::

A benefit of using dbt Cloud is that you won't need to manage versioning &mdash; your dbt Cloud account will automatically manage the versioning for you.

</TabItem>

<TabItem value="core" label="dbt Core">
@@ -44,13 +46,13 @@

Use the dbt Cloud CLI for the best experience in defining and querying metrics in your dbt project with MetricFlow, whether on dbt Cloud or dbt Core.

A benefit to using the dbt Cloud CLI or dbt Cloud IDE is that you won't need to manage versioning your dbt Cloud account will automatically manage the versioning for you.
A benefit of using dbt Cloud is that you won't need to manage versioning &mdash; your dbt Cloud account will automatically manage the versioning for you.
:::


You can install [MetricFlow](https://github.com/dbt-labs/metricflow#getting-started) from [PyPI](https://pypi.org/project/dbt-metricflow/). You need to use `pip` to instal MetricFlow on Windows or Linux operating systems:
You can install [MetricFlow](https://github.com/dbt-labs/metricflow#getting-started) from [PyPI](https://pypi.org/project/dbt-metricflow/). You need to use `pip` to install MetricFlow on Windows or Linux operating systems:

1. Create or activate your virtual environment`python -m venv venv`
1. Create or activate your virtual environment: `python -m venv venv`
2. Run `pip install dbt-metricflow`
* You can install MetricFlow using PyPI as an extension of your dbt adapter in the command line. To install the adapter, run `pip install "dbt-metricflow[your_adapter_name]"` and add the adapter name at the end of the command. For example, for a Snowflake adapter run `pip install "dbt-metricflow[snowflake]"`

@@ -60,6 +62,7 @@

</Tabs>

Note that MetricFlow `mf` commands return an error if you have the MetaFont LaTeX package installed, because MetaFont also provides an `mf` executable. To run `mf` commands, uninstall that package.

## MetricFlow commands

@@ -77,7 +80,7 @@ Use the `dbt sl` prefix before the command name to execute them in dbt Cloud. Fo
- [`list entities`](#list-entities) &mdash; Lists all unique entities.
- [`query`](#query) &mdash; Query metrics and dimensions you want to see in the command line interface. Refer to [query examples](#query-examples) to help you get started.

<!--below commands aren't support in dbt cloud yet
<!--below commands aren't supported in dbt cloud yet
- [`validate-configs`](#validate-configs) &mdash; Validates semantic model configurations.
- [`health-checks`](#health-checks) &mdash; Performs data platform health check.
- [`tutorial`](#tutorial) &mdash; Dedicated MetricFlow tutorial to help get you started.
@@ -552,3 +555,4 @@ Keep in mind that modifying your shell configuration files can have an impact on
</details>
</details>
2 changes: 2 additions & 0 deletions website/docs/docs/build/semantic-models.md
@@ -18,6 +18,8 @@ Semantic models are the foundation for data definition in MetricFlow, which powe
- Configure semantic models in a YAML file within your dbt project directory.
- Organize them under a `metrics:` folder or within project sources as needed.

<Lightbox src="/img/docs/dbt-cloud/semantic-layer/semantic_foundation.jpg" width="70%" title="A semantic model is made up of different components: Entities, Measures, and Dimensions."/>

Semantic models have 6 components and this page explains the definitions with some examples:

| Component | Description | Type |
1 change: 1 addition & 0 deletions website/docs/docs/build/sl-getting-started.md
@@ -94,3 +94,4 @@ The dbt Semantic Layer is proprietary, however, some components of the dbt Seman
- [Build your metrics](/docs/build/build-metrics-intro)
- [Get started with the dbt Semantic Layer](/docs/use-dbt-semantic-layer/quickstart-sl)
- [Available integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations)
- Demo on [how to define and query metrics with MetricFlow](https://www.loom.com/share/60a76f6034b0441788d73638808e92ac?sid=861a94ac-25eb-4fd8-a310-58e159950f5a)
14 changes: 7 additions & 7 deletions website/docs/docs/cloud/about-cloud-develop-defer.md
@@ -17,29 +17,29 @@ Both the dbt Cloud IDE and the dbt Cloud CLI allow users to natively defer to pr
- This can be set for one deployment environment per dbt Cloud project.
- You must have a successful job run first.

When using 'defer', it compares artifacts from the most recent successful production job, excluding CI jobs.
When using defer, dbt Cloud compares against artifacts from the most recent successful production job, excluding CI jobs.

### Defer in the dbt Cloud IDE

To enable 'Defer' in the dbt Cloud IDE, toggle the **Defer to production** button on the command bar. Once enabled, dbt Cloud will:
To enable defer in the dbt Cloud IDE, toggle the **Defer to production** button on the command bar. Once enabled, dbt Cloud will:

1. Pull down the most recent manifest from the Production environment for comparison
2. Pass the `--defer` flag to the command (for any command that accepts the flag)

For example, if you were to start developing on a new branch with [nothing in your development schema](/reference/node-selection/defer#usage), edit a single model, and run `dbt build -s state:modified` &mdash; only the edited model would run. Any `{{ ref() }}` functions will point to the production location of the referenced models.

<Lightbox src="/img/docs/dbt-cloud/defer-toggle.jpg" width="100%" title="Select the 'Defer to production' toggle on the botom right of the command bar to enable defer in the dbt Cloud IDE."/>
<Lightbox src="/img/docs/dbt-cloud/defer-toggle.jpg" width="100%" title="Select the 'Defer to production' toggle on the bottom right of the command bar to enable defer in the dbt Cloud IDE."/>

### Defer in dbt Cloud CLI

One key difference between using `--defer` in the dbt Cloud CLI and the dbt Cloud IDE is that `--defer` is *automatically* enabled in the dbt Cloud CLI for all invocations, comparing with production artifacts. You can disable it with the `--no-defer` flag.
One key difference between using `--defer` in the dbt Cloud CLI and the dbt Cloud IDE is that `--defer` is *automatically* enabled in the dbt Cloud CLI for all invocations, compared with production artifacts. You can disable it with the `--no-defer` flag.

The dbt Cloud CLI offers additional flexibility by letting you choose the source environment for deferral artifacts. You can set a `defer-env-id` key in either your `dbt_project.yml` or `dbt_cloud.yml` file. If you do not provide a `defer-env-id` setting, the dbt Cloud CLI will use artifacts from your dbt Cloud environment marked 'Production'.
The dbt Cloud CLI offers additional flexibility by letting you choose the source environment for deferral artifacts. You can set a `defer-env-id` key in either your `dbt_project.yml` or `dbt_cloud.yml` file. If you do not provide a `defer-env-id` setting, the dbt Cloud CLI will use artifacts from your dbt Cloud environment marked "Production".

<File name="dbt_cloud.yml">

```yml
dever-env-id: '123456'
defer-env-id: '123456'
```
</File>
@@ -49,7 +49,7 @@
```yml
dbt_cloud:
dever-env-id: '123456'
defer-env-id: '123456'
```
</File>