Correct SL guide inconsistencies (#5343)
This PR updates the SL quickstart in response to feedback by doing the
following:
- Clarifies how users can correctly interact with metrics via the IDE.
- Updates inconsistent file path guidance by focusing on the file name
rather than prepending the directory/folder name in the guidance steps.
- Updates the Snowflake Partner Connect location.

Resolves #5339 #5341 #5338
mirnawong1 authored Apr 24, 2024
2 parents f080c81 + 2cd746d commit ab6b931
Showing 2 changed files with 79 additions and 15 deletions.
84 changes: 74 additions & 10 deletions website/docs/guides/sl-snowflake-qs.md
@@ -273,7 +273,7 @@ There are two ways to connect dbt Cloud to Snowflake. The first option is Partne

Using Partner Connect allows you to create a complete dbt account with your [Snowflake connection](/docs/cloud/connect-data-platform/connect-snowflake), [a managed repository](/docs/collaborate/git/managed-repository), [environments](/docs/build/custom-schemas#managing-environments), and credentials.

1. In the Snowflake UI, click on the home icon in the upper left corner. In the left sidebar, select **Admin**. Then, select **Partner Connect**. Find the dbt tile by scrolling or by searching for dbt in the search bar. Click the tile to connect to dbt.
1. In the Snowflake UI, click on the home icon in the upper left corner. In the left sidebar, select **Data Products**. Then, select **Partner Connect**. Find the dbt tile by scrolling or by searching for dbt in the search bar. Click the tile to connect to dbt.

<Lightbox src="/img/snowflake_tutorial/snowflake_partner_connect_box.png" title="Snowflake Partner Connect Box" />

@@ -347,7 +347,11 @@ If you used Partner Connect, you can skip to [initializing your dbt project](#in
<Snippet path="tutorial-managed-repo" />

## Initialize your dbt project and start developing
Now that you have a repository configured, you can initialize your project and start development in dbt Cloud:
This guide assumes you use the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) to develop your dbt project and define metrics. However, the dbt Cloud IDE doesn't support using [MetricFlow commands](/docs/build/metricflow-commands) to query or preview metrics (support coming soon).

To query and preview metrics in your development tool, you can use the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) to run the [MetricFlow commands](/docs/build/metricflow-commands).
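
For example, once you've installed and configured the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation), you could list and preview metrics from your terminal. This is a minimal sketch only; the metric name (`order_count`) comes from the metrics you define later in this guide, and `metric_time` is an assumed standard time dimension backed by the time spine you'll create:

```shell
# List every metric defined in the project
dbt sl list metrics

# Preview a metric, grouped by time
# (metric_time assumes the default time dimension from your time spine)
dbt sl query --metrics order_count --group-by metric_time
```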

Now that you have a repository configured, you can initialize your project and start development in dbt Cloud using the IDE:

1. Click **Start developing in the dbt Cloud IDE**. It might take a few minutes for your project to spin up for the first time as it establishes your git connection, clones your repo, and tests the connection to the warehouse.
2. Above the file tree to the left, click **Initialize your project**. This builds out your folder structure with example models.
@@ -378,6 +382,8 @@ Name the new branch `build-project`.
2. Name the file `staging/jaffle_shop/src_jaffle_shop.yml`, then click **Create**.
3. Copy the following text into the file and click **Save**.

<File name='models/staging/jaffle_shop/src_jaffle_shop.yml'>

```yaml
version: 2
@@ -390,6 +396,8 @@ sources:
- name: orders
```

</File>

:::tip
In your source file, you can also use the **Generate model** button to create a new model file for each source. This creates a new file in the `models` directory with the given source name and fills in the SQL code of the source definition.
:::
@@ -398,6 +406,8 @@ In your source file, you can also use the **Generate model** button to create a
5. Name the file `staging/stripe/src_stripe.yml`, then click **Create**.
6. Copy the following text into the file and click **Save**.

<File name='models/staging/stripe/src_stripe.yml'>

```yaml
version: 2
@@ -408,13 +418,16 @@ sources:
tables:
- name: payment
```
</File>

### Add staging models
[Staging models](/best-practices/how-we-structure/2-staging) are the first transformation step in dbt. They clean and prepare your raw data, making it ready for more complex transformations and analyses. Follow these steps to add your staging models to your project.

1. Create the file `models/staging/jaffle_shop/stg_customers.sql`. Or, you can use the **Generate model** button to create a new model file for each source.
1. In the `jaffle_shop` sub-directory, create the file `stg_customers.sql`. Or, you can use the **Generate model** button to create a new model file for each source.
2. Copy the following query into the file and click **Save**.

<File name='models/staging/jaffle_shop/stg_customers.sql'>

```sql
select
id as customer_id,
@@ -423,9 +436,13 @@ sources:
from {{ source('jaffle_shop', 'customers') }}
```

3. Create the file `models/staging/jaffle_shop/stg_orders.sql`
</File>

3. In the same `jaffle_shop` sub-directory, create the file `stg_orders.sql`.
4. Copy the following query into the file and click **Save**.

<File name='models/staging/jaffle_shop/stg_orders.sql'>

```sql
select
id as order_id,
@@ -435,9 +452,13 @@ from {{ source('jaffle_shop', 'customers') }}
from {{ source('jaffle_shop', 'orders') }}
```

5. Create the file `models/staging/stripe/stg_payments.sql`.
</File>

5. In the `stripe` sub-directory, create the file `stg_payments.sql`.
6. Copy the following query into the file and click **Save**.

<File name='models/staging/stripe/stg_payments.sql'>

```sql
select
id as payment_id,
@@ -452,6 +473,8 @@ select
from {{ source('stripe', 'payment') }}
```

</File>

7. Enter `dbt run` in the command prompt at the bottom of the screen. You should get a successful run and see the three models.

### Add business-defined entities
@@ -463,6 +486,8 @@ This phase is the [marts layer](/best-practices/how-we-structure/1-guide-overvie
1. Create the file `models/marts/fct_orders.sql`.
2. Copy the following query into the file and click **Save**.
<File name='models/marts/fct_orders.sql'>
```sql
with orders as (
select * from {{ ref('stg_orders' )}}
@@ -504,9 +529,13 @@ select * from final
```
3. Create the file `models/marts/dim_customers.sql`.
</File>
3. In the `models/marts` directory, create the file `dim_customers.sql`.
4. Copy the following query into the file and click **Save**.
<File name='models/marts/dim_customers.sql'>
```sql
with customers as (
select * from {{ ref('stg_customers')}}
@@ -539,18 +568,26 @@ final as (
select * from final
```
5. Create the file `packages.yml` in your main directory
</File>
5. In your main directory, create the file `packages.yml`.
6. Copy the following text into the file and click **Save**.
<File name='packages.yml'>
```yaml
packages:
  - package: dbt-labs/dbt_utils
    version: 1.1.1
```
7. Create the file `models/metrics/metricflow_time_spine.sql` in your main directory.
</File>
7. In the `models` directory, create the file `metrics/metricflow_time_spine.sql`.
8. Copy the following query into the file and click **Save**.
<File name='models/metrics/metricflow_time_spine.sql'>
```sql
{{
config(
@@ -574,6 +611,8 @@ select * from final
```
</File>
9. Enter `dbt run` in the command prompt at the bottom of the screen. You should get a successful run message and also see in the run details that dbt has successfully built five models.
## Create semantic models
@@ -587,9 +626,11 @@ select * from final
In the following steps, semantic models enable you to define how to interpret the data related to orders. A semantic model includes entities (like ID columns serving as keys for joining data), dimensions (for grouping or filtering data), and measures (for data aggregations).
1. Create a new file `models/metrics/fct_orders.yml`
1. In the `metrics` sub-directory, create a new file `fct_orders.yml`.
2. Add the following code to that newly created file:
<File name='models/metrics/fct_orders.yml'>
```yaml
semantic_models:
- name: orders
@@ -600,6 +641,8 @@ semantic_models:
model: ref('fct_orders')
```
</File>
The following sections explain [dimensions](/docs/build/dimensions), [entities](/docs/build/entities), and [measures](/docs/build/measures) in more detail, showing how they each play a role in semantic models.
- [Entities](#entities) act as unique identifiers (like ID columns) that link data together from different tables.
@@ -612,6 +655,8 @@ The following sections explain [dimensions](/docs/build/dimensions), [entities](
Add entities to your `fct_orders.yml` semantic model file:
<File name='models/metrics/fct_orders.yml'>
```yaml
semantic_models:
- name: orders
@@ -628,12 +673,16 @@ semantic_models:
type: foreign
```
</File>
### Dimensions
[Dimensions](/docs/build/semantic-models#dimensions) are a way to group or filter information based on categories or time.
Add dimensions to your `fct_orders.yml` semantic model file:
<File name='models/metrics/fct_orders.yml'>
```yaml
semantic_models:
- name: orders
@@ -655,12 +704,16 @@ semantic_models:
time_granularity: day
```
</File>
### Measures
[Measures](/docs/build/semantic-models#measures) are aggregations performed on columns in your model. Often, you’ll find yourself using them as final metrics themselves. Measures can also serve as building blocks for more complicated metrics.
Add measures to your `fct_orders.yml` semantic model file:
<File name='models/metrics/fct_orders.yml'>
```yaml
semantic_models:
- name: orders
@@ -701,6 +754,8 @@ semantic_models:
use_approximate_percentile: False
```
</File>
## Define metrics
[Metrics](/docs/build/metrics-overview) are the language your business users speak and how they measure business performance. They are an aggregation over a column in your warehouse that you enrich with dimensional cuts.
@@ -717,6 +772,8 @@ Once you've created your semantic models, it's time to start referencing those m
Add metrics to your `fct_orders.yml` semantic model file:
<File name='models/metrics/fct_orders.yml'>
```yaml
semantic_models:
- name: orders
@@ -805,6 +862,8 @@ metrics:
- name: order_count
```
</File>
## Add second semantic model to your project
Great job, you've successfully built your first semantic model! It has all the required elements: entities, dimensions, measures, and metrics.
@@ -813,9 +872,11 @@ Let’s expand your project's analytical capabilities by adding another semantic
After setting up your orders model:
1. Create the file `models/metrics/dim_customers.yml`.
1. In the `metrics` sub-directory, create the file `dim_customers.yml`.
2. Copy the following query into the file and click **Save**.
<File name='models/metrics/dim_customers.yml'>
```yaml
semantic_models:
- name: customers
@@ -862,6 +923,9 @@ metrics:
measure: customers
```
</File>
This semantic model uses simple metrics to focus on customer metrics and emphasizes customer dimensions like name, type, and order dates. It uniquely analyzes customer behavior, lifetime value, and order patterns.
## Test and query metrics
10 changes: 5 additions & 5 deletions website/snippets/_sl-test-and-query-metrics.md
@@ -1,16 +1,16 @@
To work with metrics in dbt, you have several tools to validate or run commands. Here's how you can test and query metrics depending on your setup:

- [**dbt Cloud IDE users**](#dbt-cloud-ide-users) &mdash; Currently, running MetricFlow commands directly in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) isn't supported, but is coming soon. You can still validate metrics using the **Preview** or **Compile** options, or visually through the DAG for semantic checks. This ensures your metrics are correctly defined without directly running commands.
- [**dbt Cloud CLI users**](#dbt-cloud-cli-users) &mdash; The [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) enables you to run [MetricFlow commands](/docs/build/metricflow-commands#metricflow-commands) for direct interaction with metrics.
- **dbt Core users** &mdash; Use the MetricFlow CLI for command execution. While this guide focuses on dbt Cloud users, dbt Core users can find detailed MetricFlow CLI setup instructions in the [MetricFlow commands](/docs/build/metricflow-commands#metricflow-commands) page. Note that to use the dbt Semantic Layer, you need to have a Team or Enterprise account.
- [**dbt Cloud IDE users**](#dbt-cloud-ide-users) &mdash; Currently, running MetricFlow commands directly in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) isn't supported, but is coming soon. You can view metrics visually through the DAG in the **Lineage** tab without directly running commands.
- [**dbt Cloud CLI users**](#dbt-cloud-cli-users) &mdash; The [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) enables you to run [MetricFlow commands](/docs/build/metricflow-commands#metricflow-commands) to query and preview metrics directly in your command line interface.
- **dbt Core users** &mdash; Use the MetricFlow CLI for command execution. While this guide focuses on dbt Cloud users, dbt Core users can find detailed MetricFlow CLI setup instructions in the [MetricFlow commands](/docs/build/metricflow-commands#metricflow-commands) page. Note that to use the dbt Semantic Layer, you need to have a [Team or Enterprise account](https://www.getdbt.com/).

Alternatively, you can run commands with SQL client tools like DataGrip, DBeaver, or RazorSQL.
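
For the command-line options, here's a minimal sketch of what querying a metric from this quickstart might look like; the metric name (`order_count`) and the `metric_time` dimension are example names taken from the guide's project, not required values:

```shell
# dbt Cloud CLI
dbt sl list dimensions --metrics order_count
dbt sl query --metrics order_count --group-by metric_time

# dbt Core with the MetricFlow CLI
# (order_count and metric_time are example names from this quickstart)
mf query --metrics order_count --group-by metric_time
```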

### dbt Cloud IDE users

You can validate your metrics in the dbt Cloud IDE by selecting the metric you want to validate and viewing it in the **Lineage** tab.
You can view your metrics in the dbt Cloud IDE's **Lineage** tab. The dbt Cloud IDE **Status button** (located in the bottom right of the editor) displays an **Error** status if there's an error in your metric or semantic model definition. You can click the button to see the specific issue and resolve it.

Once validated, make sure you commit and merge your changes in your project.
Once you've reviewed your metrics, make sure you commit and merge your changes in your project.

<Lightbox src="/img/docs/dbt-cloud/semantic-layer/sl-ide-dag.jpg" title="Validate your metrics using the Lineage tab in the IDE." />
