
Commit

Merge branch 'ly-docs-fix-cards' of github.com:dbt-labs/docs.getdbt.com into ly-docs-fix-cards
nghi-ly committed Jan 30, 2024
2 parents 8fae638 + 8614221 commit 4a97069
Showing 3 changed files with 243 additions and 35 deletions.
39 changes: 25 additions & 14 deletions website/docs/docs/build/metricflow-commands.md
@@ -25,7 +25,7 @@ Using MetricFlow with dbt Cloud means you won't need to manage versioning &mdash

<TabItem value="cloudcli" label="dbt Cloud CLI">

- MetricFlow commands are embedded in the dbt Cloud CLI. This means you can immediately run them once you install the dbt Cloud CLI and don't need to install MetricFlow separately.
- MetricFlow [commands](#metricflow-commands) are embedded in the dbt Cloud CLI. This means you can immediately run them once you install the dbt Cloud CLI and don't need to install MetricFlow separately.
- You don't need to manage versioning &mdash; your dbt Cloud account will automatically manage the versioning for you.

</TabItem>
@@ -36,21 +36,17 @@ Using MetricFlow with dbt Cloud means you won't need to manage versioning &mdash
You can create metrics using MetricFlow in the dbt Cloud IDE. However, support for running MetricFlow commands in the IDE will be available soon.
:::



</TabItem>

<TabItem value="core" label="dbt Core">

:::tip Use dbt Cloud CLI for semantic layer development

:::info Use dbt Cloud CLI for semantic layer development

Use the dbt Cloud CLI for the experience in defining and querying metrics in your dbt project on dbt Cloud or dbt Core with MetricFlow.
You can use the dbt Cloud CLI to define and query metrics in your dbt project.

A benefit of using dbt Cloud is that you won't need to manage versioning &mdash; your dbt Cloud account will automatically manage the versioning.
:::


You can install [MetricFlow](https://github.com/dbt-labs/metricflow#getting-started) from [PyPI](https://pypi.org/project/dbt-metricflow/). You need to use `pip` to install MetricFlow on Windows or Linux operating systems:

1. Create or activate your virtual environment `python -m venv venv`
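A minimal sketch of this install flow, assuming the `dbt-metricflow` package from the PyPI link above; the adapter extra and the steps beyond this point are illustrative rather than copied from this page:

```bash
# Illustrative MetricFlow install flow (package name from the PyPI link above;
# the adapter extra is an assumption, swap in your own adapter).
python -m venv venv                                # create a virtual environment
source venv/bin/activate                           # activate it (venv\Scripts\activate on Windows)
python -m pip install "dbt-metricflow[snowflake]"  # install MetricFlow with an adapter extra
mf --help                                          # confirm the mf CLI is available
```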
@@ -70,16 +66,16 @@ Something to note, MetricFlow `mf` commands return an error if you have a Metafo
MetricFlow provides the following commands to retrieve metadata and query metrics.

<Tabs>
<TabItem value="cloud" label="Commands for dbt Cloud">
<TabItem value="cloud" label="Commands for dbt Cloud CLI">

Use the `dbt sl` prefix before the command name to execute them in dbt Cloud. For example, to list all metrics, run `dbt sl list metrics`.
You can use the `dbt sl` prefix before the command name to execute them in the dbt Cloud CLI. For example, to list all metrics, run `dbt sl list metrics`.

- [`list`](#list) &mdash; Retrieves metadata values.
- [`list metrics`](#list-metrics) &mdash; Lists metrics with dimensions.
- [`list dimensions`](#list) &mdash; Lists unique dimensions for metrics.
- [`list dimension-values`](#list-dimension-values) &mdash; List dimensions with metrics.
- [`list entities`](#list-entities) &mdash; Lists all unique entities.
- [`query`](#query) &mdash; Query metrics and dimensions you want to see in the command line interface. Refer to [query examples](#query-examples) to help you get started.
- [`query`](#query) &mdash; Query metrics, saved queries, and dimensions you want to see in the command line interface. Refer to [query examples](#query-examples) to help you get started.

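For a quick look at how these commands read in practice, here's an illustrative sketch in the dbt Cloud CLI. The metric and dimension names (`order_total`, `is_food_order`) come from the examples later on this page; treat the exact flags as a sketch and check the command sections below for the authoritative syntax.

```bash
# Illustrative dbt Cloud CLI invocations of the commands listed above.
dbt sl list metrics                                                           # metrics with their dimensions
dbt sl list dimensions --metrics order_total                                  # unique dimensions for a metric
dbt sl list dimension-values --metrics order_total --dimension is_food_order  # values for one dimension
dbt sl list entities --metrics order_total                                    # unique entities for a metric
dbt sl query --metrics order_total --group-by metric_time                     # query a metric grouped by time
```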
<!--below commands aren't supported in dbt cloud yet
- [`validate-configs`](#validate-configs) &mdash; Validates semantic model configurations.
@@ -226,10 +222,11 @@ mf tutorial # In dbt Core

### Query

Create a new query with MetricFlow, execute that query against the user's data platform, and return the result:
Create a new query with MetricFlow and execute it against your data platform:

```bash
dbt sl query --metrics <metric_name> --group-by <dimension_name> # In dbt Cloud
dbt sl query --saved-query <name> # In dbt Cloud CLI

mf query --metrics <metric_name> --group-by <dimension_name> # In dbt Core

@@ -372,7 +369,6 @@ mf query --metrics order_total --group-by metric_time,is_food_order --limit 10 -
You can further filter the data set by adding a `where` clause to your query.
**Query**
```bash
# In dbt Cloud
dbt sl query --metrics order_total --group-by metric_time --where "{{ Dimension('order_id__is_food_order') }} = True"
@@ -406,7 +402,6 @@ To filter by time, there are dedicated start and end time options. Using these o
**Query**
```bash
# In dbt Cloud
dbt sl query --metrics order_total --group-by metric_time,is_food_order --limit 10 --order -metric_time --where "is_food_order = True" --start-time '2017-08-22' --end-time '2017-08-27'
@@ -429,9 +424,25 @@ mf query --metrics order_total --group-by metric_time,is_food_order --limit 10 -
</TabItem>
<TabItem value="eg6" label="Saved queries">
</Tabs>
You can use this for frequently used queries. Replace `<name>` with the name of your [saved query](/docs/build/saved-queries).

**Query**

```bash
dbt sl query --saved-query <name> # In dbt Cloud
mf query --saved-query <name> # In dbt Core
```

For example, if you use dbt Cloud and have a saved query named `new_customer_orders`, you would run `dbt sl query --saved-query new_customer_orders`.

:::info A note on querying saved queries
When querying [saved queries](/docs/build/saved-queries), you can use parameters such as `where`, `limit`, `order`, `compile`, and so on. However, keep in mind that you can't access `metric` or `group_by` parameters in this context. This is because they are predetermined and fixed parameters for saved queries, and you can't change them at query time. If you would like to query more metrics or dimensions, you can build the query using the standard format.
:::
</TabItem>
</Tabs>
### Additional query examples
60 changes: 60 additions & 0 deletions website/docs/docs/dbt-cloud-apis/sl-graphql.md
@@ -217,6 +217,31 @@ Dimension {
DimensionType = [CATEGORICAL, TIME]
```

**List saved queries**

```graphql
{
savedQueries(environmentId: 200532) {
name
description
label
queryParams {
metrics {
name
}
groupBy {
name
grain
datePart
}
where {
whereSqlTemplate
}
}
}
}
```

### Querying

When querying for data, _either_ a `groupBy` _or_ a `metrics` selection is required.
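As a hedged illustration of that rule, the sketch below sends a `createQuery` mutation with only a `metrics` selection over HTTP. The endpoint URL and service token are placeholders you supply yourself, and the `metrics: [{name: ...}]` input shape is an assumption modeled on the other `createQuery` examples in these docs, not something confirmed on this page.

```bash
# Hypothetical call to the Semantic Layer GraphQL API; $SL_GRAPHQL_URL and
# $DBT_CLOUD_SERVICE_TOKEN are placeholders, and the environment ID reuses the
# example value shown elsewhere on this page.
curl -s "$SL_GRAPHQL_URL" \
  -H "Authorization: Bearer $DBT_CLOUD_SERVICE_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"query": "mutation { createQuery(environmentId: 200532, metrics: [{name: \"order_total\"}]) { queryId } }"}'
```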
@@ -576,3 +601,38 @@ mutation {
}
}
```

**Compile SQL with saved queries**

This query includes the field `savedQuery` and generates the SQL based on a predefined [saved query](/docs/build/saved-queries), rather than dynamically building it from a list of metrics and groupings. You can use this for frequently used queries.

```graphql
mutation {
compileSql(
environmentId: 200532
savedQuery: "new_customer_orders" # new field
) {
queryId
sql
}
}
```

:::info A note on querying saved queries
When querying [saved queries](/docs/build/saved-queries), you can use parameters such as `where`, `limit`, `order`, `compile`, and so on. However, keep in mind that you can't access `metric` or `group_by` parameters in this context. This is because they are predetermined and fixed parameters for saved queries, and you can't change them at query time. If you would like to query more metrics or dimensions, you can build the query using the standard format.
:::

**Create query with saved queries**

This takes the same inputs as the `createQuery` mutation, but includes the field `savedQuery`. You can use this for frequently used queries.

```graphql
mutation {
createQuery(
environmentId: 200532
savedQuery: "new_customer_orders" # new field
) {
queryId
}
}
```