Merge branch 'current' into nfiann-constraints-foreignkey
mirnawong1 authored Sep 30, 2024
2 parents a7b7291 + 5b9a904 commit c3be91a
Showing 35 changed files with 161 additions and 152 deletions.
23 changes: 20 additions & 3 deletions website/docs/docs/build/environment-variables.md
@@ -139,10 +139,14 @@ _The following variables are currently only available for GitHub, GitLab, and Azure DevOps_

Environment variables can be used in many ways, and they give you the power and flexibility to accomplish tasks more easily in dbt Cloud.

#### Clone private packages
<Expandable alt_header="Clone private packages">

Now that you can set secrets as environment variables, you can pass git tokens into your package HTTPS URLs to allow for on-the-fly cloning of private repositories. Read more about enabling [private package cloning](/docs/build/packages#private-packages).
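
For illustration, a `packages.yml` sketch along these lines pulls a private package over HTTPS (the repository URL and the `DBT_ENV_SECRET_GIT_TOKEN` name are placeholders; secret environment variables use the `DBT_ENV_SECRET` prefix):

```yaml
# packages.yml -- illustrative sketch; the repo URL and secret name are placeholders
packages:
  - git: "https://{{env_var('DBT_ENV_SECRET_GIT_TOKEN')}}@github.com/your-org/your-private-package.git"
    revision: "1.0.0"
```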

#### Dynamically set your warehouse in your Snowflake connection
</Expandable>

<Expandable alt_header="Dynamically set your warehouse in your Snowflake connection">

Environment variables make it possible to dynamically change the Snowflake virtual warehouse size depending on the job. Instead of calling the warehouse name directly in your project connection, you can reference an environment variable, which will be set to a specific virtual warehouse at runtime.

For example, suppose you'd like to run a full-refresh job in an XL warehouse, but your incremental job only needs to run in a medium-sized warehouse. Both jobs are configured in the same dbt Cloud environment. In your connection configuration, you can use an environment variable to set the warehouse name to `{{env_var('DBT_WAREHOUSE')}}`. Then in the job settings, you can set a different value for the `DBT_WAREHOUSE` environment variable depending on the job's workload.
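
Connections and jobs are configured in the dbt Cloud UI rather than in YAML, so the following is only a schematic sketch of the values involved (warehouse and job names are placeholders):

```yaml
# Schematic sketch only -- these values are entered in the dbt Cloud UI.
# Connection setting, rendered at runtime:
warehouse: "{{ env_var('DBT_WAREHOUSE') }}"

# Per-job environment variable overrides:
full_refresh_job:
  DBT_WAREHOUSE: TRANSFORMING_XL   # full refreshes get the XL warehouse
incremental_job:
  DBT_WAREHOUSE: TRANSFORMING_M    # incremental runs use a medium warehouse
```
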
@@ -163,7 +167,10 @@ However, there are some limitations when using env vars with Snowflake OAuth Connection
Note that if you supply an environment variable in the account/host field, a Snowflake OAuth connection will **fail** to connect. This happens because the field doesn't pass through Jinja rendering, so dbt Cloud passes the literal `env_var` code into a URL string like `{{ env_var("DBT_ACCOUNT_HOST_NAME") }}.snowflakecomputing.com`, which is an invalid hostname. Use [extended attributes](/docs/deploy/deploy-environments#deployment-credentials) instead.
:::
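
As a sketch, the extended attributes YAML for the environment might pin the account directly (the locator below is a placeholder):

```yaml
# Extended attributes sketch -- supply the literal value here rather than an
# env_var() reference in the account/host field. Placeholder account locator:
account: ab12345.us-east-1
```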

#### Audit your run metadata
</Expandable>

<Expandable alt_header="Audit your run metadata">

Here's another motivating example that uses the dbt Cloud run ID, which is set automatically at each run. This additional data field can be used for auditing and debugging:

```sql
select *,
    -- illustrative reconstruction of a line elided in this view: tag each row
    -- with the run ID that dbt Cloud sets automatically at each run
    '{{ env_var("DBT_CLOUD_RUN_ID", "manual") }}' as run_id

from users_aggregated
```

</Expandable>

<Expandable alt_header="Configure Semantic Layer credentials">

import SLEnvVars from '/snippets/_sl-env-vars.md';

<SLEnvVars/>

</Expandable>
89 changes: 39 additions & 50 deletions website/docs/docs/build/metricflow-commands.md
@@ -8,7 +8,7 @@ tags: [Metrics, Semantic Layer]

Once you define metrics in your dbt project, you can query metrics, dimensions, and dimension values, and validate your configs using the MetricFlow commands.

MetricFlow allows you to define and query metrics in your dbt project in the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation), [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud), or [dbt Core](/docs/core/installation-overview). To experience the power of the universal [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) and dynamically query those metrics in downstream tools, you'll need a dbt Cloud [Team or Enterprise](https://www.getdbt.com/pricing/) account.
MetricFlow allows you to define and query metrics in your dbt project in [dbt Cloud](/docs/cloud/about-develop-dbt) or [dbt Core](/docs/core/installation-overview). To experience the power of the universal [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) and dynamically query those metrics in downstream tools, you'll need a dbt Cloud [Team or Enterprise](https://www.getdbt.com/pricing/) account.

MetricFlow is compatible with Python versions 3.8, 3.9, 3.10, and 3.11.

@@ -18,33 +18,18 @@ MetricFlow is a dbt package that allows you to define and query metrics in your dbt project

Using MetricFlow with dbt Cloud means you won't need to manage versioning &mdash; your dbt Cloud account will automatically manage the versioning.

**dbt Cloud jobs** &mdash; MetricFlow commands aren't supported in dbt Cloud jobs yet. However, you can add MetricFlow validations with your git provider (such as GitHub Actions) by installing MetricFlow (`python -m pip install metricflow`). This allows you to run MetricFlow commands as part of your continuous integration checks on PRs.
dbt Cloud jobs support the `dbt sl validate` command to [automatically test your semantic nodes](/docs/deploy/ci-jobs#semantic-validations-in-ci). You can also add MetricFlow validations to your git provider's CI checks (such as GitHub Actions) by installing MetricFlow (`python -m pip install metricflow`). This allows you to run MetricFlow commands as part of your continuous integration checks on PRs.
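
For example, a GitHub Actions job along these lines could run MetricFlow validations on each pull request (a sketch; action versions, the Python version, the adapter, and credential handling are assumptions):

```yaml
# .github/workflows/metricflow-checks.yml -- illustrative sketch
name: metricflow-checks
on: pull_request

jobs:
  validate-semantic-models:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      # Install MetricFlow alongside the adapter your project uses (assumed here)
      - run: python -m pip install metricflow dbt-snowflake
      # Validate semantic model configs; assumes warehouse credentials are
      # exposed to the runner as secrets or environment variables
      - run: mf validate-configs
```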

<Tabs>

<TabItem value="cloudcli" label="dbt Cloud CLI">
<TabItem value="cloud" label="MetricFlow with dbt Cloud">

- MetricFlow [commands](#metricflow-commands) are embedded in the dbt Cloud CLI. This means you can immediately run them once you install the dbt Cloud CLI and don't need to install MetricFlow separately.
- You don't need to manage versioning &mdash; your dbt Cloud account will automatically manage the versioning for you.

</TabItem>

<TabItem value="cloud ide" label="dbt Cloud IDE">

:::info
You can create metrics using MetricFlow in the dbt Cloud IDE and run the [dbt sl validate](/docs/build/validation#validations-command) command. Support for running more MetricFlow commands in the IDE will be available soon.
:::
In dbt Cloud, run MetricFlow commands directly in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) or in the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation).

For dbt Cloud CLI users, MetricFlow commands are embedded in the dbt Cloud CLI, so you can run them as soon as you install it, without installing MetricFlow separately. You also don't need to manage versioning because your dbt Cloud account manages it for you.
</TabItem>

<TabItem value="core" label="dbt Core">

:::tip Use dbt Cloud CLI for semantic layer development

You can use the dbt Cloud CLI for a streamlined experience defining and querying metrics in your dbt project.

A benefit of using dbt Cloud is that you won't need to manage versioning &mdash; your dbt Cloud account will automatically manage the versioning.
:::
<TabItem value="core" label="MetricFlow with dbt Core">

You can install [MetricFlow](https://github.com/dbt-labs/metricflow#getting-started) from [PyPI](https://pypi.org/project/dbt-metricflow/). You need to use `pip` to install MetricFlow on Windows or Linux operating systems:

@@ -54,31 +39,37 @@

**Note**, you'll need to manage versioning between dbt Core, your adapter, and MetricFlow.

</TabItem>
Note that MetricFlow `mf` commands return an error if you have a Metafont LaTeX package installed, since the TeX distribution also provides a conflicting `mf` executable. To run `mf` commands, uninstall the package.

</TabItem>
</Tabs>

Something to note, MetricFlow `mf` commands return an error if you have a Metafont latex package installed. To run `mf` commands, uninstall the package.

## MetricFlow commands

MetricFlow provides the following commands to retrieve metadata and query metrics.

<Tabs>
<TabItem value="cloud" label="Commands for dbt Cloud CLI">
<TabItem value="cloudcommands" label="Commands for dbt Cloud">

You can use the `dbt sl` prefix before the command name to execute them in the dbt Cloud CLI. For example, to list all metrics, run `dbt sl list metrics`. For a complete list of the MetricFlow commands and flags, run the `dbt sl --help` command in your terminal.
You can use the `dbt sl` prefix before the command name to execute MetricFlow commands in the dbt Cloud IDE or dbt Cloud CLI. For example, to list all metrics, run `dbt sl list metrics`.

dbt Cloud CLI users can run `dbt sl --help` in the terminal for a complete list of the MetricFlow commands and flags.

The following table lists the commands compatible with the dbt Cloud IDE and dbt Cloud CLI:

| <div style={{width:'250px'}}>Command</div> | <div style={{width:'100px'}}>Description</div> | dbt Cloud IDE | dbt Cloud CLI |
|---------|-------------|---------------|---------------|
| [`list`](#list) | Retrieves metadata values. | ✅ | ✅ |
| [`list metrics`](#list-metrics) | Lists metrics with dimensions. | ✅ | ✅ |
| [`list dimensions`](#list) | Lists unique dimensions for metrics. | ✅ | ✅ |
| [`list dimension-values`](#list-dimension-values) | Lists dimension values with metrics. | ✅ | ✅ |
| [`list entities`](#list-entities) | Lists all unique entities. | ✅ | ✅ |
| [`list saved-queries`](#list-saved-queries) | Lists available saved queries. Use the `--show-exports` flag to display each export listed under a saved query or `--show-parameters` to show the full query parameters each saved query uses. | ✅ | ✅ |
| [`query`](#query) | Queries metrics, saved queries, and dimensions you want to see in the command line interface. Refer to [query examples](#query-examples) to help you get started. | ✅ | ✅ |
| [`validate`](#validate) | Validates semantic model configurations. | ✅ | ✅ |
| [`export`](#export) | Runs exports for a singular saved query for testing and generating exports in your development environment. You can also use the `--select` flag to specify particular exports from a saved query. | ❌ | ✅ |
| [`export-all`](#export-all) | Runs exports for multiple saved queries at once, saving time and effort. | ❌ | ✅ |

- [`list`](#list) &mdash; Retrieves metadata values.
- [`list metrics`](#list-metrics) &mdash; Lists metrics with dimensions.
- [`list dimensions`](#list) &mdash; Lists unique dimensions for metrics.
- [`list dimension-values`](#list-dimension-values) &mdash; List dimensions with metrics.
- [`list entities`](#list-entities) &mdash; Lists all unique entities.
- [`list saved-queries`](#list-saved-queries) &mdash; Lists available saved queries. Use the `--show-exports` flag to display each export listed under a saved query.
- [`query`](#query) &mdash; Query metrics, saved queries, and dimensions you want to see in the command line interface. Refer to [query examples](#query-examples) to help you get started.
- [`export`](#export) &mdash; Runs exports for a singular saved query for testing and generating exports in your development environment. You can also use the `--select` flag to specify particular exports from a saved query.
- [`export-all`](#export-all) &mdash; Runs exports for multiple saved queries at once, saving time and effort.
- [`validate`](#validate) &mdash; Validates semantic model configurations.

<!--below commands aren't supported in dbt cloud yet
- [`health-checks`](#health-checks) &mdash; Performs data platform health check.
@@ -99,7 +90,7 @@ Check out the following video for a short video demo of how to query or preview

</TabItem>

<TabItem value="core" label="Commands for dbt Core">
<TabItem value="corecommands" label="Commands for dbt Core">

Use the `mf` prefix before the command name to execute them in dbt Core. For example, to list all metrics, run `mf list metrics`.

@@ -502,8 +493,6 @@ The following tabs present additional query examples, like exporting to a CSV.
<Tabs>
<TabItem value="eg6" label="--compile/--explain flag">
Add `--compile` (or `--explain` for dbt Core users) to your query to view the SQL generated by MetricFlow.
@@ -522,24 +511,24 @@ mf query --metrics order_total --group-by metric_time,is_food_order --limit 10 -
```bash
✔ Success 🦄 - query completed after 0.28 seconds
🔎 SQL (remove --compile to see data or add --show-dataflow-plan to see the generated dataflow plan):
SELECT
select
metric_time
, is_food_order
, SUM(order_cost) AS order_total
FROM (
SELECT
cast(ordered_at as date) AS metric_time
, sum(order_cost) as order_total
from (
select
cast(ordered_at as date) as metric_time
, is_food_order
, order_cost
FROM ANALYTICS.js_dbt_sl_demo.orders orders_src_1
WHERE cast(ordered_at as date) BETWEEN CAST('2017-08-22' AS TIMESTAMP) AND CAST('2017-08-27' AS TIMESTAMP)
from analytics.js_dbt_sl_demo.orders orders_src_1
where cast(ordered_at as date) between cast('2017-08-22' as timestamp) and cast('2017-08-27' as timestamp)
) subq_3
WHERE is_food_order = True
GROUP BY
where is_food_order = True
group by
metric_time
, is_food_order
ORDER BY metric_time DESC
LIMIT 10
order by metric_time desc
limit 10
```
</TabItem>
10 changes: 7 additions & 3 deletions website/docs/docs/build/metricflow-time-spine.md
@@ -14,9 +14,10 @@ MetricFlow requires you to define a time-spine table as a model-level configuration
- [Conversion metrics](/docs/build/conversion)
- [Slowly Changing Dimensions](/docs/build/dimensions#scd-type-ii)
- [Metrics](/docs/build/metrics-overview) with the `join_to_timespine` configuration set to true

To see the generated SQL for the metric and dimension types that use time-spine joins, refer to the respective documentation or add the `compile=True` flag when querying the Semantic Layer to return the compiled SQL.

#### Configuring time-spine
## Configuring time-spine in YAML
- You only need to configure time-spine models that the Semantic Layer should recognize.
- At a minimum, define a time-spine table for a daily grain.
- You can optionally define a time-spine table for a different granularity, like hourly.
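
For instance, a minimal sketch of the model-level YAML (the model and column names are assumptions; the keys follow the configuration this section describes):

```yaml
# models/_models.yml -- illustrative sketch of a daily time-spine configuration
models:
  - name: metricflow_time_spine
    time_spine:
      standard_granularity_column: date_day   # the validated date column
    columns:
      - name: date_day
        granularity: day
```
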
Expand Down Expand Up @@ -66,6 +67,9 @@ The example creates a time spine at a daily grain and an hourly grain. A few thi
* You can add a time spine for each granularity you intend to use if query efficiency is more important to you than configuration time or storage constraints. For most engines, the query performance difference should be minimal, and transforming your time spine to a coarser grain at query time shouldn't add significant overhead to your queries.
* We recommend having a time spine at the finest grain used in any of your dimensions to avoid unexpected errors. For example, if you have dimensions at an hourly grain, you should have a time spine at an hourly grain.

## Example time-spine tables

### Daily
<File name="metricflow_time_spine.sql">

<VersionBlock lastVersion="1.6">
@@ -134,7 +138,7 @@ and date_hour < dateadd(day, 30, current_timestamp())
```
</VersionBlock>


### Daily (BigQuery)
Use this model if you're using BigQuery. BigQuery supports `DATE()` instead of `TO_DATE()`:
<VersionBlock lastVersion="1.6">

@@ -197,7 +201,7 @@

</File>

## Hourly time spine
### Hourly
<File name='time_spine_hourly.sql'>

```sql
4 changes: 1 addition & 3 deletions website/docs/docs/cloud-integrations/semantic-layer/excel.md
@@ -6,8 +6,6 @@ tags: [Semantic Layer]
sidebar_label: "Microsoft Excel"
---

# Microsoft Excel <Lifecycle status='preview'/>

The dbt Semantic Layer offers a seamless integration with Excel Online and Desktop through a custom menu. This add-on allows you to build dbt Semantic Layer queries and return data on your metrics directly within Excel.

## Prerequisites
@@ -25,7 +23,7 @@ import SLCourses from '/snippets/_sl-course.md';

## Installing the add-on

The dbt Semantic Layer Microsoft Excel integration is available to download directly on [Microsoft AppSource](https://appsource.microsoft.com/en-us/marketplace/apps?product=office). You can choose to download this add in for both [Excel Desktop](https://pages.store.office.com/addinsinstallpage.aspx?assetid=WA200007100&rs=en-US&correlationId=4132ecd1-425d-982d-efb4-de94ebc83f26) and [Excel Online](https://pages.store.office.com/addinsinstallpage.aspx?assetid=WA200007100&rs=en-US&correlationid=4132ecd1-425d-982d-efb4-de94ebc83f26&isWac=True)
The dbt Semantic Layer Microsoft Excel integration is available to download directly on [Microsoft AppSource](https://appsource.microsoft.com/en-us/product/office/WA200007100?tab=Overview). You can download this add-on for both [Excel Desktop](https://pages.store.office.com/addinsinstallpage.aspx?assetid=WA200007100&rs=en-US&correlationId=4132ecd1-425d-982d-efb4-de94ebc83f26) and [Excel Online](https://pages.store.office.com/addinsinstallpage.aspx?assetid=WA200007100&rs=en-US&correlationid=4132ecd1-425d-982d-efb4-de94ebc83f26&isWac=True).

1. In Excel, authenticate with your host, dbt Cloud environment ID, and service token.
   - Access your Environment ID, Host, and URLs in your dbt Cloud Semantic Layer settings. Generate a service token in the Semantic Layer settings or API tokens settings.
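
For orientation, the connection details you'll enter look something like this (all values are placeholders; copy your real ones from the Semantic Layer settings):

```yaml
# Placeholders only -- find your actual values in dbt Cloud's Semantic Layer settings
host: semantic-layer.cloud.getdbt.com   # varies by region and deployment type
environment_id: 123456
service_token: dbts_xxxxxxxxxxxxxxxx    # service token generated in dbt Cloud
```
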
1 change: 0 additions & 1 deletion website/docs/docs/cloud/configure-cloud-cli.md
@@ -108,7 +108,6 @@ To set environment variables in the dbt Cloud CLI for your dbt project:
2. Then select **Profile Settings**, then **Credentials**.
3. Click on your project and scroll to the **Environment Variables** section.
4. Click **Edit** on the lower right and then set the user-level environment variables.
- Note, when setting up the [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl), using [environment variables](/docs/build/environment-variables) like `{{env_var('DBT_WAREHOUSE')}}` is not supported. You should use the actual credentials instead.

## Use the dbt Cloud CLI

2 changes: 2 additions & 0 deletions website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md
@@ -101,6 +101,7 @@ Nice job, you're ready to start developing and building models 🎉!
### Considerations
- To improve your experience using dbt Cloud, we suggest that you turn off ad blockers. This is because some project file names, such as `google_adwords.sql`, might resemble ad traffic and trigger ad blockers.
- To preserve performance, there's a file size limitation for repositories over 6 GB. If you have a repo over 6 GB, please contact [dbt Support](mailto:support@getdbt.com) before running dbt Cloud.
- The IDE's idle session timeout is one hour.
- <Expandable alt_header="About the start up process and work retention">

### Start-up process
@@ -127,6 +128,7 @@
- If a model or test fails, dbt Cloud makes it easy for you to view and download the run logs for your dbt invocations to fix the issue.
- Use dbt's [rich model selection syntax](/reference/node-selection/syntax) to [run dbt commands](/reference/dbt-commands) directly within dbt Cloud.
- Starting from dbt v1.6, leverage [environment variables](/docs/build/environment-variables#special-environment-variables) to dynamically use the Git branch name. For example, using the branch name as a prefix for a development schema.
- Run [MetricFlow commands](/docs/build/metricflow-commands) to create and manage metrics in your project with the [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl).

- **Generate your YAML configurations with dbt Assist** <Lifecycle status="beta"/> &mdash; [dbt Assist](/docs/cloud/dbt-assist) is a powerful artificial intelligence (AI) co-pilot feature that helps automate development in dbt Cloud. It generates documentation and tests for your dbt SQL models directly in the dbt Cloud IDE, with a click of a button, and helps you accomplish more in less time. Available for dbt Cloud Enterprise plans.

1 change: 0 additions & 1 deletion website/docs/docs/cloud/manage-access/audit-log.md
@@ -60,7 +60,6 @@ The audit log supports various events for different objects in dbt Cloud.
| Event Name | Event Type | Description |
| -------------------------- | ---------------------------------------- | ------------------------------------------------------ |
| Auth Provider Changed | auth_provider.Changed | Authentication provider settings changed |
| Credential Login Failed | auth.CredentialsLoginFailed | User login via username and password failed |
| Credential Login Succeeded | auth.CredentialsLoginSucceeded | User successfully logged in with username and password |
| SSO Login Failed | auth.SsoLoginFailed | User login via SSO failed |
| SSO Login Succeeded | auth.SsoLoginSucceeded | User successfully logged in via SSO |