
Commit

Merge branch 'current' into entity-name
matthewshaver authored Nov 30, 2023
2 parents a3a45d3 + b5c27c9 commit c959921
Showing 55 changed files with 458 additions and 314 deletions.
8 changes: 4 additions & 4 deletions website/docs/docs/about-setup.md
@@ -21,14 +21,14 @@ To begin configuring dbt now, select the option that is right for you.

<Card
title="dbt Cloud setup"
body="Learn how to connect to a data platform, integrate with secure authentication methods, configure a sync with a git repo, how to use the IDE, and how to install the dbt Cloud CLI."
body="Learn how to connect to a data platform, integrate with secure authentication methods, and configure a sync with a git repo."
link="/docs/cloud/about-cloud-setup"
icon="dbt-bit"/>

<Card
title="dbt Core installation"
body="Learn how to connect install dbt Core using Pip, Homebrew, Docker, or the open source repo."
link="/docs/core/installation"
title="dbt Core setup"
body="Learn about dbt Core and how to setup data platform connections."
link="/docs/core/about-core-setup"
icon="dbt-bit"/>

</div>
2 changes: 1 addition & 1 deletion website/docs/docs/build/build-metrics-intro.md
@@ -14,7 +14,7 @@ Use MetricFlow in dbt to centrally define your metrics. As a key component of th

MetricFlow allows you to:
- Intuitively define metrics in your dbt project
- Develop from your preferred environment, whether that's the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation), [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud), or [dbt Core](/docs/core/installation)
- Develop from your preferred environment, whether that's the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation), [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud), or [dbt Core](/docs/core/installation-overview)
- Use [MetricFlow commands](/docs/build/metricflow-commands) to query and test those metrics in your development environment
- Harness the true magic of the universal dbt Semantic Layer and dynamically query these metrics in downstream tools (Available for dbt Cloud [Team or Enterprise](https://www.getdbt.com/pricing/) accounts only).

5 changes: 1 addition & 4 deletions website/docs/docs/build/cumulative-metrics.md
@@ -38,10 +38,7 @@ metrics:

## Limitations
Cumulative metrics are currently under active development and have the following limitations:

1. You can only use the [`metric_time` dimension](/docs/build/dimensions#time) to check cumulative metrics. If you don't use `metric_time` in the query, the cumulative metric will return incorrect results because it won't perform the time spine join. This means you cannot reference time dimensions other than the `metric_time` in the query.
2. If you use `metric_time` in your query filter but don't include "start_time" and "end_time," cumulative metrics will left-censor the input data. For example, if you query a cumulative metric with a 7-day window with the filter `{{ TimeDimension('metric_time') }} BETWEEN '2023-08-15' AND '2023-08-30' `, the values for `2023-08-15` to `2023-08-20` return missing or incomplete data. This is because we apply the `metric_time` filter to the aggregation input. To avoid this, you must use `start_time` and `end_time` in the query filter.

- You are required to use the [`metric_time` dimension](/docs/build/dimensions#time) when querying cumulative metrics. If you don't use `metric_time` in the query, the cumulative metric will return incorrect results because it won't perform the time spine join. This means you cannot reference time dimensions other than `metric_time` in the query.
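
For reference, a minimal sketch of a cumulative metric with a 7-day window; the metric and measure names here are illustrative, and the exact spec should be confirmed against the cumulative metrics example below:

```yaml
metrics:
  - name: cumulative_order_total_l7d     # illustrative metric name
    label: Cumulative order total (last 7 days)
    description: Rolling 7-day total; query it with the metric_time dimension.
    type: cumulative
    type_params:
      measure: order_total               # illustrative measure name
      window: 7 days
```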

## Cumulative metrics example

2 changes: 1 addition & 1 deletion website/docs/docs/build/metricflow-commands.md
@@ -8,7 +8,7 @@ tags: [Metrics, Semantic Layer]

Once you define metrics in your dbt project, you can query metrics, dimensions, and dimension values, and validate your configs using the MetricFlow commands.

MetricFlow allows you to define and query metrics in your dbt project in the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation), [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud), or [dbt Core](/docs/core/installation). To experience the power of the universal [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) and dynamically query those metrics in downstream tools, you'll need a dbt Cloud [Team or Enterprise](https://www.getdbt.com/pricing/) account.
MetricFlow allows you to define and query metrics in your dbt project in the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation), [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud), or [dbt Core](/docs/core/installation-overview). To experience the power of the universal [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) and dynamically query those metrics in downstream tools, you'll need a dbt Cloud [Team or Enterprise](https://www.getdbt.com/pricing/) account.

MetricFlow is compatible with Python versions 3.8, 3.9, 3.10, and 3.11.

10 changes: 2 additions & 8 deletions website/docs/docs/build/packages.md
@@ -25,15 +25,9 @@ dbt _packages_ are in fact standalone dbt projects, with models and macros that
* It's important to note that defining and installing dbt packages is different from [defining and installing Python packages](/docs/build/python-models#using-pypi-packages)


:::info `dependencies.yml` has replaced `packages.yml`
Starting from dbt v1.6, `dependencies.yml` has replaced `packages.yml`. This file can now contain both types of dependencies: "package" and "project" dependencies.
- "Package" dependencies lets you add source code from someone else's dbt project into your own, like a library.
- "Project" dependencies provide a different way to build on top of someone else's work in dbt. Refer to [Project dependencies](/docs/collaborate/govern/project-dependencies) for more info.
You can rename `packages.yml` to `dependencies.yml`, _unless_ you need to use Jinja within your packages specification. This could be necessary, for example, if you want to add an environment variable with a git token in a private git package specification.

:::
import UseCaseInfo from '/snippets/_packages_or_dependencies.md';

<UseCaseInfo/>

## How do I add a package to my project?
1. Add a file named <VersionBlock firstVersion="1.6"> `dependencies.yml` or </VersionBlock> `packages.yml` to your dbt project. This should be at the same level as your `dbt_project.yml` file.
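
To illustrate step 1, a minimal sketch of a `packages.yml` (renameable to `dependencies.yml` if it contains no Jinja); the hub package pin and the private git URL are examples only:

```yaml
packages:
  # Hub package pinned to a specific version
  - package: dbt-labs/dbt_utils
    version: 1.1.1

  # Private git package using an environment-variable token -- Jinja like this
  # is only supported in packages.yml, not dependencies.yml (hypothetical repo URL)
  - git: "https://{{ env_var('DBT_ENV_SECRET_GIT_TOKEN') }}@github.com/example-org/private-package.git"
    revision: main
```
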
33 changes: 0 additions & 33 deletions website/docs/docs/cloud/about-cloud-develop.md

This file was deleted.

3 changes: 1 addition & 2 deletions website/docs/docs/cloud/about-cloud-setup.md
@@ -13,14 +13,13 @@ dbt Cloud is the fastest and most reliable way to deploy your dbt jobs. It conta
- Configuring access to [GitHub](/docs/cloud/git/connect-github), [GitLab](/docs/cloud/git/connect-gitlab), or your own [git repo URL](/docs/cloud/git/import-a-project-by-git-url).
- [Managing users and licenses](/docs/cloud/manage-access/seats-and-users)
- [Configuring secure access](/docs/cloud/manage-access/about-user-access)
- Configuring the [dbt Cloud IDE](/docs/cloud/about-cloud-develop)
- Installing and configuring the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation)

These settings are intended for dbt Cloud administrators. If you need a more detailed first-time setup guide for specific data platforms, read our [quickstart guides](/guides).

If you want a more in-depth learning experience, we recommend taking the dbt Fundamentals course on our [dbt Learn online courses site](https://courses.getdbt.com/).

## Prerequisites

- To set up dbt Cloud, you'll need to have a dbt Cloud account with administrator access. If you still need to create a dbt Cloud account, [sign up today](https://getdbt.com) on our North American servers or [contact us](https://getdbt.com/contact) for international options.
- To have the best experience using dbt Cloud, we recommend you use modern and up-to-date web browsers like Chrome, Safari, Edge, and Firefox.

30 changes: 30 additions & 0 deletions website/docs/docs/cloud/about-develop-dbt.md
@@ -0,0 +1,30 @@
---
title: About developing in dbt
id: about-develop-dbt
description: "Learn how to develop your dbt projects using dbt Cloud."
sidebar_label: "About developing in dbt"
pagination_next: "docs/cloud/about-cloud-develop-defer"
hide_table_of_contents: true
---

Develop dbt projects using dbt Cloud, which offers a fast and reliable way to work on your dbt project. It runs dbt Core in a hosted (single or multi-tenant) environment.

You can develop in your browser using an integrated development environment (IDE) or in a dbt Cloud-powered command line interface (CLI).

<div className="grid--2-col" >

<Card
title="dbt Cloud CLI"
body="Allows you to develop and run dbt commands from your local command line or code editor against your dbt Cloud development environment."
link="/docs/cloud/cloud-cli-installation"
icon="dbt-bit"/>

<Card
title="dbt Cloud IDE"
body="Develop directly in your browser, making dbt project development efficient by compiling code into SQL and managing project changes seamlessly using an intuitive user interface."
link="/docs/cloud/dbt-cloud-ide/develop-in-the-cloud"
icon="dbt-bit"/>

</div><br />

To get started with dbt development, you'll need a [dbt Cloud](https://www.getdbt.com/signup) account and developer seat. For a more comprehensive guide about developing in dbt, refer to our [quickstart guides](/guides).
website/docs/docs/cloud/connect-data-platform/about-connections.md
@@ -3,14 +3,15 @@ title: "About data platform connections"
id: about-connections
description: "Information about data platform connections"
sidebar_label: "About data platform connections"
pagination_next: "docs/cloud/connect-data-platform/connect-starburst-trino"
pagination_next: "docs/cloud/connect-data-platform/connect-microsoft-fabric"
pagination_prev: null
---
dbt Cloud can connect with a variety of data platform providers including:
- [Amazon Redshift](/docs/cloud/connect-data-platform/connect-redshift-postgresql-alloydb)
- [Apache Spark](/docs/cloud/connect-data-platform/connect-apache-spark)
- [Databricks](/docs/cloud/connect-data-platform/connect-databricks)
- [Google BigQuery](/docs/cloud/connect-data-platform/connect-bigquery)
- [Microsoft Fabric](/docs/cloud/connect-data-platform/connect-microsoft-fabric)
- [PostgreSQL](/docs/cloud/connect-data-platform/connect-redshift-postgresql-alloydb)
- [Snowflake](/docs/cloud/connect-data-platform/connect-snowflake)
- [Starburst or Trino](/docs/cloud/connect-data-platform/connect-starburst-trino)
website/docs/docs/cloud/connect-data-platform/connect-microsoft-fabric.md
@@ -0,0 +1,43 @@
---
title: "Connect Microsoft Fabric"
description: "Configure Microsoft Fabric connection."
sidebar_label: "Connect Microsoft Fabric"
---

## Supported authentication methods
The supported authentication methods are:
- Azure Active Directory (Azure AD) service principal
- Azure AD password

SQL password (LDAP) is not supported in Microsoft Fabric Synapse Data Warehouse, so you must use Azure AD. This means that to use [Microsoft Fabric](https://www.microsoft.com/en-us/microsoft-fabric) in dbt Cloud, you will need at least one Azure AD service principal to connect dbt Cloud to Fabric, ideally one service principal for each user.

### Active Directory service principal
The following are the required fields for setting up a connection with Microsoft Fabric using Azure AD service principal authentication.

| Field | Description |
| --- | --- |
| **Server** | The service principal's **host** value for the Fabric test endpoint. |
| **Port** | The port to connect to Microsoft Fabric. You can use `1433` (the default), which is the standard SQL server port number. |
| **Database** | The service principal's **database** value for the Fabric test endpoint. |
| **Authentication** | Choose **Service Principal** from the dropdown. |
| **Tenant ID** | The service principal's **Directory (tenant) ID**. |
| **Client ID** | The service principal's **application (client) ID**. |
| **Client secret** | The service principal's **client secret** (not the **client secret id**). |


### Active Directory password

The following are the required fields for setting up a connection with Microsoft Fabric using Azure AD password authentication.

| Field | Description |
| --- | --- |
| **Server** | The server hostname to connect to Microsoft Fabric. |
| **Port** | The server port. You can use `1433` (the default), which is the standard SQL server port number. |
| **Database** | The database name. |
| **Authentication** | Choose **Active Directory Password** from the dropdown. |
| **User** | The AD username. |
| **Password** | The AD username's password. |

## Configuration

To learn how to optimize performance with data platform-specific configurations in dbt Cloud, refer to [Microsoft Fabric DWH configurations](/reference/resource-configs/fabric-configs).
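
For comparison, roughly the same connection fields appear when configuring the `dbt-fabric` adapter in dbt Core's `profiles.yml`. The following is an unverified sketch; the driver string, field names, and authentication values are assumptions to check against the adapter documentation:

```yaml
# Assumed dbt-fabric profile shape -- verify field names against the adapter docs.
fabric_project:
  target: dev
  outputs:
    dev:
      type: fabric
      driver: "ODBC Driver 18 for SQL Server"               # assumed driver name
      server: example.datawarehouse.fabric.microsoft.com    # hypothetical host
      port: 1433
      database: analytics
      schema: dbo
      authentication: ServicePrincipal                      # or ActiveDirectoryPassword
      tenant_id: "<directory-tenant-id>"
      client_id: "<application-client-id>"
      client_secret: "<client-secret-value>"
```
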
37 changes: 0 additions & 37 deletions website/docs/docs/cloud/dbt-cloud-ide/dbt-cloud-ide.md

This file was deleted.

46 changes: 46 additions & 0 deletions website/docs/docs/collaborate/explore-multiple-projects.md
@@ -0,0 +1,46 @@
---
title: "Explore multiple projects"
sidebar_label: "Explore multiple projects"
description: "Learn about project-level lineage in dbt Explorer and its uses."
pagination_next: null
---

You can also view all the different projects and public models in the account, where the public models are defined, and how they are used, to gain a better understanding of your cross-project resources.

The resource-level lineage graph for a given project displays the cross-project relationships in the DAG. The different icons indicate whether you’re looking at an upstream producer project (parent) or a downstream consumer project (child).

When you view an upstream (parent) project, its public models display a counter icon in the upper right corner indicating how many downstream (child) projects depend on them. Selecting a model reveals the lineage indicating the projects dependent on that model. These counts include all projects listing the upstream one as a dependency in their `dependencies.yml`, even without a direct `{{ ref() }}`. Selecting a project node from a public model opens its detailed lineage graph, which is subject to your [permission](/docs/cloud/manage-access/enterprise-permissions).
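
A minimal sketch of such a dependency declaration, with hypothetical project and model names:

```yaml
# dependencies.yml in the downstream (child) project
projects:
  - name: jaffle_finance   # hypothetical upstream (parent) project

# A model in the child project can then optionally reference a public model, e.g.:
#   select * from {{ ref('jaffle_finance', 'monthly_revenue') }}
```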

<Lightbox src="/img/docs/collaborate/dbt-explorer/cross-project-lineage-parent.png" width="80%" height="100" title="Cross-project lineage in a parent project"/>

When viewing a downstream (child) project that imports and refs public models from upstream (parent) projects, public models will show up in the lineage graph and display an icon on the graph edge that indicates what the relationship is to a model from another project. Hovering over this icon shows the specific dbt Cloud project that produces that model. Double-clicking on a model from another project opens the resource-level lineage graph of the parent project, which is subject to your permissions.


<Lightbox src="/img/docs/collaborate/dbt-explorer/cross-project-lineage-child.png" width="85%" height="100" title="Cross-project lineage in a child project"/>

## Explore the project-level lineage graph

For cross-project collaboration, you can interact with the DAG in all the same ways as described in [Explore your project's lineage](/docs/collaborate/explore-projects#project-lineage) but you can also interact with it at the project level and view the details.

To get a list view of all the projects, select the account name at the top of the **Explore** page near the navigation bar. This view includes a public model list, project list, and a search bar for project searches. You can also view the project-level lineage graph by clicking the Lineage view icon in the page's upper right corner.

If you have permissions for a project in the account, you can view all public models used across the entire account. However, you can only view full public model details and private models if you have permissions for a project where the models are defined.

From the project-level lineage graph, you can:

- Click the Lineage view icon (in the graph’s upper right corner) to view the cross-project lineage graph.
- Click the List view icon (in the graph’s upper right corner) to view the project list.
- Select a project from the **Projects** tab to switch to that project’s main **Explore** page.
- Select a model from the **Public Models** tab to view the [model’s details page](/docs/collaborate/explore-projects#view-resource-details).
- Perform searches on your projects with the search bar.
- Select a project node in the graph (double-clicking) to switch to that particular project’s lineage graph.

When you select a project node in the graph, a project details panel opens on the graph’s right-hand side where you can:

- View counts of the resources defined in the project.
- View a list of its public models, if any.
- View a list of other projects that use the project, if any.
- Click **Open Project Lineage** to switch to the project’s lineage graph.
- Click the Share icon to copy the project panel link to your clipboard so you can share the graph with someone.

<LoomVideo id='606f02e1cce343eba7e1061d6273ff0a?t=1' />
