
Commit

Merge branch 'current' into fix-airflow-guide
joellabes authored Dec 12, 2023
2 parents b341b69 + 2560cfb commit cb6bd11
Showing 27 changed files with 54 additions and 22 deletions.
Original file line number Diff line number Diff line change
@@ -49,9 +49,9 @@ The **prod** service principal should have “read” access to raw source data,

| | Source Data | Development catalog | Production catalog | Test catalog |
| --- | --- | --- | --- | --- |
| developers | use | use, create table & create view | use or none | none |
| production service principal | use | none | use, create table & create view | none |
| Test service principal | use | none | none | use, create table & create view |
| developers | use | use, create schema, table, & view | use or none | none |
| production service principal | use | none | use, create schema, table & view | none |
| Test service principal | use | none | none | use, create schema, table & view |
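
The added rows above map to catalog- and schema-level grants. A minimal sketch in Unity Catalog SQL, assuming hypothetical catalog names (`dev`, `prod`, `test`) and a hypothetical principal `prod_sp`:

```sql
-- Hypothetical names; adjust catalogs, schemas, and principals to your workspace.
-- "use" on the production catalog:
GRANT USE CATALOG ON CATALOG prod TO `prod_sp`;
-- "create schema" at the catalog level:
GRANT CREATE SCHEMA ON CATALOG prod TO `prod_sp`;
-- "create table & view" are granted per schema (view creation is
-- covered by the CREATE TABLE privilege in Unity Catalog):
GRANT USE SCHEMA, CREATE TABLE ON SCHEMA prod.analytics TO `prod_sp`;
```

The same pattern repeats for the test service principal against the test catalog, and for developers against the development catalog.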


## Next steps
@@ -7,7 +7,7 @@ displayText: Materializations best practices
hoverSnippet: Read this guide to understand the incremental models you can create in dbt.
---

So far we’ve looked at tables and views, which map to the traditional objects in the data warehouse. As mentioned earlier, incremental models are a little different. This where we start to deviate from this pattern with more powerful and complex materializations.
So far we’ve looked at tables and views, which map to the traditional objects in the data warehouse. As mentioned earlier, incremental models are a little different. This is where we start to deviate from this pattern with more powerful and complex materializations.

- 📚 **Incremental models generate tables.** They physically persist the data itself to the warehouse, just piece by piece. What’s different is **how we build that table**.
- 💅 **Only apply our transformations to rows of data with new or updated information**, which maximizes efficiency.
@@ -53,7 +53,7 @@ where
updated_at > (select max(updated_at) from {{ this }})
```

Let’s break down that `where` clause a bit, because this where the action is with incremental models. Stepping through the code **_right-to-left_** we:
Let’s break down that `where` clause a bit, because this is where the action is with incremental models. Stepping through the code **_right-to-left_** we:

1. Get our **cutoff.**
1. Select the `max(updated_at)` timestamp — the **most recent record**
@@ -138,7 +138,7 @@ where
{% endif %}
```

Fantastic! We’ve got a working incremental model. On our first run, when there is no corresponding table in the warehouse, `is_incremental` will evaluate to false and we’ll capture the entire table. On subsequent runs is it will evaluate to true and we’ll apply our filter logic, capturing only the newer data.
Fantastic! We’ve got a working incremental model. On our first run, when there is no corresponding table in the warehouse, `is_incremental` will evaluate to false and we’ll capture the entire table. On subsequent runs it will evaluate to true and we’ll apply our filter logic, capturing only the newer data.
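
Putting the pieces together, a minimal incremental model might look like this (the source name and `updated_at` column are illustrative):

```sql
{{ config(materialized='incremental') }}

select * from {{ ref('stg_orders') }}

{% if is_incremental() %}
-- only rows newer than the latest row already in this table
where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

On the first run the `where` clause is skipped entirely and the full table is built; on every later run only the newer rows are selected and appended.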

### Late arriving facts

1 change: 1 addition & 0 deletions website/docs/community/spotlight/alison-stanton.md
@@ -17,6 +17,7 @@ socialLinks:
link: https://github.com/alison985/
dateCreated: 2023-11-07
hide_table_of_contents: true
communityAward: true
---

## When did you join the dbt community and in what way has it impacted your career?
1 change: 1 addition & 0 deletions website/docs/community/spotlight/bruno-de-lima.md
@@ -20,6 +20,7 @@ socialLinks:
link: https://medium.com/@bruno.szdl
dateCreated: 2023-11-05
hide_table_of_contents: true
communityAward: true
---

## When did you join the dbt community and in what way has it impacted your career?
1 change: 1 addition & 0 deletions website/docs/community/spotlight/dakota-kelley.md
@@ -15,6 +15,7 @@ socialLinks:
link: https://www.linkedin.com/in/dakota-kelley/
dateCreated: 2023-11-08
hide_table_of_contents: true
communityAward: true
---

## When did you join the dbt community and in what way has it impacted your career?
1 change: 1 addition & 0 deletions website/docs/community/spotlight/fabiyi-opeyemi.md
@@ -18,6 +18,7 @@ socialLinks:
link: https://www.linkedin.com/in/opeyemifabiyi/
dateCreated: 2023-11-06
hide_table_of_contents: true
communityAward: true
---

## When did you join the dbt community and in what way has it impacted your career?
1 change: 1 addition & 0 deletions website/docs/community/spotlight/josh-devlin.md
@@ -23,6 +23,7 @@ socialLinks:
link: https://www.linkedin.com/in/josh-devlin/
dateCreated: 2023-11-10
hide_table_of_contents: true
communityAward: true
---

## When did you join the dbt community and in what way has it impacted your career?
1 change: 1 addition & 0 deletions website/docs/community/spotlight/karen-hsieh.md
@@ -24,6 +24,7 @@ socialLinks:
link: https://medium.com/@ijacwei
dateCreated: 2023-11-04
hide_table_of_contents: true
communityAward: true
---

## When did you join the dbt community and in what way has it impacted your career?
1 change: 1 addition & 0 deletions website/docs/community/spotlight/oliver-cramer.md
@@ -16,6 +16,7 @@ socialLinks:
link: https://www.linkedin.com/in/oliver-cramer/
dateCreated: 2023-11-02
hide_table_of_contents: true
communityAward: true
---

## When did you join the dbt community and in what way has it impacted your career?
1 change: 1 addition & 0 deletions website/docs/community/spotlight/sam-debruyn.md
@@ -18,6 +18,7 @@ socialLinks:
link: https://debruyn.dev/
dateCreated: 2023-11-03
hide_table_of_contents: true
communityAward: true
---

## When did you join the dbt community and in what way has it impacted your career?
1 change: 1 addition & 0 deletions website/docs/community/spotlight/stacy-lo.md
@@ -17,6 +17,7 @@ socialLinks:
link: https://www.linkedin.com/in/olycats/
dateCreated: 2023-11-01
hide_table_of_contents: true
communityAward: true
---

## When did you join the dbt community and in what way has it impacted your career?
1 change: 1 addition & 0 deletions website/docs/community/spotlight/sydney-burns.md
@@ -15,6 +15,7 @@ socialLinks:
link: https://www.linkedin.com/in/sydneyeburns/
dateCreated: 2023-11-09
hide_table_of_contents: true
communityAward: true
---

## When did you join the dbt community and in what way has it impacted your career?
5 changes: 5 additions & 0 deletions website/docs/docs/build/metricflow-commands.md
@@ -556,3 +556,8 @@ Keep in mind that modifying your shell configuration files can have an impact on
</details>
<details>
<summary>Why is my query limited to 100 rows in the dbt Cloud CLI?</summary>
The default <code>limit</code> for queries issued from the dbt Cloud CLI is 100 rows. We set this default to prevent returning unnecessarily large data sets as the dbt Cloud CLI is typically used to query the dbt Semantic Layer during the development process, not for production reporting or to access large data sets. For most workflows, you only need to return a subset of the data.<br /><br />
However, you can change this limit if needed by setting the <code>--limit</code> option in your query. For example, to return 1000 rows, you can run <code>dbt sl list metrics --limit 1000</code>.
</details>
@@ -12,7 +12,7 @@ dbt Cloud is [hosted](/docs/cloud/about-cloud/architecture) in multiple regions
| Region | Location | Access URL | IP addresses | Developer plan | Team plan | Enterprise plan |
|--------|----------|------------|--------------|----------------|-----------|-----------------|
| North America multi-tenant [^1] | AWS us-east-1 (N. Virginia) | cloud.getdbt.com | 52.45.144.63 <br /> 54.81.134.249 <br />52.22.161.231 <br />52.3.77.232 <br />3.214.191.130 <br />34.233.79.135 ||||
| North America Cell 1 [^1] | AWS us-east-1 (N. Virginia) | {account prefix}.us1.dbt.com | 52.45.144.63 <br /> 54.81.134.249 <br />52.22.161.231 <br />52.3.77.232 <br />3.214.191.130 <br />34.233.79.135 | |||
| North America Cell 1 [^1] | AWS us-east-1 (N. Virginia) | {account prefix}.us1.dbt.com | 52.45.144.63 <br /> 54.81.134.249 <br />52.22.161.231 <br />52.3.77.232 <br />3.214.191.130 <br />34.233.79.135 | |||
| EMEA [^1] | AWS eu-central-1 (Frankfurt) | emea.dbt.com | 3.123.45.39 <br /> 3.126.140.248 <br /> 3.72.153.148 ||||
| APAC [^1] | AWS ap-southeast-2 (Sydney)| au.dbt.com | 52.65.89.235 <br /> 3.106.40.33 <br /> 13.239.155.206 <br />||||
| Virtual Private dbt or Single tenant | Customized | Customized | Ask [Support](/community/resources/getting-help#dbt-cloud-support) for your IPs ||||
2 changes: 1 addition & 1 deletion website/docs/docs/cloud/cloud-cli-installation.md
@@ -26,7 +26,7 @@ dbt commands are run against dbt Cloud's infrastructure and benefit from:
The dbt Cloud CLI is available in all [deployment regions](/docs/cloud/about-cloud/regions-ip-addresses) and for both multi-tenant and single-tenant accounts (Azure single-tenant not supported at this time).

- Ensure you are using dbt version 1.5 or higher. Refer to [dbt Cloud versions](/docs/dbt-versions/upgrade-core-in-cloud) to upgrade.
- Note that SSH tunneling for [Postgres and Redshift](/docs/cloud/connect-data-platform/connect-redshift-postgresql-alloydb) connections and [Single sign-on (SSO)](/docs/cloud/manage-access/sso-overview) doesn't support the dbt Cloud CLI yet.
- Note that SSH tunneling for [Postgres and Redshift](/docs/cloud/connect-data-platform/connect-redshift-postgresql-alloydb) connections doesn't support the dbt Cloud CLI yet.

## Install dbt Cloud CLI

16 changes: 11 additions & 5 deletions website/docs/docs/cloud/manage-access/audit-log.md
@@ -34,7 +34,7 @@ On the audit log page, you will see a list of various events and their associated

Click the event card to see the details about the activity that triggered the event. This view provides important details, including when it happened and what type of event was triggered. For example, if someone changes the settings for a job, you can use the event details to see which job was changed (type of event: `v1.events.job_definition.Changed`), by whom (person who triggered the event: `actor`), and when (time it was triggered: `created_at_utc`). For types of events and their descriptions, see [Events in audit log](#events-in-audit-log).

The event details provides the key factors of an event:
The event details provide the key factors of an event:

| Name | Description |
| -------------------- | --------------------------------------------- |
@@ -160,16 +160,22 @@ The audit log supports various events for different objects in dbt Cloud. You wi
You can search the audit log to find a specific event or actor, which is limited to the ones listed in [Events in audit log](#events-in-audit-log). The audit log lists historical events from the last 90 days. You can search for an actor or event using the search bar, and then narrow your results using the time window.


<Lightbox src="/img/docs/dbt-cloud/dbt-cloud-enterprise/audit-log-search.png" width="85%" title="Use search bar to find content in the audit log"/>
<Lightbox src="/img/docs/dbt-cloud/dbt-cloud-enterprise/audit-log-search.png" width="95%" title="Use search bar to find content in the audit log"/>


## Exporting logs

You can use the audit log to export all historical audit results for security, compliance, and analysis purposes:

- For events within 90 days &mdash; dbt Cloud will automatically display the 90-day selectable date range. Select **Export Selection** to download a CSV file of all the events that occurred in your organization within 90 days.
- For events beyond 90 days &mdash; Select **Export All**. The Account Admin will receive an email link to download a CSV file of all the events that occurred in your organization.
- **For events within 90 days** &mdash; dbt Cloud will automatically display the 90-day selectable date range. Select **Export Selection** to download a CSV file of all the events that occurred in your organization within 90 days.

<Lightbox src="/img/docs/dbt-cloud/dbt-cloud-enterprise/audit-log-section.jpg" width="85%" title="View audit log export options"/>
- **For events beyond 90 days** &mdash; Select **Export All**. The Account Admin will receive an email link to download a CSV file of all the events that occurred in your organization.

<Lightbox src="/img/docs/dbt-cloud/dbt-cloud-enterprise/audit-log-section.jpg" width="95%" title="View audit log export options"/>

### Azure Single-tenant

For users deployed in [Azure single tenant](/docs/cloud/about-cloud/tenancy), the **Export All** button isn't available. However, you can use these APIs to access all events:

- [Get recent audit log events CSV](/dbt-cloud/api-v3#/operations/Get%20Recent%20Audit%20Log%20Events%20CSV) &mdash; This API returns all events in a single CSV without pagination.
- [List recent audit log events](/dbt-cloud/api-v3#/operations/List%20Recent%20Audit%20Log%20Events) &mdash; This API returns a limited number of events at a time, which means you will need to paginate the results.
2 changes: 1 addition & 1 deletion website/docs/docs/dbt-cloud-apis/schema-discovery-job.mdx
@@ -61,4 +61,4 @@ query JobQueryExample {
### Fields
When querying a `job`, you can use the following fields.

<SchemaTable nodeName="JobNode" />
<SchemaTable nodeName="JobNode" exclude={['metric', 'metrics']} />
3 changes: 2 additions & 1 deletion website/docs/docs/dbt-cloud-apis/schema.jsx
@@ -173,7 +173,7 @@ export const NodeArgsTable = ({ parent, name, useBetaAPI }) => {
)
}

export const SchemaTable = ({ nodeName, useBetaAPI }) => {
export const SchemaTable = ({ nodeName, useBetaAPI, exclude = [] }) => {
const [data, setData] = useState(null)
useEffect(() => {
const fetchData = () => {
@@ -255,6 +255,7 @@ export const SchemaTable = ({ nodeName, useBetaAPI }) => {
</thead>
<tbody>
{data.data.__type.fields.map(function ({ name, description, type }) {
if (exclude.includes(name)) return;
return (
<tr key={name}>
<td><code>{name}</code></td>
2 changes: 1 addition & 1 deletion website/docs/docs/dbt-cloud-apis/service-tokens.md
@@ -51,7 +51,7 @@ Job admin service tokens can authorize requests for viewing, editing, and creati
Member service tokens can authorize requests for viewing and editing resources, triggering runs, and inviting members to the account. Tokens assigned the Member permission set will have the same permissions as a Member user. For more information about Member users, see "[Self-service permissions](/docs/cloud/manage-access/self-service-permissions)".

**Read-only**<br/>
Read-only service tokens can authorize requests for viewing a read-only dashboard, viewing generated documentation, and viewing source freshness reports.
Read-only service tokens can authorize requests for viewing a read-only dashboard, viewing generated documentation, and viewing source freshness reports. This token can access and retrieve account-level information endpoints on the [Admin API](/docs/dbt-cloud-apis/admin-cloud-api) and authorize requests to the [Discovery API](/docs/dbt-cloud-apis/discovery-api).

### Enterprise plans using service account tokens

4 changes: 3 additions & 1 deletion website/docs/docs/dbt-support.md
@@ -17,7 +17,9 @@ If you're developing on the command line (CLI) and have questions or need some h

## dbt Cloud support

The global dbt Support team is available to dbt Cloud customers by email or in-product live chat. We want to help you work through implementing and utilizing dbt Cloud at your organization. Have a question you can't find an answer to in [our docs](https://docs.getdbt.com/) or [the Community Forum](https://discourse.getdbt.com/)? Our Support team is here to `dbt help` you!
The global dbt Support team is available to dbt Cloud customers by [email](mailto:[email protected]) or using the in-product live chat (💬).

We want to help you work through implementing and utilizing dbt Cloud at your organization. Have a question you can't find an answer to in [our docs](https://docs.getdbt.com/) or [the Community Forum](https://discourse.getdbt.com/)? Our Support team is here to `dbt help` you!

- **Enterprise plans** &mdash; Priority [support](#severity-level-for-enterprise-support), options for custom support coverage hours, implementation assistance, dedicated management, and dbt Labs security reviews depending on price point.
- **Developer and Team plans** &mdash; 24x5 support (no service level agreement (SLA); [contact Sales](https://www.getdbt.com/pricing/) for Enterprise plan inquiries).
2 changes: 1 addition & 1 deletion website/docs/guides/bigquery-qs.md
@@ -78,7 +78,7 @@ In order to let dbt connect to your warehouse, you'll need to generate a keyfile
- Click **Next** to create a new service account.
2. Create a service account for your new project from the [Service accounts page](https://console.cloud.google.com/projectselector2/iam-admin/serviceaccounts?supportedpurview=project). For more information, refer to [Create a service account](https://developers.google.com/workspace/guides/create-credentials#create_a_service_account) in the Google Cloud docs. As an example for this guide, you can:
- Type `dbt-user` as the **Service account name**
- From the **Select a role** dropdown, choose **BigQuery Admin** and click **Continue**
- From the **Select a role** dropdown, choose **BigQuery Job User** and **BigQuery Data Editor** roles and click **Continue**
- Leave the **Grant users access to this service account** fields blank
- Click **Done**
3. Create a service account key for your new project from the [Service accounts page](https://console.cloud.google.com/iam-admin/serviceaccounts?walkthrough_id=iam--create-service-account-keys&start_index=1#step_index=1). For more information, refer to [Create a service account key](https://cloud.google.com/iam/docs/creating-managing-service-account-keys#creating) in the Google Cloud docs. When downloading the JSON file, make sure to use a filename you can easily remember. For example, `dbt-user-creds.json`. For security reasons, dbt Labs recommends that you protect this JSON file like you would your identity credentials; for example, don't check the JSON file into your version control software.
2 changes: 1 addition & 1 deletion website/docs/guides/manual-install-qs.md
@@ -16,7 +16,7 @@ When you use dbt Core to work with dbt, you will be editing files locally using

* To use dbt Core, it's important that you know some basics of the Terminal. In particular, you should understand `cd`, `ls` and `pwd` to navigate through the directory structure of your computer easily.
* Install dbt Core using the [installation instructions](/docs/core/installation-overview) for your operating system.
* Complete [Setting up (in BigQuery)](/guides/bigquery?step=2) and [Loading data (BigQuery)](/guides/bigquery?step=3).
* Complete appropriate Setting up and Loading data steps in the Quickstart for dbt Cloud series. For example, for BigQuery, complete [Setting up (in BigQuery)](/guides/bigquery?step=2) and [Loading data (BigQuery)](/guides/bigquery?step=3).
* [Create a GitHub account](https://github.com/join) if you don't already have one.

### Create a starter project
2 changes: 1 addition & 1 deletion website/docs/guides/set-up-your-databricks-dbt-project.md
@@ -62,7 +62,7 @@ Let’s [create a Databricks SQL warehouse](https://docs.databricks.com/sql/admi
5. Click *Create*
6. Configure warehouse permissions to ensure our service principal and developer have the right access.

We are not covering python in this post but if you want to learn more, check out these [docs](https://docs.getdbt.com/docs/build/python-models#specific-data-platforms). Depending on your workload, you may wish to create a larger SQL Warehouse for production workflows while having a smaller development SQL Warehouse (if you’re not using Serverless SQL Warehouses).
We are not covering python in this post but if you want to learn more, check out these [docs](https://docs.getdbt.com/docs/build/python-models#specific-data-platforms). Depending on your workload, you may wish to create a larger SQL Warehouse for production workflows while having a smaller development SQL Warehouse (if you’re not using Serverless SQL Warehouses). As your project grows, you might want to apply [compute per model configurations](/reference/resource-configs/databricks-configs#specifying-the-compute-for-models).

## Configure your dbt project

3 changes: 3 additions & 0 deletions website/docs/reference/artifacts/dbt-artifacts.md
@@ -48,3 +48,6 @@ In the manifest, the `metadata` may also include:
#### Notes:
- The structure of dbt artifacts is canonized by [JSON schemas](https://json-schema.org/), which are hosted at **schemas.getdbt.com**.
- Artifact versions may change in any minor version of dbt (`v1.x.0`). Each artifact is versioned independently.

## Related docs
- [Other artifacts](/reference/artifacts/other-artifacts) files such as `index.html` or `graph_summary.json`.
2 changes: 1 addition & 1 deletion website/docs/reference/artifacts/other-artifacts.md
@@ -21,7 +21,7 @@ This file is used to store a compressed representation of files dbt has parsed.

**Produced by:** commands supporting [node selection](/reference/node-selection/syntax)

Stores the networkx representation of the dbt resource DAG.
Stores the network representation of the dbt resource DAG.

### graph_summary.json

2 changes: 1 addition & 1 deletion website/docs/reference/dbt-jinja-functions/return.md
@@ -1,6 +1,6 @@
---
title: "About return function"
sidebar_variable: "return"
sidebar_label: "return"
id: "return"
description: "Read this guide to understand the return Jinja function in dbt."
---
