Merge branch 'current' into nfiann-versions
nataliefiann authored Nov 11, 2024
2 parents 2daeed5 + 5d24cbf commit a8aefac
Showing 27 changed files with 141 additions and 139 deletions.
15 changes: 0 additions & 15 deletions README.md
@@ -62,18 +62,3 @@ You can click a link available in a Vercel bot PR comment to see and review your

Advisory:
- If you run into a `fatal error: 'vips/vips8' file not found` error when you run `npm install`, you may need to run `brew install vips`. Warning: this one will take a while -- go ahead and grab some coffee!
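
For reference, here's a minimal sketch of the recovery steps on macOS (assuming Homebrew is already installed):

```shell
# Install the libvips library that the npm install step depends on -- this can take a while
brew install vips

# Re-run the package installation once vips is available
npm install
```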

## Running the Cypress tests locally

Method 1: Utilizing the Cypress GUI
1. `cd` into the repo: `cd docs.getdbt.com`
2. `cd` into the `website` subdirectory: `cd website`
3. Install the required node packages: `npm install`
4. Run `npx cypress open` to open the Cypress GUI, and choose `E2E Testing` as the Testing Type, before finally selecting your browser and clicking `Start E2E testing in {browser}`
5. Click on a test and watch it run!

Method 2: Running the Cypress E2E tests headlessly
1. `cd` into the repo: `cd docs.getdbt.com`
2. `cd` into the `website` subdirectory: `cd website`
3. Install the required node packages: `npm install`
4. Run `npx cypress run`
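
Taken together, the two methods come down to a handful of terminal commands. A minimal sketch, assuming you start from the directory that contains the cloned repo:

```shell
# Move into the website subdirectory of the repo
cd docs.getdbt.com/website

# Install the required node packages
npm install

# Method 1: open the Cypress GUI, choose E2E Testing, then pick a browser
npx cypress open

# Method 2: run the E2E tests headlessly
npx cypress run
```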
67 changes: 0 additions & 67 deletions contributing/developer-blog.md

This file was deleted.

2 changes: 1 addition & 1 deletion website/docs/docs/build/incremental-strategy.md
@@ -27,7 +27,7 @@ Click the name of the adapter in the below table for more information about supp
| Data platform adapter | `append` | `merge` | `delete+insert` | `insert_overwrite` | `microbatch` <Lifecycle status="beta"/> |
|-----------------------|:--------:|:-------:|:---------------:|:------------------:|:-------------------:|
| [dbt-postgres](/reference/resource-configs/postgres-configs#incremental-materialization-strategies) |||| ||
| [dbt-redshift](/reference/resource-configs/redshift-configs#incremental-materialization-strategies) |||| | |
| [dbt-redshift](/reference/resource-configs/redshift-configs#incremental-materialization-strategies) |||| | |
| [dbt-bigquery](/reference/resource-configs/bigquery-configs#merge-behavior-incremental-models) | || |||
| [dbt-spark](/reference/resource-configs/spark-configs#incremental-models) ||| |||
| [dbt-databricks](/reference/resource-configs/databricks-configs#incremental-models) ||| || |
27 changes: 3 additions & 24 deletions website/docs/docs/build/snapshots.md
@@ -390,29 +390,6 @@ snapshots:

</VersionBlock>

## Snapshot query best practices

This section outlines some best practices for writing snapshot queries:

- #### Snapshot source data
Your models should then select from these snapshots, treating them like regular data sources. As much as possible, snapshot your source data in its raw form and use downstream models to clean up the data.

- #### Use the `source` function in your query
This helps when understanding <Term id="data-lineage">data lineage</Term> in your project.

- #### Include as many columns as possible
In fact, go for `select *` if performance permits! Even if a column doesn't feel useful at the moment, it might be better to snapshot it in case it becomes useful – after all, you won't be able to recreate the column later.

- #### Avoid joins in your snapshot query
Joins can make it difficult to build a reliable `updated_at` timestamp. Instead, snapshot the two tables separately, and join them in downstream models.

- #### Limit the amount of transformation in your query
If you apply business logic in a snapshot query, and this logic changes in the future, it can be impossible (or, at least, very difficult) to apply the change in logic to your snapshots.

Basically – keep your query as simple as possible! Some reasonable exceptions to these recommendations include:
* Selecting specific columns if the table is wide.
* Doing light transformation to get data into a reasonable shape, for example, unpacking a <Term id="json" /> blob to flatten your source data into columns.
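
Snapshots written along these lines are then built with the dbt CLI. A minimal usage sketch -- the snapshot name `orders_snapshot` is hypothetical:

```shell
# Build every snapshot in the project
dbt snapshot

# Build a single snapshot by node selection ('orders_snapshot' is a hypothetical name)
dbt snapshot --select orders_snapshot
```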

## Snapshot meta-fields

Snapshot <Term id="table">tables</Term> will be created as a clone of your source dataset, plus some additional meta-fields*.
@@ -498,7 +475,9 @@ Snapshot results:

<VersionBlock firstVersion="1.9">

This section is for users on dbt versions 1.8 and earlier. To configure snapshots in versions 1.9 and later, refer to [Configuring snapshots](#configuring-snapshots). The latest versions use an updated snapshot configuration syntax that optimizes performance.
For information about configuring snapshots in dbt versions 1.8 and earlier, select **1.8** from the documentation version picker, and it will appear in this section.

To configure snapshots in versions 1.9 and later, refer to [Configuring snapshots](#configuring-snapshots). The latest versions use an updated snapshot configuration syntax that optimizes performance.

</VersionBlock>

@@ -88,7 +88,7 @@ Please consider the following actions, as the steps you take will depend on the
<Lightbox src="/img/docs/dbt-cloud/cloud-configuring-dbt-cloud/connections-post-rollout-4.png" width="60%" title="Connections de-duplicated"/>

- Normalization
- Undertsand how new connections should be created to avoid local overrides. If you currently use extended attributes to override the warehouse instance in your production environment - you should instead create a new connection for that instance, and wire your production environment to it, removing the need for the local overrides
- Understand how new connections should be created to avoid local overrides. If you currently use extended attributes to override the warehouse instance in your production environment - you should instead create a new connection for that instance, and wire your production environment to it, removing the need for the local overrides
- Create new connections, update relevant environments to target these connections, removing the now-unnecessary local overrides (which may not be all of them!)
- Test the new wiring by triggering jobs or starting IDE sessions

@@ -118,7 +118,7 @@ Once the connection is saved, a public key will be generated and displayed for t
To configure the SSH tunnel in dbt Cloud, you'll need to provide the hostname/IP of your bastion server, a username, and a port of your choosing that dbt Cloud will connect to. Review the following steps:

- Verify the bastion server has its network security rules set up to accept connections from the [dbt Cloud IP addresses](/docs/cloud/about-cloud/access-regions-ip-addresses) on whatever port you configured.
- Set up the user account by using the bastion server instance's CLI. The following example uses the username `dbtcloud:`
- Set up the user account by using the bastion server instance's CLI. The following example uses the username `dbtcloud`:

```shell
sudo groupadd dbtcloud
61 changes: 57 additions & 4 deletions website/docs/docs/cloud/manage-access/audit-log.md
@@ -62,7 +62,7 @@ The audit log supports various events for different objects in dbt Cloud. You wi
| Auth Provider Changed | auth_provider.Changed | Authentication provider settings changed |
| Credential Login Succeeded | auth.CredentialsLoginSucceeded | User successfully logged in with username and password |
| SSO Login Failed | auth.SsoLoginFailed | User login via SSO failed |
| SSO Login Succeeded | auth.SsoLoginSucceeded | User successfully logged in via SSO
| SSO Login Succeeded | auth.SsoLoginSucceeded | User successfully logged in via SSO |

### Environment

@@ -93,7 +93,7 @@ The audit log supports various events for different objects in dbt Cloud. You wi
| ------------- | ----------------------------- | ------------------------------ |
| Group Added | user_group.Added | New Group successfully created |
| Group Changed | user_group.Changed | Group settings changed |
| Group Removed | user_group.Changed | Group successfully removed |
| Group Removed | user_group.Removed | Group successfully removed |

### User

@@ -149,12 +149,65 @@ The audit log supports various events for different objects in dbt Cloud. You wi

### Credentials

| Event Name | Event Type | Description |
| -------------------------------- | ----------------------------- | -------------------------------- |
| Event Name | Event Type | Description |
| -------------------------------- | ----------------------------- | -----------------------|
| Credentials Added to Project | credentials.Added | Project credentials added |
| Credentials Changed in Project | credentials.Changed | Credentials changed in project |
| Credentials Removed from Project | credentials.Removed | Credentials removed from project |


### Git integration

| Event Name | Event Type | Description |
| -------------------------------- | ----------------------------- | -----------------------|
| GitLab Application Changed | gitlab_application.changed | GitLab configuration in dbt Cloud changed |

### Webhooks

| Event Name | Event Type | Description |
| -------------------------------- | ----------------------------- | -----------------------|
| Webhook Subscriptions Added | webhook_subscription.added | New webhook configured in settings |
| Webhook Subscriptions Changed | webhook_subscription.changed | Existing webhook configuration altered |
| Webhook Subscriptions Removed | webhook_subscription.removed | Existing webhook deleted |


### Semantic Layer

| Event Name | Event Type | Description |
| -------------------------------- | ----------------------------- | -----------------------|
| Semantic Layer Config Added | semantic_layer_config.added | Semantic Layer config added |
| Semantic Layer Config Changed | semantic_layer_config.changed | Semantic Layer config (not related to credentials) changed |
| Semantic Layer Config Removed | semantic_layer_config.removed | Semantic Layer config removed |
| Semantic Layer Credentials Added | semantic_layer_credentials.added | Semantic Layer credentials added |
| Semantic Layer Credentials Changed | semantic_layer_credentials.changed | Semantic Layer credentials changed. Does not trigger semantic_layer_config.changed |
| Semantic Layer Credentials Removed | semantic_layer_credentials.removed | Semantic Layer credentials removed |

### Extended attributes

| Event Name | Event Type | Description |
| -------------------------------- | ----------------------------- | -----------------------|
| Extended Attribute Added | extended_attributes.added | Extended attribute added to a project |
| Extended Attribute Changed | extended_attributes.changed | Extended attribute changed or removed |


### Account-scoped personal access token

| Event Name | Event Type | Description |
| -------------------------------- | ----------------------------- | -----------------------|
| Account Scoped Personal Access Token Created | account_scoped_pat.created | An account-scoped PAT was created |
| Account Scoped Personal Access Token Deleted | account_scoped_pat.deleted | An account-scoped PAT was deleted |

### IP restrictions

| Event Name | Event Type | Description |
| -------------------------------- | ----------------------------- | -----------------------|
| IP Restrictions Toggled | ip_restrictions.toggled | IP restrictions feature enabled or disabled |
| IP Restrictions Rule Added | ip_restrictions.rule.added | IP restriction rule created |
| IP Restrictions Rule Changed | ip_restrictions.rule.changed | IP restriction rule edited |
| IP Restrictions Rule Removed | ip_restrictions.rule.removed | IP restriction rule deleted |



## Searching the audit log

You can search the audit log to find a specific event or actor; searches are limited to the events listed in [Events in audit log](#events-in-audit-log). The audit log lists historical events spanning the last 90 days. You can search for an actor or event using the search bar, and then narrow your results using the time window.
