Merge branch 'current' into mwong-update-sl-env-vars
mirnawong1 authored Sep 25, 2024
2 parents a98f2f9 + e2197cc commit 33c0f62
Showing 28 changed files with 206 additions and 137 deletions.
38 changes: 23 additions & 15 deletions website/docs/docs/cloud/configure-cloud-cli.md
@@ -52,21 +52,29 @@ Once you install the dbt Cloud CLI, you need to configure it to connect to a dbt

The config file looks like this:

```yaml
version: "1"
context:
  active-project: "<project id from the list below>"
  active-host: "<active host from the list>"
  defer-env-id: "<optional defer environment id>"
projects:
  - project-id: "<project-id>"
    account-host: "<account-host>"
    api-key: "<user-api-key>"

  - project-id: "<project-id>"
    account-host: "<account-host>"
    api-key: "<user-api-key>"
```
```yaml
version: "1"
context:
  active-project: "<project id from the list below>"
  active-host: "<active host from the list>"
  defer-env-id: "<optional defer environment id>"
projects:
  - project-name: "<project-name>"
    project-id: "<project-id>"
    account-name: "<account-name>"
    account-id: "<account-id>"
    account-host: "<account-host>" # for example, "cloud.getdbt.com"
    token-name: "<pat-or-service-token-name>"
    token-value: "<pat-or-service-token-value>"

  - project-name: "<project-name>"
    project-id: "<project-id>"
    account-name: "<account-name>"
    account-id: "<account-id>"
    account-host: "<account-host>" # for example, "cloud.getdbt.com"
    token-name: "<pat-or-service-token-name>"
    token-value: "<pat-or-service-token-value>"
```
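For example, a minimal sketch of placing the downloaded credentials file and checking the connection might look like the following. It assumes the file is named `dbt_cloud.yml`, uses the default `~/.dbt/` directory, and the download location and project path shown are hypothetical:

```bash
# Create the default config directory and move the downloaded credentials file there
# (assumes the file is named dbt_cloud.yml; adjust the download path to match your system)
mkdir -p ~/.dbt
mv ~/Downloads/dbt_cloud.yml ~/.dbt/dbt_cloud.yml

# From inside a dbt project (hypothetical path), run a command to confirm the CLI
# can reach dbt Cloud with the configured project and token
cd ~/code/jaffle_shop
dbt compile
```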
3. After downloading the config file and creating your directory, navigate to a dbt project in your terminal:
253 changes: 161 additions & 92 deletions website/docs/docs/cloud/manage-access/about-access.md

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion website/docs/docs/cloud/manage-access/sso-overview.md
@@ -43,7 +43,7 @@ Then, assign all of these (and only these) to the user license. This step will a

## SSO enforcement

* **SSO Enforcement:** If you have SSO turned on in your organization, dbt Cloud will enforce SSO-only logins for all non-admin users. If an Account Admin already has a password, they can continue logging in with a password.
* **SSO Enforcement:** If SSO is turned on in your organization, dbt Cloud will enforce SSO-only logins for all non-admin users. By default, if an Account Admin or Security Admin already has a password, they can continue logging in with a password. To restrict admins from using passwords, turn off **Allow password logins for account administrators** in the **Single sign-on** section of your organization's **Account settings**.
* **SSO Re-Authentication:** dbt Cloud will prompt you to re-authenticate using your SSO provider every 24 hours to ensure high security.

### How should non-admin users log in?
8 changes: 4 additions & 4 deletions website/docs/docs/cloud/migration.md
@@ -11,15 +11,15 @@ dbt Labs is in the process of rolling out a new cell-based architecture for dbt

We're scheduling migrations by account. When we're ready to migrate your account, you will receive a banner or email communication with your migration date. If you have not received this communication, then you don't need to take action at this time. dbt Labs will share information about your migration with you, with appropriate advance notice, when applicable to your account.

Your account will be automatically migrated on its scheduled date. However, if you use certain features, you must take action before that date to avoid service disruptions.
Your account will be automatically migrated on or after its scheduled date. However, if you use certain features, you must take action before that date to avoid service disruptions.

## Recommended actions

We highly recommend you take these actions:

- Ensure pending user invitations are accepted or note outstanding invitations. Pending user invitations will be voided during the migration and must be resent after it is complete.
- Commit unsaved changes in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud). Unsaved changes will be lost during migration.
- Export and download [audit logs](/docs/cloud/manage-access/audit-log) older than 90 days, as they will be lost during migration. If you lose critical logs older than 90 days during the migration, you will have to work with the dbt Labs Customer Support team to recover.
- Ensure pending user invitations are accepted or note outstanding invitations. Pending user invitations might be voided during the migration. You can resend user invitations after the migration is complete.
- Commit unsaved changes in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud). Unsaved changes might be lost during migration.
- Export and download [audit logs](/docs/cloud/manage-access/audit-log) older than 90 days, as they will be unavailable from dbt Cloud after the migration is complete. Logs older than 90 days that are still within the data retention period are not deleted, but you will have to work with the dbt Labs Customer Support team to recover them.

## Required actions

33 changes: 15 additions & 18 deletions website/docs/guides/databricks-qs.md
@@ -41,36 +41,33 @@ You can check out [dbt Fundamentals](https://learn.getdbt.com/courses/dbt-fundam

## Create a Databricks workspace

1. Use your existing account or [sign up for a Databricks account](https://databricks.com/). Complete the form with your user information.
1. Use your existing account or [sign up for a Databricks account](https://databricks.com/). Complete the form with your user information and click **Continue**.

<div style={{maxWidth: '400px'}}>
<Lightbox src="/img/databricks_tutorial/images/signup_form.png" title="Sign up for Databricks" />
</div>

2. For the purpose of this tutorial, you will be selecting AWS as our cloud provider but if you use Azure or GCP internally, please choose one of them. The setup process will be similar.
3. Check your email to complete the verification process.
4. After setting up your password, you will be guided to choose a subscription plan. Select the `Premium` or `Enterprise` plan to access the SQL Compute functionality required for using the SQL warehouse for dbt. We have chosen `Premium` for this tutorial. Click **Continue** after selecting your plan.

<div style={{maxWidth: '400px'}}>
<Lightbox src="/img/databricks_tutorial/images/choose_plan.png" title="Choose Databricks Plan" />
2. On the next screen, select your cloud provider. This tutorial uses AWS as the cloud provider, but if you use Azure or GCP internally, please select your platform. The setup process will be similar. Do not select the **Get started with Community Edition** option, as this will not provide the required compute for this guide.

<div style={{maxWidth: '400px'}}>
<Lightbox src="/img/databricks_tutorial/images/choose_provider.png" title="Choose cloud provider" />
</div>

5. Click **Get Started** when you come to this below page and then **Confirm** after you validate that you have everything needed.
3. Check your email and complete the verification process.

4. After completing the verification process, you will be brought to the first setup screen. Databricks defaults to the `Premium` plan, and you can change the trial to `Enterprise` on this page.

<div style={{maxWidth: '400px'}}>
<Lightbox src="/img/databricks_tutorial/images/validate_1.png" />
</div>
<div style={{maxWidth: '400px'}}>
<Lightbox src="/img/databricks_tutorial/images/validate_2.png" />
<Lightbox src="/img/databricks_tutorial/images/choose_plan.png" title="Choose Databricks Plan" />
</div>

6. Now it's time to create your first workspace. A Databricks workspace is an environment for accessing all of your Databricks assets. The workspace organizes objects like notebooks, SQL warehouses, clusters, etc into one place. Provide the name of your workspace and choose the appropriate AWS region and click **Start Quickstart**. You might get the checkbox of **I have data in S3 that I want to query with Databricks**. You do not need to check this off for the purpose of this tutorial.
5. Now, it's time to create your first workspace. A Databricks workspace is an environment for accessing all of your Databricks assets. The workspace organizes objects like notebooks, SQL warehouses, clusters, and more into one place. Provide the name of your workspace, choose the appropriate AWS region, and click **Start Quickstart**. You might see a checkbox for **I have data in S3 that I want to query with Databricks**; you do not need to select it for this tutorial.

<div style={{maxWidth: '400px'}}>
<Lightbox src="/img/databricks_tutorial/images/setup_first_workspace.png" title="Setup First Workspace" />
<Lightbox src="/img/databricks_tutorial/images/start_quickstart.png" title="Create AWS resources" />
</div>

7. By clicking on `Start Quickstart`, you will be redirected to AWS and asked to log in if you haven’t already. After logging in, you should see a page similar to this.
6. By clicking on `Start Quickstart`, you will be redirected to AWS and asked to log in if you haven’t already. After logging in, you should see a page similar to this.

<div style={{maxWidth: '400px'}}>
<Lightbox src="/img/databricks_tutorial/images/quick_create_stack.png" title="Create AWS resources" />
@@ -80,7 +77,7 @@ You can check out [dbt Fundamentals](https://learn.getdbt.com/courses/dbt-fundam
If you get a session error and don’t get redirected to this page, you can go back to the Databricks UI and create a workspace from the interface. All you have to do is click **create workspaces**, choose the quickstart, fill out the form and click **Start Quickstart**.
:::

8. There is no need to change any of the pre-filled out fields in the Parameters. Just add in your Databricks password under **Databricks Account Credentials**. Check off the Acknowledgement and click **Create stack**.
7. There is no need to change any of the pre-filled fields under **Parameters**. Just add your Databricks password under **Databricks Account Credentials**, select the acknowledgement checkbox, and click **Create stack**.
<div style={{maxWidth: '400px'}}>
<Lightbox src="/img/databricks_tutorial/images/parameters.png" title="Parameters" />
</div>
Expand All @@ -89,11 +86,11 @@ If you get a session error and don’t get redirected to this page, you can go b
<Lightbox src="/img/databricks_tutorial/images/create_stack.png" title="Capabilities" />
</div>

10. Go back to the Databricks tab. You should see that your workspace is ready to use.
8. Go back to the Databricks tab. You should see that your workspace is ready to use.
<div style={{maxWidth: '400px'}}>
<Lightbox src="/img/databricks_tutorial/images/workspaces.png" title="A Databricks Workspace" />
</div>
11. Now let’s jump into the workspace. Click **Open** and log into the workspace using the same login as you used to log into the account.
9. Now let’s jump into the workspace. Click **Open** and log in to the workspace with the same credentials you used for your account.

## Load data

9 changes: 2 additions & 7 deletions website/docs/reference/node-selection/methods.md
@@ -44,13 +44,8 @@ Use the `resource_type` method to select nodes of a particular type (`model`, `t

```bash
dbt build --select "resource_type:exposure" # build all resources upstream of exposures
dbt list --select "resource_type:test" # list all tests in your project
```

Note: This method doesn't work for sources, so use the [`--resource-type`](/reference/commands/list) option of the list command instead:

```bash
dbt list --resource-type source
```

```bash
dbt build --select "resource_type:exposure" # build all resources upstream of exposures
dbt list --select "resource_type:test" # list all tests in your project
dbt list --select "resource_type:source" # list all sources in your project
```
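
For a further illustration, `resource_type:` can be combined with other selection methods using dbt's standard union (space) and intersection (comma) syntax. This is a hedged sketch; the `nightly` tag below is hypothetical:

```bash
# Union: list resources that are snapshots or seeds (space-separated criteria)
dbt list --select "resource_type:snapshot resource_type:seed"

# Intersection: build only models that also carry the (hypothetical) tag "nightly"
dbt build --select "resource_type:model,tag:nightly"
```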

### The "path" method
Binary file modified website/static/img/databricks_tutorial/images/choose_plan.png
Binary file modified website/static/img/databricks_tutorial/images/signup_form.png
Additional modified binary image files in this commit could not be displayed.
