diff --git a/website/docs/docs/build/packages.md b/website/docs/docs/build/packages.md
index 9ba4ceeaff5..82ba2c3d74c 100644
--- a/website/docs/docs/build/packages.md
+++ b/website/docs/docs/build/packages.md
@@ -157,9 +157,57 @@ packages:
Where `name: 'dbt_utils'` specifies the subfolder of `dbt_packages` that's created for the package source code to be installed within.
-### Private packages
+## Private packages
-#### SSH Key Method (Command Line only)
+### Native private packages
+
+dbt Cloud supports private packages from [supported](#prerequisites) Git repos by leveraging an existing [Git configuration](/docs/cloud/git/git-configuration-in-dbt-cloud) in your environment. Previously, you had to configure a [token](#git-token-method) to retrieve packages from your private repos.
+
+#### Prerequisites
+
+To use native private packages, you must have one of the following Git providers configured in the **Integrations** section of your **Account settings**:
+- [GitHub](/docs/cloud/git/connect-github)
+- [Azure DevOps](/docs/cloud/git/connect-azure-devops)
+
+Support for GitLab is coming soon.
+
+#### Configuration
+
+Use the `private` key in your `packages.yml` or `dependencies.yml` to clone package repos using your existing dbt Cloud Git integration without having to provision an access token or create a dbt Cloud environment variable:
+
+
+
+```yaml
+packages:
+  - private: dbt-labs/awesome_repo
+  - package: dbt-labs/dbt_utils # regular packages can be listed alongside private packages
+    version: 1.1.1
+
+  [...]
+```
+
+
+
+You can pin private packages similar to regular dbt packages:
+
+```yaml
+packages:
+  - private: dbt-labs/awesome_repo
+    revision: "0.9.5" # Pin to a tag, branch, or complete 40-character commit hash
+```
+
+If you are using multiple Git integrations, disambiguate by adding the `provider` key:
+
+```yaml
+packages:
+  - private: dbt-labs/awesome_repo
+    provider: "github" # GitHub and Azure DevOps are currently supported. GitLab is coming soon.
+```
+
+With this method, you can retrieve private packages from an integrated Git provider without any additional steps to connect.
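+
+If needed, the keys shown above can be combined in a single entry. The following is a minimal sketch, reusing the example repo from above and assuming your integration is GitHub:
+
+```yaml
+packages:
+  - private: dbt-labs/awesome_repo
+    revision: "0.9.5" # pin to a tag, branch, or commit hash
+    provider: "github" # only needed when multiple Git integrations are configured
+```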
+
+### SSH key method (command line only)
If you're using the Command Line, private packages can be cloned via SSH and an SSH key.
When you use SSH keys to authenticate to your git remote server, you don’t need to supply your username and password each time. Read more about SSH keys, how to generate them, and how to add them to your git provider here: [Github](https://docs.github.com/en/github/authenticating-to-github/connecting-to-github-with-ssh) and [GitLab](https://docs.gitlab.com/ee/user/ssh.html).
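+
+For example, a `packages.yml` entry that clones a private repo over SSH might look like the following sketch; the repo URL and revision are placeholders for your own values:
+
+```yaml
+packages:
+  - git: "git@github.com:your-org/private_repo.git" # SSH URL of the private repo
+    revision: main # optionally pin to a tag, branch, or commit hash
+```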
@@ -177,7 +225,14 @@ packages:
If you're using dbt Cloud, the SSH key method will not work, but you can use the [HTTPS Git Token Method](https://docs.getdbt.com/docs/build/packages#git-token-method).
-#### Git token method
+### Git token method
+
+:::note
+
+dbt Cloud has [native support](#native-private-packages) for Git-hosted private packages with GitHub and Azure DevOps (GitLab support is coming soon). If you're using a supported [integrated Git environment](/docs/cloud/git/git-configuration-in-dbt-cloud), you no longer need to configure Git tokens to retrieve private packages.
+
+:::
+
This method allows the user to clone via HTTPS by passing in a git token via an environment variable. Be careful of the expiration date of any token you use, as an expired token could cause a scheduled run to fail. Additionally, user tokens can create a challenge if the user ever loses access to a specific repo.
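+
+For example, with a GitHub repo, the entry might look something like the sketch below. The environment variable name is illustrative; in dbt Cloud, prefixing it with `DBT_ENV_SECRET` marks it as a secret so the token is scrubbed from logs:
+
+```yaml
+packages:
+  # the token is read from an environment variable when dbt parses the project
+  - git: "https://{{env_var('DBT_ENV_SECRET_GIT_CREDENTIAL')}}@github.com/your-org/private_repo.git"
+```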
@@ -246,7 +301,7 @@ Read more about creating a Personal Access Token [here](https://confluence.atlas
-#### Configure subdirectory for packaged projects
+## Configure subdirectory for packaged projects
In general, dbt expects `dbt_project.yml` to be located as a top-level file in a package. If the packaged project is instead nested in a subdirectory—perhaps within a much larger mono repo—you can optionally specify the folder path as `subdirectory`. dbt will attempt a [sparse checkout](https://git-scm.com/docs/git-sparse-checkout) of just the files located within that subdirectory. Note that you must be using a recent version of `git` (`>=2.26.0`).
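+
+For example, here is a sketch with a hypothetical mono repo whose dbt project lives in an `analytics/` subfolder:
+
+```yaml
+packages:
+  - git: "https://github.com/your-org/mono_repo.git" # hypothetical repo URL
+    subdirectory: "analytics" # folder that contains dbt_project.yml
+    revision: main
+```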
diff --git a/website/docs/docs/deploy/about-ci.md b/website/docs/docs/deploy/about-ci.md
index 1de9365219c..e27d2e7d08e 100644
--- a/website/docs/docs/deploy/about-ci.md
+++ b/website/docs/docs/deploy/about-ci.md
@@ -19,9 +19,9 @@ Refer to the guide [Get started with continuous integration tests](/guides/set-u
icon="dbt-bit"/>
-
\ No newline at end of file
+
diff --git a/website/docs/docs/deploy/deploy-jobs.md b/website/docs/docs/deploy/deploy-jobs.md
index 96ec8a1932e..9a0cc3cfcfa 100644
--- a/website/docs/docs/deploy/deploy-jobs.md
+++ b/website/docs/docs/deploy/deploy-jobs.md
@@ -13,7 +13,7 @@ You can use deploy jobs to build production data assets. Deploy jobs make it eas
- Job run details, including run timing, [model timing data](/docs/deploy/run-visibility#model-timing), and [artifacts](/docs/deploy/artifacts)
- Detailed run steps with logs and their run step statuses
-You can create a deploy job and configure it to run on [scheduled days and times](#schedule-days) or enter a [custom cron schedule](#cron-schedule).
+You can create a deploy job and configure it to run on [scheduled days and times](#schedule-days), enter a [custom cron schedule](#cron-schedule), or [trigger the job after another job completes](#trigger-on-job-completion).
## Prerequisites
@@ -115,11 +115,18 @@ Examples of cron job schedules:
### Trigger on job completion
-To _chain_ deploy jobs together, enable the **Run when another job finishes** option and specify the upstream (parent) job that, when it completes, will trigger your job. You can also use the [Create Job API](/dbt-cloud/api-v2#/operations/Create%20Job) to do this.
+To _chain_ deploy jobs together:
+1. In the **Triggers** section, enable the **Run when another job finishes** option.
+2. Select the project that has the deploy job you want to run after completion.
+3. Specify the upstream (parent) job that, when completed, will trigger your job.
+ - You can also use the [Create Job API](/dbt-cloud/api-v2#/operations/Create%20Job) to do this.
+4. In the **Completes on** option, select the job run status(es) that will [enqueue](/docs/deploy/job-scheduler#scheduler-queue) the deploy job.
-You can set up a configuration where an upstream job triggers multiple downstream (child) jobs and jobs in other projects. You must have proper [permissions](/docs/cloud/manage-access/enterprise-permissions#project-role-permissions) to the project and job to configure the trigger.
+
-For jobs that are triggered to run by another job, a link to the upstream job run is available from your [job's run details](/docs/deploy/run-visibility#job-run-details).
+5. (Optional) Set up a configuration where an upstream job triggers multiple downstream (child) jobs, including jobs in other projects. You must have proper [permissions](/docs/cloud/manage-access/enterprise-permissions#project-role-permissions) to the project and job to configure the trigger.
+
+If another job triggers your job to run, you can find a link to the upstream job in the [run details section](/docs/deploy/run-visibility#job-run-details).
## Related docs
diff --git a/website/docs/docs/deploy/deployment-overview.md b/website/docs/docs/deploy/deployment-overview.md
index 9382634812f..e9c25f68c08 100644
--- a/website/docs/docs/deploy/deployment-overview.md
+++ b/website/docs/docs/deploy/deployment-overview.md
@@ -33,7 +33,7 @@ Learn how to use dbt Cloud's features to help your team ship timely and quality
diff --git a/website/docs/docs/deploy/jobs.md b/website/docs/docs/deploy/jobs.md
index 08d6cc585ef..1826836d602 100644
--- a/website/docs/docs/deploy/jobs.md
+++ b/website/docs/docs/deploy/jobs.md
@@ -4,21 +4,22 @@ sidebar_label: "About Jobs"
description: "Learn about the different job types in dbt Cloud and what their differences are."
tags: [scheduler]
pagination_next: "docs/deploy/deploy-jobs"
+hide_table_of_contents: true
---
These are the available job types in dbt Cloud:
-- [Deploy jobs](/docs/deploy/deploy-jobs) — To create and set up triggers for building production data assets
-- [Continuous integration (CI) jobs](/docs/deploy/continuous-integration) — To create and set up triggers for checking code changes
-- [Merge jobs](/docs/deploy/merge-jobs) — To create and set up triggers for merged pull requests
+- [Deploy jobs](/docs/deploy/deploy-jobs) — Build production data assets. Run on a schedule, by API, or after another job completes.
+- [Continuous integration (CI) jobs](/docs/deploy/continuous-integration) — Test and validate code changes before merging. Triggered by a commit to a PR or by API.
+- [Merge jobs](/docs/deploy/merge-jobs) — Deploy merged changes into production. Run after a successful PR merge or by API.
-Below is a comparison table that describes the behaviors of the different job types:
+The following comparison table describes the behaviors of the different job types:
| | **Deploy jobs** | **CI jobs** | **Merge jobs** |
| --- | --- | --- | --- |
| Purpose | Builds production data assets. | Builds and tests new code before merging changes into production. | Build merged changes into production or update state for deferral. |
-| Trigger types | Triggered by a schedule or by API. | Triggered by a commit to a PR or by API. | Triggered by a successful merge into the environment's branch or by API.|
+| Trigger types | Triggered by a schedule, API, or the successful completion of another job. | Triggered by a commit to a PR or by API. | Triggered by a successful merge into the environment's branch or by API.|
| Destination | Builds into a production database and schema. | Builds into a staging database and ephemeral schema, lived for the lifetime of the PR. | Builds into a production database and schema. |
| Execution mode | Runs execute sequentially, so as to not have collisions on the underlying DAG. | Runs execute in parallel to promote team velocity. | Runs execute sequentially, so as to not have collisions on the underlying DAG. |
| Efficiency run savings | Detects over-scheduled jobs and cancels unnecessary runs to avoid queue clog. | Cancels existing runs when a newer commit is pushed to avoid redundant work. | N/A |
| State comparison | Only sometimes needs to detect state. | Almost always needs to compare state against the production environment to build on modified code and its dependents. | Almost always needs to compare state against the production environment to build on modified code and its dependents. |
-| Job run duration | Limit is 24 hours. | Limit is 24 hours. | Limit is 24 hours. |
\ No newline at end of file
+| Job run duration | Limit is 24 hours. | Limit is 24 hours. | Limit is 24 hours. |
diff --git a/website/static/img/docs/deploy/deploy-job-completion.jpg b/website/static/img/docs/deploy/deploy-job-completion.jpg
new file mode 100644
index 00000000000..67b76950df3
Binary files /dev/null and b/website/static/img/docs/deploy/deploy-job-completion.jpg differ