diff --git a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-3-build-semantic-models.md b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-3-build-semantic-models.md index 98276776019..7990cf6752f 100644 --- a/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-3-build-semantic-models.md +++ b/website/docs/best-practices/how-we-build-our-metrics/semantic-layer-3-build-semantic-models.md @@ -293,7 +293,7 @@ semantic_models: Let's review the basics of semantic models: -- đŸ§± Consist off **entities, dimensions, and measures**. +- đŸ§± Consist of **entities, dimensions, and measures**. - đŸ«‚ Describe the **semantics and relationships of objects** in the warehouse. - 1ïžâƒŁ Correspond to a **single logical model** in your dbt project. diff --git a/website/docs/docs/build/dimensions.md b/website/docs/docs/build/dimensions.md index 6b665f4251c..affd74f81aa 100644 --- a/website/docs/docs/build/dimensions.md +++ b/website/docs/docs/build/dimensions.md @@ -124,7 +124,7 @@ mf query --metrics users_created,users_deleted --group-by metric_time__year --or You can set `is_partition` for time to define specific time spans. Additionally, use the `type_params` section to set `time_granularity` to adjust aggregation details (hourly, daily, weekly, and so on). - + diff --git a/website/docs/docs/build/metrics-overview.md b/website/docs/docs/build/metrics-overview.md index 586402b6847..38b9b22bdb2 100644 --- a/website/docs/docs/build/metrics-overview.md +++ b/website/docs/docs/build/metrics-overview.md @@ -90,7 +90,7 @@ import SLCourses from '/snippets/_sl-course.md'; -## Default granularity for metircs +## Default granularity for metrics It's possible to define a default time granularity for metrics if it's different from the granularity of the default aggregation time dimensions (`metric_time`). 
This is useful if your time dimension has a very fine grain, like second or hour, but you typically query metrics rolled up at a coarser grain. The granularity can be set using the `time_granularity` parameter on the metric, and defaults to `day`. If day is not available because the dimension is defined at a coarser granularity, it will default to the defined granularity for the dimension. diff --git a/website/docs/docs/cloud/account-settings.md b/website/docs/docs/cloud/account-settings.md new file mode 100644 index 00000000000..6d35da3b5f3 --- /dev/null +++ b/website/docs/docs/cloud/account-settings.md @@ -0,0 +1,50 @@ +--- +title: "Account settings in dbt Cloud" +sidebar_label: "Account settings" +description: "Learn how to enable account settings for your dbt Cloud users." +--- + +The following sections describe the different **Account settings** available from your dbt Cloud account in the sidebar (under your account name on the lower left-hand side). + + + +## Git repository caching + +At the start of every job run, dbt Cloud clones the project's Git repository so it has the latest versions of your project's code and runs `dbt deps` to install your dependencies. + +For improved reliability and performance on your job runs, you can enable dbt Cloud to keep a cache of the project's Git repository. So, if there's a third-party outage that causes the cloning operation to fail, dbt Cloud will instead use the cached copy of the repo so your jobs can continue running as scheduled. + +dbt Cloud caches your project's Git repo after each successful run and retains it for 8 days if there are no repo updates. It caches all packages regardless of installation method and does not fetch code outside of the job runs. + +dbt Cloud will use the cached copy of your project's Git repo under these circumstances: + +- Outages from third-party services (for example, the [dbt package hub](https://hub.getdbt.com/)). +- Git authentication fails. 
+- There are syntax errors in the `packages.yml` file. You can set up and use [continuous integration (CI)](/docs/deploy/continuous-integration) to find these errors sooner. +- A package doesn't work with the current dbt version. You can set up and use [continuous integration (CI)](/docs/deploy/continuous-integration) to identify this issue sooner. + +To enable this feature, select the **Enable repository caching** option from your account settings. + + + +## Partial parsing + +At the start of every dbt invocation, dbt reads all the files in your project, extracts information, and constructs an internal manifest containing every object (model, source, macro, and so on). Among other things, it uses the `ref()`, `source()`, and `config()` macro calls within models to set properties, infer dependencies, and construct your project's DAG. When dbt finishes parsing your project, it stores the internal manifest in a file called `partial_parse.msgpack`. + +Parsing projects can be time-consuming, especially for large projects with hundreds of models and thousands of files. To reduce the time it takes dbt to parse your project, use the partial parsing feature in dbt Cloud for your environment. When enabled, dbt Cloud uses the `partial_parse.msgpack` file to determine which files have changed (if any) since the project was last parsed, and then it parses _only_ the changed files and the files related to those changes. + +Partial parsing in dbt Cloud requires dbt version 1.4 or newer. The feature does have some known limitations. Refer to [Known limitations](/reference/parsing#known-limitations) to learn more about them. + +To enable this feature, select the **Enable partial parsing between deployment runs** option from your account settings. 
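The change detection that partial parsing relies on can be sketched as a small toy in Python. This is purely illustrative — dbt actually stores a richer serialized manifest in `partial_parse.msgpack`, and the function names below are invented for the sketch:

```python
import hashlib
from pathlib import Path

def file_checksums(project_dir: str) -> dict:
    """Map every .sql file in the project to a checksum of its contents."""
    return {
        str(p.relative_to(project_dir)): hashlib.md5(p.read_bytes()).hexdigest()
        for p in Path(project_dir).rglob("*.sql")
    }

def changed_files(previous: dict, current: dict) -> set:
    """Files added, removed, or edited since the last parse."""
    return {
        path
        for path in previous.keys() | current.keys()
        if previous.get(path) != current.get(path)
    }
```

Only the files flagged as changed (plus anything that depends on them) would need reparsing; everything else is reused from the stored state.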
+ + + +## Account access to Advanced CI features + +[Advanced CI](/docs/deploy/advanced-ci) features, such as [compare changes](/docs/deploy/advanced-ci#compare-changes), allow dbt Cloud account members to view details about the changes between what's in the production environment and the pull request. + +To use Advanced CI features, your dbt Cloud account must have access to them. Ask your dbt Cloud administrator to enable Advanced CI features on your account, which they can do by choosing the **Enable account access to Advanced CI** option from the account settings. + +Once enabled, the **Run compare changes** option becomes available in the CI job settings for you to select. + + \ No newline at end of file diff --git a/website/docs/docs/dbt-versions/2023-release-notes.md b/website/docs/docs/dbt-versions/2023-release-notes.md index d0af7f70126..60f4fd42929 100644 --- a/website/docs/docs/dbt-versions/2023-release-notes.md +++ b/website/docs/docs/dbt-versions/2023-release-notes.md @@ -110,7 +110,7 @@ Archived release notes for dbt Cloud from 2023 Now available for dbt Cloud Enterprise plans is a new option to enable Git repository caching for your job runs. When enabled, dbt Cloud caches your dbt project's Git repository and uses the cached copy instead if there's an outage with the Git provider. This feature improves the reliability and stability of your job runs. - To learn more, refer to [Repo caching](/docs/deploy/deploy-environments#git-repository-caching). + To learn more, refer to [Repo caching](/docs/cloud/account-settings#git-repository-caching). 
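The clone-with-cache-fallback behavior described in the Git repository caching sections above can be sketched as follows. This is a hypothetical illustration, not dbt Cloud's implementation — the function name, directory layout, and timeout are invented:

```python
import subprocess
from pathlib import Path

def checkout_project(repo_url: str, workdir: str, cache_dir: str) -> Path:
    """Clone the project's Git repo; fall back to the cached copy if cloning fails."""
    target = Path(workdir) / "project"
    try:
        subprocess.run(
            ["git", "clone", "--depth", "1", repo_url, str(target)],
            check=True, capture_output=True, timeout=60,
        )
        return target
    except (subprocess.CalledProcessError, subprocess.TimeoutExpired, FileNotFoundError):
        # Clone failed (provider outage, auth failure, or git unavailable):
        # run the job from the cached copy of the repo instead.
        cached = Path(cache_dir) / "project"
        if cached.exists():
            return cached
        raise
```

The key design point is that the cache is only consulted when the clone fails, so jobs normally run against the latest code and fall back only during an outage.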
diff --git a/website/docs/docs/dbt-versions/release-notes.md b/website/docs/docs/dbt-versions/release-notes.md index 79adea9bfbb..ca128402743 100644 --- a/website/docs/docs/dbt-versions/release-notes.md +++ b/website/docs/docs/dbt-versions/release-notes.md @@ -309,7 +309,7 @@ The following features are new or enhanced as part of our [dbt Cloud Launch Show January also saw some refreshed content, either aligning with new product features or requests from the community: - - Native support for [partial parsing in dbt Cloud](https://docs.getdbt.com/docs/dbt-cloud-environments#partial-parsing) + - Native support for [partial parsing in dbt Cloud](https://docs.getdbt.com/docs/cloud/account-settings#partial-parsing) - Updated guidance on using dots or underscores in the [Best practice guide for models](https://docs.getdbt.com/best-practices/how-we-style/1-how-we-style-our-dbt-models) - Updated [PrivateLink for VCS docs](https://docs.getdbt.com/docs/cloud/secure/vcs-privatelink) - Added a new `job_runner` role in our [Enterprise project role permissions docs](https://docs.getdbt.com/docs/cloud/manage-access/enterprise-permissions#project-role-permissions) @@ -326,7 +326,7 @@ The following features are new or enhanced as part of our [dbt Cloud Launch Show By default, dbt parses all the files in your project at the beginning of every dbt invocation. Depending on the size of your project, this operation can take a long time to complete. With the new partial parsing feature in dbt Cloud, you can reduce the time it takes for dbt to parse your project. When enabled, dbt Cloud parses only the changed files in your project instead of parsing all the project files. As a result, your dbt invocations will take less time to run. - To learn more, refer to [Partial parsing](/docs/dbt-cloud-environments#partial-parsing). + To learn more, refer to [Partial parsing](/docs/cloud/account-settings#partial-parsing). 
diff --git a/website/docs/docs/deploy/about-ci.md b/website/docs/docs/deploy/about-ci.md index 51b3b160fae..1de9365219c 100644 --- a/website/docs/docs/deploy/about-ci.md +++ b/website/docs/docs/deploy/about-ci.md @@ -6,7 +6,7 @@ pagination_next: "docs/deploy/continuous-integration" hide_table_of_contents: true --- -Use [CI jobs](/docs/deploy/ci-jobs) in dbt Cloud to set up automation for testing code changes before merging to production. Additionally, [enable Advanced CI features](/docs/dbt-cloud-environments#account-access-to-advanced-ci-features) for these jobs to evaluate whether the code changes are producing the appropriate data changes you want by reviewing the comparison differences dbt provides. +Use [CI jobs](/docs/deploy/ci-jobs) in dbt Cloud to set up automation for testing code changes before merging to production. Additionally, [enable Advanced CI features](/docs/cloud/account-settings#account-access-to-advanced-ci-features) for these jobs to evaluate whether the code changes are producing the appropriate data changes you want by reviewing the comparison differences dbt provides. Refer to the guide [Get started with continuous integration tests](/guides/set-up-ci?step=1) for more information. diff --git a/website/docs/docs/deploy/ci-jobs.md b/website/docs/docs/deploy/ci-jobs.md index 0c4cad9e674..649d64d30c7 100644 --- a/website/docs/docs/deploy/ci-jobs.md +++ b/website/docs/docs/deploy/ci-jobs.md @@ -15,7 +15,7 @@ dbt Labs recommends that you create your CI job in a dedicated dbt Cloud [deploy - CI features: - For both the [concurrent CI checks](/docs/deploy/continuous-integration#concurrent-ci-checks) and [smart cancellation of stale builds](/docs/deploy/continuous-integration#smart-cancellation) features, your dbt Cloud account must be on the [Team or Enterprise plan](https://www.getdbt.com/pricing/). 
- [Advanced CI](/docs/deploy/advanced-ci) features: - - For the [compare changes](/docs/deploy/advanced-ci#compare-changes) feature, your dbt Cloud account must have access to Advanced CI. Please ask your [dbt Cloud administrator to enable](/docs/dbt-cloud-environments#account-access-to-advanced-ci-features) this for you. + - For the [compare changes](/docs/deploy/advanced-ci#compare-changes) feature, your dbt Cloud account must have access to Advanced CI. Please ask your [dbt Cloud administrator to enable](/docs/cloud/account-settings#account-access-to-advanced-ci-features) this for you. - Set up a [connection with your Git provider](/docs/cloud/git/git-configuration-in-dbt-cloud). This integration lets dbt Cloud run jobs on your behalf for job triggering. - If you're using a native [GitLab](/docs/cloud/git/connect-gitlab) integration, you need a paid or self-hosted account that includes support for GitLab webhooks and [project access tokens](https://docs.gitlab.com/ee/user/project/settings/project_access_tokens.html). If you're using GitLab Free, merge requests will trigger CI jobs but CI job status updates (success or failure of the job) will not be reported back to GitLab. diff --git a/website/docs/guides/adapter-creation.md b/website/docs/guides/adapter-creation.md index f737afa0392..066d27a7aaa 100644 --- a/website/docs/guides/adapter-creation.md +++ b/website/docs/guides/adapter-creation.md @@ -556,6 +556,108 @@ While much of dbt's adapter-specific functionality can be modified in adapter ma See [this GitHub discussion](https://github.com/dbt-labs/dbt-core/discussions/5468) for information on the macros required for `GRANT` statements: +### Behavior change flags + +Starting in `dbt-adapters==1.5.0` and `dbt-core==1.8.7`, adapter maintainers can implement their own behavior change flags. Refer to [Behavior changes](https://docs.getdbt.com/reference/global-configs/behavior-changes) for more information. 
+ +Behavior flags are not intended to be long-lived feature flags. They should be implemented with the expectation that the behavior will become the default within a set period of time. To implement a behavior change flag, you must provide a name for the flag, a default setting (`True` / `False`), an optional source, and a description and/or a link to the flag's documentation on docs.getdbt.com. + +We recommend having a description and documentation link whenever possible. The description and/or docs should give end users context for why the flag exists, why they may see a warning, and why they may want to use the behavior flag. Behavior change flags can be implemented by overwriting `_behavior_flags()` on the adapter in `impl.py`: + + + +```python +class ABCAdapter(BaseAdapter): + ... + @property + def _behavior_flags(self) -> List[BehaviorFlag]: + return [ + { + "name": "enable_new_functionality_requiring_higher_permissions", + "default": False, + "source": "dbt-abc", + "description": ( + "The dbt-abc adapter is implementing a new method for sourcing metadata. " + "This is a more performant way for dbt to source metadata but requires higher permissions on the platform. " + "Enabling this without granting the requisite permissions will result in an error. " + "This feature is expected to be required by Spring 2025." + ), + "docs_url": "https://docs.getdbt.com/reference/global-configs/behavior-changes#abc-enable_new_functionality_requiring_higher_permissions", + } + ] +``` + + + +Once a behavior change flag has been implemented, it can be referenced on the adapter both in `impl.py` and in Jinja macros: + + + +```python +class ABCAdapter(BaseAdapter): + ... 
+ def some_method(self, *args, **kwargs): + if self.behavior.enable_new_functionality_requiring_higher_permissions: + # do the new thing + else: + # do the old thing +``` + + + + + +```sql +{% macro some_macro(**kwargs) %} + {% if adapter.behavior.enable_new_functionality_requiring_higher_permissions %} + {# do the new thing #} + {% else %} + {# do the old thing #} + {% endif %} +{% endmacro %} +``` + + + +Every time the behavior flag evaluates to `False`, it warns the user, informing them that a change will occur in the future. + +This warning doesn't display when the flag evaluates to `True`, as the user is already in the new experience. + +Recognizing that the warnings can be disruptive and are not always necessary, you can evaluate the flag without triggering the warning. Simply append `.no_warn` to the end of the flag. + + + + +```python + class ABCAdapter(BaseAdapter): + ... + def some_method(self, *args, **kwargs): + if self.behavior.enable_new_functionality_requiring_higher_permissions.no_warn: + # do the new thing + else: + # do the old thing +``` + + + + + +```sql +{% macro some_macro(**kwargs) %} + {% if adapter.behavior.enable_new_functionality_requiring_higher_permissions.no_warn %} + {# do the new thing #} + {% else %} + {# do the old thing #} + {% endif %} +{% endmacro %} +``` + + + +It's best practice to evaluate a behavior flag as few times as possible; this makes the flag easier to remove once the behavior change has matured. + +As a result, it's easiest to evaluate the flag early in the logic flow and then take either the old or the new path. While this may create some duplication in code, using behavior flags in this way provides a safer way to implement a change that is, by its nature, risky or even breaking. 
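The advice above — evaluate the flag once, early, then commit to a single code path — can be sketched with a minimal stand-in class. This is not the real `BaseAdapter`; the class and method names are invented for illustration:

```python
class MetadataSource:
    """Toy stand-in for an adapter that gates a new code path behind a flag."""

    def __init__(self, new_path_enabled: bool):
        # Stand-in for self.behavior.<flag_name>; real adapters get this
        # from dbt-core via the _behavior_flags property.
        self.new_path_enabled = new_path_enabled

    def get_metadata(self) -> str:
        # Single decision point: branch once, then stay on that path.
        if self.new_path_enabled:
            return self._get_metadata_new()
        return self._get_metadata_legacy()

    def _get_metadata_new(self) -> str:
        return "metadata from the new, higher-permission source"

    def _get_metadata_legacy(self) -> str:
        return "metadata from the legacy source"
```

Keeping the flag check in one place means removing it later is a one-line change, even if the new and legacy methods temporarily duplicate some logic.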
+ ### Other files #### `profile_template.yml` diff --git a/website/docs/guides/core-to-cloud-1.md b/website/docs/guides/core-to-cloud-1.md index 99c6ed82bf1..efed66c862a 100644 --- a/website/docs/guides/core-to-cloud-1.md +++ b/website/docs/guides/core-to-cloud-1.md @@ -76,9 +76,9 @@ This section outlines the steps to set up your dbt Cloud account and configure i ### Additional configuration Explore these additional configurations for performance and reliability improvements: -1. In **Account settings**, enable [partial parsing](/docs/deploy/deploy-environments#partial-parsing) to only reparse changed files, saving time. +1. In **Account settings**, enable [partial parsing](/docs/cloud/account-settings#partial-parsing) to only reparse changed files, saving time. -2. In **Account settings**, enable [Git repo caching](/docs/deploy/deploy-environments#git-repository-caching) for job reliability & third-party outage protection. +2. In **Account settings**, enable [Git repo caching](/docs/cloud/account-settings#git-repository-caching) for job reliability & third-party outage protection. ## Data platform setup @@ -142,7 +142,7 @@ The most common data environments are production, staging, and development. The - Streamlining the process of switching between development, staging, and production contexts. - Making it easy to configure environments through the dbt Cloud UI instead of manually editing the `profiles.yml` file. You can also [set up](/reference/dbt-jinja-functions/target) or [customize](/docs/build/custom-target-names) target names in dbt Cloud. - Adding `profiles.yml` attributes to dbt Cloud environment settings with [Extended Attributes](/docs/dbt-cloud-environments#extended-attributes). -- Using [Git repo caching](/docs/dbt-cloud-environments#git-repository-caching) to protect you from third-party outages, Git auth failures, and more. 
+- Using [Git repo caching](/docs/cloud/account-settings#git-repository-caching) to protect you from third-party outages, Git auth failures, and more. ### Initial setup steps 1. **Set up development environment** — Set up your [development](/docs/dbt-cloud-environments#create-a-development-environment) environment and [development credentials](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud#access-the-cloud-ide). You’ll need this to access your dbt project and start developing. diff --git a/website/docs/guides/core-to-cloud-3.md b/website/docs/guides/core-to-cloud-3.md index 0ea22de8478..7d482d54471 100644 --- a/website/docs/guides/core-to-cloud-3.md +++ b/website/docs/guides/core-to-cloud-3.md @@ -99,11 +99,11 @@ dbt Cloud provides robust orchestration that enables you to schedule, run, and m ### Tips -- Enable [partial parsing](/docs/deploy/deploy-environments#partial-parsing) between jobs in dbt Cloud to significantly speed up project parsing by only processing changed files, optimizing performance for large projects. +- Enable [partial parsing](/docs/cloud/account-settings#partial-parsing) between jobs in dbt Cloud to significantly speed up project parsing by only processing changed files, optimizing performance for large projects. - [Run multiple CI/CD](/docs/deploy/continuous-integration) jobs at the same time which will not block production runs. The Job scheduler automatically cancels stale runs when a newer commit is pushed. This is because each PR will run in its own schema. - dbt Cloud automatically [cancels](/docs/deploy/job-scheduler#run-cancellation-for-over-scheduled-jobs) a scheduled run if the existing run is still executing. This prevents unnecessary, duplicative executions. -- Protect you and your data freshness from third-party outages by enabling dbt Cloud’s [Git repository caching](/docs/deploy/deploy-environments#git-repository-caching), which keeps a cache of the project's Git repository. 
-- [Link deploy jobs](/docs/deploy/deploy-jobs#trigger-on-job-completion--) across dbt Cloud projects by configuring your job or using the [Create Job API](/dbt-cloud/api-v2#/operations/Create%20Job) to do this. +- Protect you and your data freshness from third-party outages by enabling dbt Cloud’s [Git repository caching](/docs/cloud/account-settings#git-repository-caching), which keeps a cache of the project's Git repository. +- [Link deploy jobs](/docs/deploy/deploy-jobs#trigger-on-job-completion) across dbt Cloud projects by configuring your job or using the [Create Job API](/dbt-cloud/api-v2#/operations/Create%20Job) to do this. - [Rerun your jobs](/docs/deploy/retry-jobs) from the start or the point of failure if your dbt job run completed with a status of **`Error.`** ### Caveats diff --git a/website/docs/reference/commands/retry.md b/website/docs/reference/commands/retry.md index 8da5d5a77a6..68d18dfd77a 100644 --- a/website/docs/reference/commands/retry.md +++ b/website/docs/reference/commands/retry.md @@ -10,6 +10,8 @@ Retry works with the following commands: - [`build`](/reference/commands/build) - [`compile`](/reference/commands/compile) +- [`clone`](/reference/commands/clone) +- [`docs generate`](/reference/commands/cmd-docs#dbt-docs-generate) - [`seed`](/reference/commands/seed) - [`snapshot`](/reference/commands/build) - [`test`](/reference/commands/test) diff --git a/website/docs/reference/dbt-jinja-functions/adapter.md b/website/docs/reference/dbt-jinja-functions/adapter.md index 7d2ae696a78..54e1e31fd84 100644 --- a/website/docs/reference/dbt-jinja-functions/adapter.md +++ b/website/docs/reference/dbt-jinja-functions/adapter.md @@ -190,7 +190,7 @@ Drops a schema (or equivalent) in the target database. 
If the target schema does ```sql -{% do adapter.drop_schema(api.Relation.create(database=target.database, schema="my_schema"))) %} +{% do adapter.drop_schema(api.Relation.create(database=target.database, schema="my_schema")) %} ``` diff --git a/website/docs/reference/global-configs/behavior-changes.md b/website/docs/reference/global-configs/behavior-changes.md index 4abdd7d2104..8e9e93b8488 100644 --- a/website/docs/reference/global-configs/behavior-changes.md +++ b/website/docs/reference/global-configs/behavior-changes.md @@ -41,7 +41,12 @@ By contrast, behavior change migrations happen slowly, over the course of months These flags _must_ be set in the `flags` dictionary in `dbt_project.yml`. They configure behaviors closely tied to project code, which means they should be defined in version control and modified through pull or merge requests, with the same testing and peer review. -The following example displays the current flags and their current default values in the latest dbt Cloud and dbt Core versions. To opt out of a specific behavior change, set the values of the flag to `False` in `dbt_project.yml`. You'll continue to see warnings for legacy behaviors that you have opted out of explicitly until you either resolve them (switch the flag to `True`) or choose to silence the warnings using the `warn_error_options.silence` flag. +The following example displays the current flags and their current default values in the latest dbt Cloud and dbt Core versions. To opt out of a specific behavior change, set the values of the flag to `False` in `dbt_project.yml`. 
You will continue to see warnings for legacy behaviors you’ve opted out of, until you either: + +- Resolve the issue (by switching the flag to `True`) +- Silence the warnings using the `warn_error_options.silence` flag + +Here's an example of the available behavior change flags with their default values: @@ -50,6 +55,7 @@ flags: require_explicit_package_overrides_for_builtin_materializations: False require_model_names_without_spaces: False source_freshness_run_project_hooks: False + restrict_direct_pg_catalog_access: False ``` @@ -61,7 +67,7 @@ When we use dbt Cloud in the following table, we're referring to accounts that h | require_explicit_package_overrides_for_builtin_materializations | 2024.04.141 | 2024.06.192 | 1.6.14, 1.7.14 | 1.8.0 | | require_resource_names_without_spaces | 2024.05.146 | TBD* | 1.8.0 | 1.9.0 | | source_freshness_run_project_hooks | 2024.03.61 | TBD* | 1.8.0 | 1.9.0 | -| [Redshift] restrict_direct_pg_catalog_access | 2024.09.242 | TBD* | dbt-redshift v1.9.0 | 1.9.0 | +| [Redshift] [restrict_direct_pg_catalog_access](#redshift-restrict_direct_pg_catalog_access) | 2024.09.242 | TBD* | dbt-redshift v1.9.0 | 1.9.0 | When the dbt Cloud Maturity is "TBD," it means we have not yet determined the exact date when these flags' default values will change. Affected users will see deprecation warnings in the meantime, and they will receive emails providing advance warning ahead of the maturity date. In the meantime, if you are seeing a deprecation warning, you can either: - Migrate your project to support the new behavior, and then set the flag to `True` to stop seeing the warnings. 
diff --git a/website/docs/reference/global-configs/usage-stats.md b/website/docs/reference/global-configs/usage-stats.md index 62ead8834a6..73610c29586 100644 --- a/website/docs/reference/global-configs/usage-stats.md +++ b/website/docs/reference/global-configs/usage-stats.md @@ -25,6 +25,6 @@ For full transparency, you can see all the event definitions in [`tracking.py`]( send_anonymous_usage_stats: False ``` - dbt Core users can also use the` DO_NOT_TRACK` environment variable to enable or disable sending anonymous data. For more information, see [Environment variables](/docs/build/environment-variables). + dbt Core users can also use the `DO_NOT_TRACK` environment variable to enable or disable sending anonymous data. For more information, see [Environment variables](/docs/build/environment-variables). `DO_NOT_TRACK=1` is the same as `DBT_SEND_ANONYMOUS_USAGE_STATS=False` diff --git a/website/sidebars.js b/website/sidebars.js index fe1118b3be2..3ecff4567ce 100644 --- a/website/sidebars.js +++ b/website/sidebars.js @@ -48,6 +48,7 @@ const sidebarSettings = { link: { type: "doc", id: "docs/cloud/about-cloud-setup" }, items: [ "docs/cloud/about-cloud-setup", + "docs/cloud/account-settings", "docs/dbt-cloud-environments", "docs/cloud/migration", { diff --git a/website/snippets/_cloud-environments-info.md b/website/snippets/_cloud-environments-info.md index 7c6cf2f9431..6addd6a3a7a 100644 --- a/website/snippets/_cloud-environments-info.md +++ b/website/snippets/_cloud-environments-info.md @@ -82,44 +82,5 @@ If you're developing in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in #### Only the **top-level keys** are accepted in extended attributes This means that if you want to change a specific sub-key value, you must provide the entire top-level key as a JSON block in your resulting YAML. 
For example, if you want to customize a particular field within a [service account JSON](/docs/core/connect-data-platform/bigquery-setup#service-account-json) for your BigQuery connection (like 'project_id' or 'client_email'), you need to provide an override for the entire top-level `keyfile_json` main key/attribute using extended attributes. Include the sub-fields as a nested JSON block. -### Git repository caching -At the start of every job run, dbt Cloud clones the project's Git repository so it has the latest versions of your project's code and runs `dbt deps` to install your dependencies. - -For improved reliability and performance on your job runs, you can enable dbt Cloud to keep a cache of the project's Git repository. So, if there's a third-party outage that causes the cloning operation to fail, dbt Cloud will instead use the cached copy of the repo so your jobs can continue running as scheduled. - -dbt Cloud caches your project's Git repo after each successful run and retains it for 8 days if there are no repo updates. It caches all packages regardless of installation method and does not fetch code outside of the job runs. - -dbt Cloud will use the cached copy of your project's Git repo under these circumstances: - -- Outages from third-party services (for example, the [dbt package hub](https://hub.getdbt.com/)). -- Git authentication fails. -- There are syntax errors in the `packages.yml` file. You can set up and use [continuous integration (CI)](/docs/deploy/continuous-integration) to find these errors sooner. -- If a package doesn't work with the current dbt version. You can set up and use [continuous integration (CI)](/docs/deploy/continuous-integration) to identify this issue sooner. - -To enable Git repository caching, select **Account settings** from the gear menu and enable the **Repository caching** option. 
- - - -### Partial parsing - -At the start of every dbt invocation, dbt reads all the files in your project, extracts information, and constructs an internal manifest containing every object (model, source, macro, and so on). Among other things, it uses the `ref()`, `source()`, and `config()` macro calls within models to set properties, infer dependencies, and construct your project's DAG. When dbt finishes parsing your project, it stores the internal manifest in a file called `partial_parse.msgpack`. - -Parsing projects can be time-consuming, especially for large projects with hundreds of models and thousands of files. To reduce the time it takes dbt to parse your project, use the partial parsing feature in dbt Cloud for your environment. When enabled, dbt Cloud uses the `partial_parse.msgpack` file to determine which files have changed (if any) since the project was last parsed, and then it parses _only_ the changed files and the files related to those changes. - -Partial parsing in dbt Cloud requires dbt version 1.4 or newer. The feature does have some known limitations. Refer to [Known limitations](/reference/parsing#known-limitations) to learn more about them. - -To enable, select **Account settings** from the gear menu and enable the **Partial parsing** option. - - - -### Account access to Advanced CI features - -[Advanced CI](/docs/deploy/advanced-ci) features, such as [compare changes](/docs/deploy/advanced-ci#compare-changes), allow dbt Cloud account members to view details about the changes between what's in the production environment and the pull request. - -To use Advanced CI features, your dbt Cloud account must have access to them. Ask your dbt Cloud administrator to enable Advanced CI features on your account, which they can do by selecting **Account settings** from the gear menu and choosing the **Enable account access to Advanced CI** option. 
- -Once enabled, the **Run compare changes** option becomes available in the CI job settings for you to select. - - diff --git a/website/static/img/docs/dbt-cloud/example-sidebar-account-settings.png b/website/static/img/docs/dbt-cloud/example-sidebar-account-settings.png new file mode 100644 index 00000000000..9b2ba860145 Binary files /dev/null and b/website/static/img/docs/dbt-cloud/example-sidebar-account-settings.png differ