diff --git a/website/docs/docs/build/environment-variables.md b/website/docs/docs/build/environment-variables.md index 8f3c968e50b..955bb79ed22 100644 --- a/website/docs/docs/build/environment-variables.md +++ b/website/docs/docs/build/environment-variables.md @@ -139,10 +139,14 @@ _The following variables are currently only available for GitHub, GitLab, and Az Environment variables can be used in many ways, and they give you the power and flexibility to do what you want to do more easily in dbt Cloud. -#### Clone private packages + + Now that you can set secrets as environment variables, you can pass git tokens into your package HTTPS URLs to allow for on-the-fly cloning of private repositories. Read more about enabling [private package cloning](/docs/build/packages#private-packages). -#### Dynamically set your warehouse in your Snowflake connection + + + + Environment variables make it possible to dynamically change the Snowflake virtual warehouse size depending on the job. Instead of calling the warehouse name directly in your project connection, you can reference an environment variable which will get set to a specific virtual warehouse at runtime. For example, suppose you'd like to run a full-refresh job in an XL warehouse, but your incremental job only needs to run in a medium-sized warehouse. Both jobs are configured in the same dbt Cloud environment. In your connection configuration, you can use an environment variable to set the warehouse name to `{{env_var('DBT_WAREHOUSE')}}`. Then in the job settings, you can set a different value for the `DBT_WAREHOUSE` environment variable depending on the job's workload. @@ -163,7 +167,10 @@ However, there are some limitations when using env vars with Snowflake OAuth Con Something to note, if you supply an environment variable in the account/host field, Snowflake OAuth Connection will **fail** to connect. 
This happens because the field doesn't pass through Jinja rendering, so dbt Cloud simply passes the literal `env_var` code into a URL string like `{{ env_var("DBT_ACCOUNT_HOST_NAME") }}.snowflakecomputing.com`, which is an invalid hostname. Use [extended attributes](/docs/deploy/deploy-environments#deployment-credentials) instead. ::: -#### Audit your run metadata + + + + Here's another motivating example that uses the dbt Cloud run ID, which is set automatically at each run. This additional data field can be used for auditing and debugging: ```sql @@ -189,3 +196,13 @@ select *, from users_aggregated ``` + + + + + +import SLEnvVars from '/snippets/_sl-env-vars.md'; + + + + diff --git a/website/docs/docs/build/metricflow-commands.md b/website/docs/docs/build/metricflow-commands.md index 1f50e501261..55472ba53ce 100644 --- a/website/docs/docs/build/metricflow-commands.md +++ b/website/docs/docs/build/metricflow-commands.md @@ -8,7 +8,7 @@ tags: [Metrics, Semantic Layer] Once you define metrics in your dbt project, you can query metrics, dimensions, and dimension values, and validate your configs using the MetricFlow commands. -MetricFlow allows you to define and query metrics in your dbt project in the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation), [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud), or [dbt Core](/docs/core/installation-overview). To experience the power of the universal [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) and dynamically query those metrics in downstream tools, you'll need a dbt Cloud [Team or Enterprise](https://www.getdbt.com/pricing/) account. +MetricFlow allows you to define and query metrics in your dbt project in the [dbt Cloud](/docs/cloud/about-develop-dbt) or [dbt Core](/docs/core/installation-overview). 
To experience the power of the universal [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) and dynamically query those metrics in downstream tools, you'll need a dbt Cloud [Team or Enterprise](https://www.getdbt.com/pricing/) account. MetricFlow is compatible with Python versions 3.8, 3.9, 3.10, and 3.11. @@ -18,33 +18,18 @@ MetricFlow is a dbt package that allows you to define and query metrics in your Using MetricFlow with dbt Cloud means you won't need to manage versioning — your dbt Cloud account will automatically manage the versioning. -**dbt Cloud jobs** — MetricFlow commands aren't supported in dbt Cloud jobs yet. However, you can add MetricFlow validations with your git provider (such as GitHub Actions) by installing MetricFlow (`python -m pip install metricflow`). This allows you to run MetricFlow commands as part of your continuous integration checks on PRs. +dbt Cloud jobs support the `dbt sl validate` command to [automatically test your semantic nodes](/docs/deploy/ci-jobs#semantic-validations-in-ci). You can also add MetricFlow validations with your git provider (such as GitHub Actions) by installing MetricFlow (`python -m pip install metricflow`). This allows you to run MetricFlow commands as part of your continuous integration checks on PRs. - + -- MetricFlow [commands](#metricflow-commands) are embedded in the dbt Cloud CLI. This means you can immediately run them once you install the dbt Cloud CLI and don't need to install MetricFlow separately. -- You don't need to manage versioning — your dbt Cloud account will automatically manage the versioning for you. - - - - - -:::info -You can create metrics using MetricFlow in the dbt Cloud IDE and run the [dbt sl validate](/docs/build/validation#validations-command) command. Support for running more MetricFlow commands in the IDE will be available soon. 
-::: +In dbt Cloud, run MetricFlow commands directly in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) or in the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation). +For dbt Cloud CLI users, MetricFlow commands are embedded in the dbt Cloud CLI, which means you can immediately run them once you install the dbt Cloud CLI and don't need to install MetricFlow separately. You don't need to manage versioning because your dbt Cloud account will automatically manage the versioning for you. - - -:::tip Use dbt Cloud CLI for semantic layer development - -You can use the dbt Cloud CLI for the experience in defining and querying metrics in your dbt project. - -A benefit to using the dbt Cloud is that you won't need to manage versioning — your dbt Cloud account will automatically manage the versioning. -::: + You can install [MetricFlow](https://github.com/dbt-labs/metricflow#getting-started) from [PyPI](https://pypi.org/project/dbt-metricflow/). You need to use `pip` to install MetricFlow on Windows or Linux operating systems: @@ -54,31 +39,37 @@ You can install [MetricFlow](https://github.com/dbt-labs/metricflow#getting-star **Note**, you'll need to manage versioning between dbt Core, your adapter, and MetricFlow. - +Something to note, MetricFlow `mf` commands return an error if you have a Metafont latex package installed. To run `mf` commands, uninstall the package. + -Something to note, MetricFlow `mf` commands return an error if you have a Metafont latex package installed. To run `mf` commands, uninstall the package. - ## MetricFlow commands MetricFlow provides the following commands to retrieve metadata and query metrics. - + -You can use the `dbt sl` prefix before the command name to execute them in the dbt Cloud CLI. For example, to list all metrics, run `dbt sl list metrics`. For a complete list of the MetricFlow commands and flags, run the `dbt sl --help` command in your terminal. 
+You can use the `dbt sl` prefix before the command name to execute them in the dbt Cloud IDE or dbt Cloud CLI. For example, to list all metrics, run `dbt sl list metrics`. + +dbt Cloud CLI users can run `dbt sl --help` in the terminal for a complete list of the MetricFlow commands and flags. + +The following table lists the commands compatible with the dbt Cloud IDE and dbt Cloud CLI: + +|
Command | Description
| dbt Cloud IDE | dbt Cloud CLI | +|---------|-------------|---------------|---------------| +| [`list`](#list) | Retrieves metadata values. | ✅ | ✅ | +| [`list metrics`](#list-metrics) | Lists metrics with dimensions. | ✅ | ✅ | +| [`list dimensions`](#list) | Lists unique dimensions for metrics. | ✅ | ✅ | +| [`list dimension-values`](#list-dimension-values) | List dimensions with metrics. | ✅ | ✅ | +| [`list entities`](#list-entities) | Lists all unique entities. | ✅ | ✅ | +| [`list saved-queries`](#list-saved-queries) | Lists available saved queries. Use the `--show-exports` flag to display each export listed under a saved query or `--show-parameters` to show the full query parameters each saved query uses. | ✅ | ✅ | +| [`query`](#query) | Query metrics, saved queries, and dimensions you want to see in the command line interface. Refer to [query examples](#query-examples) to help you get started. | ✅ | ✅ | +| [`validate`](#validate) | Validates semantic model configurations. | ✅ | ✅ | +| [`export`](#export) | Runs exports for a singular saved query for testing and generating exports in your development environment. You can also use the `--select` flag to specify particular exports from a saved query. | ❌ | ✅ | +| [`export-all`](#export-all) | Runs exports for multiple saved queries at once, saving time and effort. | ❌ | ✅ | -- [`list`](#list) — Retrieves metadata values. -- [`list metrics`](#list-metrics) — Lists metrics with dimensions. -- [`list dimensions`](#list) — Lists unique dimensions for metrics. -- [`list dimension-values`](#list-dimension-values) — List dimensions with metrics. -- [`list entities`](#list-entities) — Lists all unique entities. -- [`list saved-queries`](#list-saved-queries) — Lists available saved queries. Use the `--show-exports` flag to display each export listed under a saved query. -- [`query`](#query) — Query metrics, saved queries, and dimensions you want to see in the command line interface. 
Refer to [query examples](#query-examples) to help you get started. -- [`export`](#export) — Runs exports for a singular saved query for testing and generating exports in your development environment. You can also use the `--select` flag to specify particular exports from a saved query. -- [`export-all`](#export-all) — Runs exports for multiple saved queries at once, saving time and effort. -- [`validate`](#validate) — Validates semantic model configurations. - ## View query history in Explorer To enhance your discovery, you can view your model query history in various locations within dbt Explorer: -- [View from Performance charts](#view-from-performance-charts) -* [View from Project lineage](#view-from-project-lineage) -- [View from Model list](#view-from-model-list) +- [View from Performance charts](#view-from-performance-charts) +- [View from Project lineage](#view-from-project-lineage) +- [View from Model list](#view-from-model-list) ### View from Performance charts @@ -92,7 +95,7 @@ To enhance your discovery, you can view your model query history in various loca 4. Click on a model for more details and go to the **Performance** tab. 5. On the **Performance** tab, scroll down to the **Model performance** section. 6. Select the **Consumption queries** tab to view the consumption queries over a given time for that model. 
- + ### View from Project lineage diff --git a/website/docs/docs/dbt-cloud-apis/admin-cloud-api.md b/website/docs/docs/dbt-cloud-apis/admin-cloud-api.md index d4ec0a82d5f..fe024e60831 100644 --- a/website/docs/docs/dbt-cloud-apis/admin-cloud-api.md +++ b/website/docs/docs/dbt-cloud-apis/admin-cloud-api.md @@ -24,14 +24,14 @@ link="/dbt-cloud/api-v2-legacy" icon="pencil-paper"/> diff --git a/website/docs/docs/dbt-versions/release-notes.md b/website/docs/docs/dbt-versions/release-notes.md index f3a0c53844e..7c2614b2c10 100644 --- a/website/docs/docs/dbt-versions/release-notes.md +++ b/website/docs/docs/dbt-versions/release-notes.md @@ -20,6 +20,11 @@ Release notes are grouped by month for both multi-tenant and virtual private clo ## September 2024 +- **Enhancement**: You can now run [Semantic Layer commands](/docs/build/metricflow-commands) in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud). The supported commands are `dbt sl list`, `dbt sl list metrics`, `dbt sl list dimension-values`, `dbt sl list saved-queries`, `dbt sl query`, `dbt sl list dimensions`, `dbt sl list entities`, and `dbt sl validate`. +- **New**: Microsoft Excel, a dbt Semantic Layer integration, is now generally available. The integration allows you to connect to Microsoft Excel to query metrics and collaborate with your team. Available for [Excel Desktop](https://pages.store.office.com/addinsinstallpage.aspx?assetid=WA200007100&rs=en-US&correlationId=4132ecd1-425d-982d-efb4-de94ebc83f26) or [Excel Online](https://pages.store.office.com/addinsinstallpage.aspx?assetid=WA200007100&rs=en-US&correlationid=4132ecd1-425d-982d-efb4-de94ebc83f26&isWac=True). For more information, refer to [Microsoft Excel](/docs/cloud-integrations/semantic-layer/excel). +- **New**: [Data health tile](/docs/collaborate/data-tile) is now generally available in dbt Explorer. Data health tiles provide a quick at-a-glance view of your data quality, highlighting potential issues in your data. 
You can embed these tiles in your dashboards to quickly identify and address data quality issues in your dbt project. +- **New**: dbt Explorer's Model query history feature is now in Preview for dbt Cloud Enterprise customers. Model query history allows you to view the count of consumption queries for a model based on the data warehouse's query logs. This feature provides data teams insight, so they can focus their time and infrastructure spend on the data products that are worthwhile and widely used. To learn more, refer to [Model query history](/docs/collaborate/model-query-history). +- **Enhancement**: You can now use [Extended Attributes](/docs/dbt-cloud-environments#extended-attributes) and [Environment Variables](/docs/build/environment-variables) when connecting to the Semantic Layer. If you set a value directly in the Semantic Layer Credentials, it will have a higher priority than Extended Attributes. When using environment variables, the default value for the environment will be used. If you're using exports, job environment variable overrides aren't supported yet, but they will be soon. - **New:** There are two new [environment variable defaults](/docs/build/environment-variables#dbt-cloud-context) — `DBT_CLOUD_ENVIRONMENT_NAME` and `DBT_CLOUD_ENVIRONMENT_TYPE`. - **New:** The [Amazon Athena warehouse connection](/docs/cloud/connect-data-platform/connect-amazon-athena) is available as a public preview for dbt Cloud accounts that have upgraded to [`versionless`](/docs/dbt-versions/versionless-cloud). diff --git a/website/docs/docs/deploy/ci-jobs.md b/website/docs/docs/deploy/ci-jobs.md index 649d64d30c7..420672c9595 100644 --- a/website/docs/docs/deploy/ci-jobs.md +++ b/website/docs/docs/deploy/ci-jobs.md @@ -14,6 +14,7 @@ dbt Labs recommends that you create your CI job in a dedicated dbt Cloud [deploy - You have a dbt Cloud account. 
- CI features: - For both the [concurrent CI checks](/docs/deploy/continuous-integration#concurrent-ci-checks) and [smart cancellation of stale builds](/docs/deploy/continuous-integration#smart-cancellation) features, your dbt Cloud account must be on the [Team or Enterprise plan](https://www.getdbt.com/pricing/). + - The [SQL linting](/docs/deploy/continuous-integration#sql-linting) feature is currently available in beta to a limited group of users and is gradually being rolled out. If you're in the beta, the **Linting** option is available for use. - [Advanced CI](/docs/deploy/advanced-ci) features: - For the [compare changes](/docs/deploy/advanced-ci#compare-changes) feature, your dbt Cloud account must have access to Advanced CI. Please ask your [dbt Cloud administrator to enable](/docs/cloud/account-settings#account-access-to-advanced-ci-features) this for you. - Set up a [connection with your Git provider](/docs/cloud/git/git-configuration-in-dbt-cloud). This integration lets dbt Cloud run jobs on your behalf for job triggering. @@ -35,6 +36,7 @@ To make CI job creation easier, many options on the **CI job** page are set to d 1. Options in the **Execution settings** section: - **Commands** — By default, it includes the `dbt build --select state:modified+` command. This informs dbt Cloud to build only new or changed models and their downstream dependents. Importantly, state comparison can only happen when there is a deferred environment selected to compare state to. Click **Add command** to add more [commands](/docs/deploy/job-commands) that you want to be invoked when this job runs. + - **Linting** — Enable this option for dbt to [lint the SQL files](/docs/deploy/continuous-integration#sql-linting) in your project as the first step in `dbt run`. If this check runs into an error, dbt can either **Fail job run** or **Continue running job**. 
- **Run compare changes** — Enable this option to compare the last applied state of the production environment (if one exists) with the latest changes from the pull request, and identify what those differences are. To enable record-level comparison and primary key analysis, you must add a [primary key constraint](/reference/resource-properties/constraints) or [uniqueness test](/reference/resource-properties/data-tests#unique). Otherwise, you'll receive a "Primary key missing" error message in dbt Cloud. To review the comparison report, navigate to the [Compare tab](/docs/deploy/run-visibility#compare-tab) in the job run's details. A summary of the report is also available from the pull request in your Git provider (see the [CI report example](#example-ci-report)). diff --git a/website/docs/docs/deploy/continuous-integration.md b/website/docs/docs/deploy/continuous-integration.md index 49d050f3146..2119724e609 100644 --- a/website/docs/docs/deploy/continuous-integration.md +++ b/website/docs/docs/deploy/continuous-integration.md @@ -34,6 +34,7 @@ The [dbt Cloud scheduler](/docs/deploy/job-scheduler) executes CI jobs different - **Concurrent CI checks** — CI runs triggered by the same dbt Cloud CI job execute concurrently (in parallel), when appropriate. - **Smart cancellation of stale builds** — Automatically cancels stale, in-flight CI runs when there are new commits to the PR. - **Run slot treatment** — CI runs don't consume a run slot. +- **SQL linting** — When enabled, automatically lints all SQL files in your project as a run step before your CI job builds. ### Concurrent CI checks @@ -54,3 +55,11 @@ When you push a new commit to a PR, dbt Cloud enqueues a new CI run for the late ### Run slot treatment CI runs don't consume run slots. This guarantees a CI check will never block a production run. 
+ +### SQL linting + +When enabled for your CI job, dbt invokes [SQLFluff](https://sqlfluff.com/), a modular and configurable SQL linter that warns you of complex functions, syntax, formatting, and compilation errors. By default, it lints all the SQL files in your project. + +If the linter runs into errors, you can specify whether dbt should fail the job or continue running it. Failing the job helps reduce compute costs by avoiding builds for pull requests that don't pass your SQL code quality CI check. + +To override the default linting behavior, create an `.sqlfluff` config file in your project and add your linting rules to it. dbt Cloud will use the rules defined in the config file when linting. For details about linting rules, refer to [Custom Usage](https://docs.sqlfluff.com/en/stable/gettingstarted.html#custom-usage) in the SQLFluff documentation. diff --git a/website/docs/docs/use-dbt-semantic-layer/exports.md b/website/docs/docs/use-dbt-semantic-layer/exports.md index 65ef1c6e3db..5d6e4c0d996 100644 --- a/website/docs/docs/use-dbt-semantic-layer/exports.md +++ b/website/docs/docs/use-dbt-semantic-layer/exports.md @@ -54,7 +54,7 @@ Before you're able to run exports in development or production, you'll need to m There are two ways to run an export: -- [Run exports in development](#exports-in-development) using the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) to test the output before production (You can configure exports in the dbt Cloud IDE, however running them directly in the IDE isn't supported yet). +- [Run exports in development](#exports-in-development) using the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) to test the output before production (You can configure exports in the dbt Cloud IDE, however running them directly in the IDE isn't supported yet). If you're using the dbt Cloud IDE, use `dbt build` to run exports instead. Make sure you have the [environment variable](#set-environment-variable) enabled. 
- [Run exports in production](#exports-in-production) using the [dbt Cloud job scheduler](/docs/deploy/job-scheduler) to write these queries within your data platform. ## Exports in development diff --git a/website/docs/guides/sl-snowflake-qs.md b/website/docs/guides/sl-snowflake-qs.md index fb72ee0057e..cf9bca98147 100644 --- a/website/docs/guides/sl-snowflake-qs.md +++ b/website/docs/guides/sl-snowflake-qs.md @@ -339,9 +339,7 @@ If you used Partner Connect, you can skip to [initializing your dbt project](#in ## Initialize your dbt project and start developing -This guide assumes you use the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) to develop your dbt project and define metrics. However, the dbt Cloud IDE doesn't support using [MetricFlow commands](/docs/build/metricflow-commands) to query or preview metrics (support coming soon). - -To query and preview metrics in your development tool, you can use the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) to run the [MetricFlow commands](/docs/build/metricflow-commands). +This guide assumes you use the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) to develop your dbt project, define metrics, and query and preview metrics using [MetricFlow commands](/docs/build/metricflow-commands). Now that you have a repository configured, you can initialize your project and start development in dbt Cloud using the IDE: diff --git a/website/docs/reference/dbt-jinja-functions/set.md b/website/docs/reference/dbt-jinja-functions/set.md index d85e0539924..fa4de60e968 100644 --- a/website/docs/reference/dbt-jinja-functions/set.md +++ b/website/docs/reference/dbt-jinja-functions/set.md @@ -27,6 +27,10 @@ __Args__: {% do log(my_set) %} {# None #} ``` +``` +{% set email_id = "'admin@example.com'" %} +``` + ### set_strict The `set_strict` context method can be used to convert any iterable to a sequence of iterable elements that are unique (a set). 
The difference from the `set` context method is that the `set_strict` method will raise a `TypeError` exception if the provided value is not a valid iterable and cannot be converted to a set. diff --git a/website/docs/reference/project-configs/require-dbt-version.md b/website/docs/reference/project-configs/require-dbt-version.md index 42dc49c4546..97b42e036ec 100644 --- a/website/docs/reference/project-configs/require-dbt-version.md +++ b/website/docs/reference/project-configs/require-dbt-version.md @@ -93,7 +93,7 @@ In the following example, the project will only run with dbt v1.5: ```yml -require-dbt-version: 1.5 +require-dbt-version: "1.5.0" ``` diff --git a/website/docs/reference/resource-configs/target_schema.md b/website/docs/reference/resource-configs/target_schema.md index 893686a7513..ffa95df9be7 100644 --- a/website/docs/reference/resource-configs/target_schema.md +++ b/website/docs/reference/resource-configs/target_schema.md @@ -4,9 +4,9 @@ description: "Target_schema - Read this in-depth guide to learn about configurat datatype: string --- -:::note +:::info -For [versionless](/docs/dbt-versions/core-upgrade/upgrading-to-v1.8#versionless) dbt Cloud accounts and dbt Core v1.9+, this functionality is no longer required. Use the [schema](/reference/resource-configs/schema) config as an alternative to define a custom schema while still respecting the `generate_schema_name` macro. +For [versionless](/docs/dbt-versions/core-upgrade/upgrading-to-v1.8#versionless) dbt Cloud accounts and dbt Core v1.9+, this configuration is no longer required. Use the [schema](/reference/resource-configs/schema) config as an alternative to define a custom schema while still respecting the `generate_schema_name` macro. ::: @@ -33,12 +33,14 @@ snapshots: ## Description -The schema that dbt should build a [snapshot](/docs/build/snapshots) into. Snapshots build into the same `target_schema`, no matter who is running them. 
+The schema that dbt should build a [snapshot](/docs/build/snapshots) into. When `target_schema` is provided, snapshots build into the same `target_schema`, no matter who is running them. On **BigQuery**, this is analogous to a `dataset`. ## Default -This is a **required** parameter, no default is provided. + +This is a required parameter, no default is provided. +For versionless dbt Cloud accounts and dbt Core v1.9+, this is not a required parameter. ## Examples ### Build all snapshots in a schema named `snapshots` @@ -53,38 +55,10 @@ snapshots: -### Use a target-aware schema -Use the [`{{ target }}` variable](/reference/dbt-jinja-functions/target) to change which schema a snapshot is built in. - -Note: consider whether this use-case is right for you, as downstream `refs` will select from the `dev` version of a snapshot, which can make it hard to validate models that depend on snapshots (see above [FAQ](#faqs)) - - - -```yml -snapshots: - +target_schema: "{% if target.name == 'prod' %}snapshots{% else %}{{ target.schema }}{% endif %}" - -``` - - + ### Use the same schema-naming behavior as models -Leverage the [`generate_schema_name` macro](/docs/build/custom-schemas) to build snapshots in schemas that follow the same naming behavior as your models. - -Notes: -* This macro is not available when configuring from the `dbt_project.yml` file, so must be configured in a snapshot config block. -* Consider whether this use-case is right for you, as downstream `refs` will select from the `dev` version of a snapshot, which can make it hard to validate models that depend on snapshots (see above [FAQ](#faqs)) - +For native support of environment-aware snapshots, upgrade to dbt Core version 1.9+ and remove any existing `target_schema` configuration. 
- - -```sql -{{ - config( - target_schema=generate_schema_name('snapshots') - ) -}} -``` - - + \ No newline at end of file diff --git a/website/docs/reference/resource-configs/updated_at.md b/website/docs/reference/resource-configs/updated_at.md index 896405bf063..69abb463842 100644 --- a/website/docs/reference/resource-configs/updated_at.md +++ b/website/docs/reference/resource-configs/updated_at.md @@ -27,6 +27,12 @@ snapshots: +:::caution + +You will get a warning if the data type of the `updated_at` column does not match the adapter-configured default. + +::: + ## Description A column within the results of your snapshot query that represents when the record row was last updated. diff --git a/website/docs/reference/resource-properties/unit-testing-versions.md b/website/docs/reference/resource-properties/unit-testing-versions.md index 4d28e19e71d..39ef241c122 100644 --- a/website/docs/reference/resource-properties/unit-testing-versions.md +++ b/website/docs/reference/resource-properties/unit-testing-versions.md @@ -27,7 +27,7 @@ unit_tests: - name: test_is_valid_email_address model: my_model versions: - include: + exclude: - 1 ... 
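The `unit-testing-versions.md` hunk above flips `include` to `exclude`. For context, here is a sketch of how the full property reads after that change; the test and model names come from the hunk itself, and the inline comment is an assumption about intent rather than text from this PR:

```yml
unit_tests:
  - name: test_is_valid_email_address
    model: my_model
    versions:
      # Assumed intent: run the unit test against every version of my_model except v1
      exclude:
        - 1
```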
diff --git a/website/sidebars.js b/website/sidebars.js index 3ecff4567ce..5dbf3caf036 100644 --- a/website/sidebars.js +++ b/website/sidebars.js @@ -648,12 +648,12 @@ const sidebarSettings = { }, { type: "link", - label: "API v2 (beta docs)", + label: "API v2", href: "/dbt-cloud/api-v2", }, { type: "link", - label: "API v3 (beta docs)", + label: "API v3", href: "/dbt-cloud/api-v3", }, ], diff --git a/website/snippets/_new-sl-setup.md b/website/snippets/_new-sl-setup.md index b9c64bc36f6..9079de6b80f 100644 --- a/website/snippets/_new-sl-setup.md +++ b/website/snippets/_new-sl-setup.md @@ -1,3 +1,5 @@ +import SLEnvVars from '/snippets/_sl-env-vars.md'; + You must be part of the Owner group and have the correct [license](/docs/cloud/manage-access/seats-and-users) and [permissions](/docs/cloud/manage-access/self-service-permissions) to set up the Semantic Layer at the environment and project level. - Enterprise plan: - Developer license with Account Admin permissions, or @@ -37,7 +39,7 @@ This credential controls the physical access to underlying data accessed by the 2. Click the **Add Semantic Layer credential** button. 3. In the **1. Add credentials** section, enter the credentials specific to your data platform that you want the Semantic Layer to use. - Use credentials with minimal privileges. The Semantic Layer requires read access to the schema(s) containing the dbt models used in your semantic models for downstream applications - - Note, environment variables such as `{{env_var('DBT_WAREHOUSE') }}`, aren't supported in the dbt Semantic Layer yet. You must use the actual credentials. + - @@ -68,7 +70,7 @@ We recommend configuring credentials and service tokens to reflect your teams an Note that: - Admins can link multiple service tokens to a single credential within a project, but each service token can only be linked to one credential per project. 
- When you send a request through the APIs, the service token of the linked credential will follow access policies of the underlying view and tables used to build your semantic layer requests. -- [Environment variables](/docs/build/environment-variables), like `{{env_var('DBT_WAREHOUSE') }}` aren't supported in the dbt Semantic Layer yet. You must use the actual credentials instead. +- To add multiple credentials and map them to service tokens: diff --git a/website/snippets/_packages_or_dependencies.md b/website/snippets/_packages_or_dependencies.md index 8d21768b0bf..3cd0361a099 100644 --- a/website/snippets/_packages_or_dependencies.md +++ b/website/snippets/_packages_or_dependencies.md @@ -7,7 +7,7 @@ Starting from dbt v1.6, we added a new configuration file called `dependencies.y If your dbt project doesn't require the use of Jinja within the package specifications, you can simply rename your existing `packages.yml` to `dependencies.yml`. However, something to note is if your project's package specifications use Jinja, particularly for scenarios like adding an environment variable or a [Git token method](/docs/build/packages#git-token-method) in a private Git package specification, you should continue using the `packages.yml` file name. -Examine the following tabs to understand the differences and determine when should use to `dependencies.yml` or `packages.yml`. +Examine the following tabs to understand the differences and determine when to use `dependencies.yml` or `packages.yml` (or both at the same time). diff --git a/website/snippets/_sl-env-vars.md b/website/snippets/_sl-env-vars.md new file mode 100644 index 00000000000..eddb3952782 --- /dev/null +++ b/website/snippets/_sl-env-vars.md @@ -0,0 +1,5 @@ +Use [Extended Attributes](/docs/dbt-cloud-environments#extended-attributes) and [Environment Variables](/docs/build/environment-variables) when connecting to the Semantic Layer. 
If you set a value directly in the Semantic Layer Credentials, it will have a higher priority than Extended Attributes. When using environment variables, the default value for the environment will be used. + +For example, set the warehouse by using `{{env_var('DBT_WAREHOUSE')}}` in your Semantic Layer credentials. + +Similarly, if you set the account value using `{{env_var('DBT_ACCOUNT')}}` in Extended Attributes, dbt will check both the Extended Attributes and the environment variable. diff --git a/website/snippets/_sl-partner-links.md b/website/snippets/_sl-partner-links.md index 71daaaa1d0a..28e4dc24b39 100644 --- a/website/snippets/_sl-partner-links.md +++ b/website/snippets/_sl-partner-links.md @@ -17,7 +17,7 @@ The following tools integrate with the dbt Semantic Layer: icon="google-sheets-logo-icon"/> diff --git a/website/snippets/_sl-test-and-query-metrics.md b/website/snippets/_sl-test-and-query-metrics.md index 0b637550cbb..9d996554b31 100644 --- a/website/snippets/_sl-test-and-query-metrics.md +++ b/website/snippets/_sl-test-and-query-metrics.md @@ -1,6 +1,6 @@ To work with metrics in dbt, you have several tools to validate or run commands. Here's how you can test and query metrics depending on your setup: -- [**dbt Cloud IDE users**](#dbt-cloud-ide-users) — Currently, running MetricFlow commands directly in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) isn't supported, but is coming soon. You can view metrics visually through the DAG in the **Lineage** tab without directly running commands. +- [**dbt Cloud IDE users**](#dbt-cloud-ide-users) — Run [MetricFlow commands](/docs/build/metricflow-commands#metricflow-commands) directly in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) to query/preview metrics. View metrics visually in the **Lineage** tab. 
 - [**dbt Cloud CLI users**](#dbt-cloud-cli-users) — The [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) enables you to run [MetricFlow commands](/docs/build/metricflow-commands#metricflow-commands) to query and preview metrics directly in your command line interface.
 - **dbt Core users** — Use the MetricFlow CLI for command execution. While this guide focuses on dbt Cloud users, dbt Core users can find detailed MetricFlow CLI setup instructions in the [MetricFlow commands](/docs/build/metricflow-commands#metricflow-commands) page. Note that to use the dbt Semantic Layer, you need to have a [Team or Enterprise account](https://www.getdbt.com/).
@@ -8,7 +8,9 @@ Alternatively, you can run commands with SQL client tools like DataGrip, DBeaver
 
 ### dbt Cloud IDE users
 
-You can view your metrics in the dbt Cloud IDE by viewing them in the **Lineage** tab. The dbt Cloud IDE **Status button** (located in the bottom right of the editor) displays an **Error** status if there's an error in your metric or semantic model definition. You can click the button to see the specific issue and resolve it.
+You can use the `dbt sl` prefix before the command name to execute them in dbt Cloud. For example, to list all metrics, run `dbt sl list metrics`. For a complete list of the MetricFlow commands available in the dbt Cloud IDE, refer to the [MetricFlow commands](/docs/build/metricflow-commands#metricflow-commands) page.
+
+The dbt Cloud IDE **Status button** (located in the bottom right of the editor) displays an **Error** status if there's an error in your metric or semantic model definition. You can click the button to see the specific issue and resolve it. Once viewed, make sure you commit and merge your changes in your project.
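Reviewer note: the `dbt sl` workflow the updated snippet describes can be sketched as a short session. The metric name `revenue` and the `metric_time` grouping are illustrative placeholders, not names from this PR:

```shell
# List every metric defined in the project's semantic models
dbt sl list metrics

# Preview a metric directly from the command line
# (metric and dimension names are placeholders)
dbt sl query --metrics revenue --group-by metric_time
```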
diff --git a/website/static/img/docs/collaborate/dbt-explorer/enable-query-history-success.jpg b/website/static/img/docs/collaborate/dbt-explorer/enable-query-history-success.jpg
new file mode 100644
index 00000000000..5b25372ab59
Binary files /dev/null and b/website/static/img/docs/collaborate/dbt-explorer/enable-query-history-success.jpg differ
diff --git a/website/static/img/docs/collaborate/dbt-explorer/enable-query-history.jpg b/website/static/img/docs/collaborate/dbt-explorer/enable-query-history.jpg
new file mode 100644
index 00000000000..80df94bc860
Binary files /dev/null and b/website/static/img/docs/collaborate/dbt-explorer/enable-query-history.jpg differ
diff --git a/website/static/img/docs/collaborate/dbt-explorer/model-consumption-lenses.jpg b/website/static/img/docs/collaborate/dbt-explorer/model-consumption-lenses.jpg
index 9bf6c7ca0e3..a406bc25a7c 100644
Binary files a/website/static/img/docs/collaborate/dbt-explorer/model-consumption-lenses.jpg and b/website/static/img/docs/collaborate/dbt-explorer/model-consumption-lenses.jpg differ
diff --git a/website/static/img/docs/collaborate/dbt-explorer/model-consumption-list.jpg b/website/static/img/docs/collaborate/dbt-explorer/model-consumption-list.jpg
index 653fe7a2f43..e2583a01c8e 100644
Binary files a/website/static/img/docs/collaborate/dbt-explorer/model-consumption-list.jpg and b/website/static/img/docs/collaborate/dbt-explorer/model-consumption-list.jpg differ
diff --git a/website/static/img/docs/collaborate/dbt-explorer/model-consumption-queries.jpg b/website/static/img/docs/collaborate/dbt-explorer/model-consumption-queries.jpg
new file mode 100644
index 00000000000..6622fd3993f
Binary files /dev/null and b/website/static/img/docs/collaborate/dbt-explorer/model-consumption-queries.jpg differ
diff --git a/website/static/img/docs/collaborate/dbt-explorer/most-consumed-models.jpg b/website/static/img/docs/collaborate/dbt-explorer/most-consumed-models.jpg
index 9e14db15f90..f95461108d7 100644
Binary files a/website/static/img/docs/collaborate/dbt-explorer/most-consumed-models.jpg and b/website/static/img/docs/collaborate/dbt-explorer/most-consumed-models.jpg differ
diff --git a/website/static/img/docs/dbt-cloud/using-dbt-cloud/create-ci-job.png b/website/static/img/docs/dbt-cloud/using-dbt-cloud/create-ci-job.png
index 23c18953bf1..4455d52f1a8 100644
Binary files a/website/static/img/docs/dbt-cloud/using-dbt-cloud/create-ci-job.png and b/website/static/img/docs/dbt-cloud/using-dbt-cloud/create-ci-job.png differ
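Reviewer note: the precedence rules in the new `_sl-env-vars.md` snippet can be illustrated with a small connection fragment. The field names below follow a typical Snowflake connection and are illustrative only, assuming `DBT_ACCOUNT` and `DBT_WAREHOUSE` are set as environment variables:

```yaml
# Illustrative extended attributes / credentials fragment.
# dbt resolves each env_var at runtime using the environment's default value;
# a value set directly in the Semantic Layer credentials takes priority.
account: "{{ env_var('DBT_ACCOUNT') }}"
warehouse: "{{ env_var('DBT_WAREHOUSE') }}"
```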