From 130e00a3a2d915c7503760f1dc04398da5738e97 Mon Sep 17 00:00:00 2001
From: Mirna Wong <89008547+mirnawong1@users.noreply.github.com>
Date: Mon, 25 Nov 2024 17:10:15 +0000
Subject: [PATCH 01/19] Update 06-upgrading-to-v1.9.md
update upgrade guide!
---
.../dbt-versions/core-upgrade/06-upgrading-to-v1.9.md | 8 +++++---
1 file changed, 5 insertions(+), 3 deletions(-)
diff --git a/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.9.md b/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.9.md
index 7ac5a743995..2f027ac3c45 100644
--- a/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.9.md
+++ b/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.9.md
@@ -29,7 +29,8 @@ Features and functionality new in dbt v1.9.
### Microbatch `incremental_strategy`
:::info
-While microbatch is in "beta", this functionality is still gated behind an env var, which will change to a behavior flag when 1.9 is GA. To use microbatch, set `DBT_EXPERIMENTAL_MICROBATCH` to `true` wherever you're running dbt Core.
+
+If you use a custom microbatch macro, set the [`require_batched_execution_for_custom_microbatch_strategy`](/reference/global-configs/behavior-changes#custom-microbatch-strategy) behavior flag in your `dbt_project.yml` to enable batched execution. If you don't have a custom microbatch macro, you don't need to set this flag as dbt will handle microbatching automatically for any model using the microbatch strategy.
:::
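For reference, behavior change flags like the one described in the note above live under the `flags:` key of `dbt_project.yml`; a minimal sketch (assuming the flag name from the note, and that you have a custom microbatch macro):

```yaml
# dbt_project.yml — only needed if you have a custom microbatch macro
flags:
  require_batched_execution_for_custom_microbatch_strategy: true
```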
Incremental models are, and have always been, a *performance optimization* — for datasets that are too large to be dropped and recreated from scratch every time you do a `dbt run`. Learn more about [incremental models](/docs/build/incremental-models-overview).
@@ -83,6 +84,7 @@ You can read more about each of these behavior changes in the following links:
- (Introduced, disabled by default) [`skip_nodes_if_on_run_start_fails` project config flag](/reference/global-configs/behavior-changes#behavior-change-flags). If the flag is set and **any** `on-run-start` hook fails, mark all selected nodes as skipped.
- `on-run-start/end` hooks are **always** run, regardless of whether they passed or failed last time.
- (Introduced, disabled by default) [[Redshift] `restrict_direct_pg_catalog_access`](/reference/global-configs/behavior-changes#redshift-restrict_direct_pg_catalog_access). If the flag is set, the adapter will use the Redshift API (through the Python client) if available, or query Redshift's `information_schema` tables instead of using `pg_` tables.
+- (Introduced, disabled by default) [`require_batched_execution_for_custom_microbatch_strategy`](/reference/global-configs/behavior-changes#custom-microbatch-strategy). Set to `True` in your `dbt_project.yml` if you use a custom microbatch macro to enable batched execution. If you don't have a custom microbatch macro, you don't need to set this flag as dbt will handle microbatching automatically for any model using the microbatch strategy.
## Adapter specific features and functionalities
@@ -92,7 +94,7 @@ You can read more about each of these behavior changes in the following links:
### Snowflake
-- Iceberg Table Format support will be available on three out of the box materializations: table, incremental, dynamic tables.
+- Iceberg Table Format support will be available on three out-of-the-box materializations: table, incremental, and dynamic tables.
### BigQuery
@@ -107,7 +109,7 @@ You can read more about each of these behavior changes in the following links:
We also made some quality-of-life improvements in Core 1.9, enabling you to:
-- Maintain data quality now that dbt returns an an error (versioned models) or warning (unversioned models) when someone [removes a contracted model by deleting, renaming, or disabling](/docs/collaborate/govern/model-contracts#how-are-breaking-changes-handled) it.
+- Maintain data quality now that dbt returns an error (versioned models) or warning (unversioned models) when someone [removes a contracted model by deleting, renaming, or disabling](/docs/collaborate/govern/model-contracts#how-are-breaking-changes-handled) it.
- Document [data tests](/reference/resource-properties/description).
- Use `ref` and `source` in [foreign key constraints](/reference/resource-properties/constraints).
- Use `dbt test` with the `--resource-type` / `--exclude-resource-type` flag, making it possible to include or exclude data tests (`test`) or unit tests (`unit_test`).
From d687274ba77dc9474a5b71ffda195aea4750f99b Mon Sep 17 00:00:00 2001
From: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
Date: Mon, 25 Nov 2024 12:45:00 -0500
Subject: [PATCH 02/19] Core v1.9 RC updates
---
website/dbt-versions.js | 1 -
.../docs/dbt-versions/core-upgrade/06-upgrading-to-v1.9.md | 2 +-
website/snippets/core-versions-table.md | 3 ++-
3 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/website/dbt-versions.js b/website/dbt-versions.js
index 825af8ac6ee..f84184a486c 100644
--- a/website/dbt-versions.js
+++ b/website/dbt-versions.js
@@ -20,7 +20,6 @@ exports.versions = [
},
{
version: "1.9",
- isPrerelease: true,
},
{
version: "1.8",
diff --git a/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.9.md b/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.9.md
index 7ac5a743995..64e7c2dc05e 100644
--- a/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.9.md
+++ b/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.9.md
@@ -1,5 +1,5 @@
---
-title: "Upgrading to v1.9 (beta)"
+title: "Upgrading to v1.9"
id: upgrading-to-v1.9
description: New features and changes in dbt Core v1.9
displayed_sidebar: "docs"
diff --git a/website/snippets/core-versions-table.md b/website/snippets/core-versions-table.md
index 743b59c6bb7..c1fa718e83e 100644
--- a/website/snippets/core-versions-table.md
+++ b/website/snippets/core-versions-table.md
@@ -2,7 +2,8 @@
| dbt Core | Initial release | Support level and end date |
|:-------------------------------------------------------------:|:---------------:|:-------------------------------------:|
-| [**v1.8**](/docs/dbt-versions/core-upgrade/upgrading-to-v1.8) | May 9 2024 | Active Support — May 8, 2025 |
+| [**v1.9**](/docs/dbt-versions/core-upgrade/upgrading-to-v1.9) | Release candidate | TBA |
+| [**v1.8**](/docs/dbt-versions/core-upgrade/upgrading-to-v1.8) | May 9, 2024 | Active Support — May 8, 2025 |
| [**v1.7**](/docs/dbt-versions/core-upgrade/upgrading-to-v1.7) | Nov 2, 2023 |
**dbt Core and dbt Cloud Developer & Team customers:** End of Life
**dbt Cloud Enterprise customers:** Critical Support until further notice 1
|
| [**v1.6**](/docs/dbt-versions/core-upgrade/upgrading-to-v1.6) | Jul 31, 2023 | End of Life ⚠️ |
| [**v1.5**](/docs/dbt-versions/core-upgrade/upgrading-to-v1.5) | Apr 27, 2023 | End of Life ⚠️ |
From cb12b636f5506b0166c7a9563dfeeff2f58535ee Mon Sep 17 00:00:00 2001
From: Aeriel Soriano <27873562+aerielsoriano@users.noreply.github.com>
Date: Tue, 26 Nov 2024 04:28:02 +0000
Subject: [PATCH 03/19] Update metricflow-time-spine.md (#6539)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Updated the WHERE clause for the BigQuery Daily code block. It throws an
error when using the original suggestion.
---
website/docs/docs/build/metricflow-time-spine.md | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/website/docs/docs/build/metricflow-time-spine.md b/website/docs/docs/build/metricflow-time-spine.md
index 5f16af38023..48e46caeec2 100644
--- a/website/docs/docs/build/metricflow-time-spine.md
+++ b/website/docs/docs/build/metricflow-time-spine.md
@@ -179,8 +179,8 @@ final as (
select *
from final
-- filter the time spine to a specific range
-where date_day > dateadd(year, -4, current_timestamp())
-and date_day < dateadd(day, 30, current_timestamp())
+where date_day > date_add(DATE(current_timestamp()), INTERVAL -4 YEAR)
+and date_day < date_add(DATE(current_timestamp()), INTERVAL 30 DAY)
```
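For context on the fix above: Snowflake's `dateadd(part, n, expr)` takes the date part first, while BigQuery's `date_add(date_expression, INTERVAL n part)` takes the date expression first with an `INTERVAL` literal, which is why the original clause errors on BigQuery. A standalone sketch of the corrected filter (the `final` CTE and `date_day` column come from the time-spine model above):

```sql
-- BigQuery form: DATE_ADD(date_expression, INTERVAL int64 date_part)
select date_day
from final
-- filter the time spine to a specific range
where date_day > date_add(date(current_timestamp()), interval -4 year)
  and date_day < date_add(date(current_timestamp()), interval 30 day)
```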
From 127cc4a7e4d592f33379b2cf95a50f7ddc0c396d Mon Sep 17 00:00:00 2001
From: Joel Labes
Date: Tue, 26 Nov 2024 18:00:13 +1300
Subject: [PATCH 04/19] Update python-models.md to reflect changed default
behaviour
---
website/docs/docs/build/python-models.md | 13 +------------
1 file changed, 1 insertion(+), 12 deletions(-)
diff --git a/website/docs/docs/build/python-models.md b/website/docs/docs/build/python-models.md
index 2267da192a9..8e4b1d46451 100644
--- a/website/docs/docs/build/python-models.md
+++ b/website/docs/docs/build/python-models.md
@@ -673,18 +673,7 @@ def model(dbt, session: snowpark.Session):
-**About "sprocs":** dbt submits Python models to run as _stored procedures_, which some people call _sprocs_ for short. By default, dbt will create a named sproc containing your model's compiled Python code, and then _call_ it to execute. Snowpark has an Open Preview feature for _temporary_ or _anonymous_ stored procedures ([docs](https://docs.snowflake.com/en/sql-reference/sql/call-with.html)), which are faster and leave a cleaner query history. You can switch this feature on for your models by configuring `use_anonymous_sproc: True`. We plan to switch this on for all dbt + Snowpark Python models starting with the release of dbt Core version 1.4.
-
-
-
-```yml
-# I asked Snowflake Support to enable this Private Preview feature,
-# and now my dbt-py models run even faster!
-models:
- use_anonymous_sproc: True
-```
-
-
+**About "sprocs":** dbt submits Python models to run as _stored procedures_, which some people call _sprocs_ for short. By default, dbt will use Snowpark's _temporary_ or _anonymous_ stored procedures ([docs](https://docs.snowflake.com/en/sql-reference/sql/call-with.html)), which are faster and leave a cleaner query history than creating and then calling a named sproc containing your model's compiled Python code. You can switch this feature off for your models by configuring `use_anonymous_sproc: False`.
**Docs:** ["Developer Guide: Snowpark Python"](https://docs.snowflake.com/en/developer-guide/snowpark/python/index.html)
From f928147a1e565990aa299e3fd79ea5641cc3ed00 Mon Sep 17 00:00:00 2001
From: Joel Labes
Date: Tue, 26 Nov 2024 18:02:24 +1300
Subject: [PATCH 05/19] remove extra space
---
website/docs/docs/build/python-models.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/website/docs/docs/build/python-models.md b/website/docs/docs/build/python-models.md
index 8e4b1d46451..131bf48cdda 100644
--- a/website/docs/docs/build/python-models.md
+++ b/website/docs/docs/build/python-models.md
@@ -673,7 +673,7 @@ def model(dbt, session: snowpark.Session):
-**About "sprocs":** dbt submits Python models to run as _stored procedures_, which some people call _sprocs_ for short. By default, dbt will use Snowpark's _temporary_ or _anonymous_ stored procedures ([docs](https://docs.snowflake.com/en/sql-reference/sql/call-with.html)), which are faster and leave a cleaner query history than creating and then calling a named sproc containing your model's compiled Python code. You can switch this feature off for your models by configuring `use_anonymous_sproc: False`.
+**About "sprocs":** dbt submits Python models to run as _stored procedures_, which some people call _sprocs_ for short. By default, dbt will use Snowpark's _temporary_ or _anonymous_ stored procedures ([docs](https://docs.snowflake.com/en/sql-reference/sql/call-with.html)), which are faster and leave a cleaner query history than creating and then calling a named sproc containing your model's compiled Python code. You can switch this feature off for your models by configuring `use_anonymous_sproc: False`.
**Docs:** ["Developer Guide: Snowpark Python"](https://docs.snowflake.com/en/developer-guide/snowpark/python/index.html)
From 41cd75e6ad95c22bca2a596b93ee3bf155c5e5d4 Mon Sep 17 00:00:00 2001
From: Mirna Wong <89008547+mirnawong1@users.noreply.github.com>
Date: Tue, 26 Nov 2024 09:57:46 +0000
Subject: [PATCH 06/19] Update website/docs/docs/build/python-models.md
---
website/docs/docs/build/python-models.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/website/docs/docs/build/python-models.md b/website/docs/docs/build/python-models.md
index 131bf48cdda..c3222fb76b8 100644
--- a/website/docs/docs/build/python-models.md
+++ b/website/docs/docs/build/python-models.md
@@ -673,7 +673,7 @@ def model(dbt, session: snowpark.Session):
-**About "sprocs":** dbt submits Python models to run as _stored procedures_, which some people call _sprocs_ for short. By default, dbt will use Snowpark's _temporary_ or _anonymous_ stored procedures ([docs](https://docs.snowflake.com/en/sql-reference/sql/call-with.html)), which are faster and leave a cleaner query history than creating and then calling a named sproc containing your model's compiled Python code. You can switch this feature off for your models by configuring `use_anonymous_sproc: False`.
+**About "sprocs":** dbt submits Python models to run as _stored procedures_, which some people call _sprocs_ for short. By default, dbt will use Snowpark's _temporary_ or _anonymous_ stored procedures ([docs](https://docs.snowflake.com/en/sql-reference/sql/call-with.html)), which are faster and keep query history cleaner than named sprocs containing your model's compiled Python code. To disable this feature, set `use_anonymous_sproc: False` in your model configuration.
**Docs:** ["Developer Guide: Snowpark Python"](https://docs.snowflake.com/en/developer-guide/snowpark/python/index.html)
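Since the patch above removed the old YAML example when the default flipped, here is a minimal sketch of opting back out in `dbt_project.yml` (the project name `my_project` is illustrative; the config key is the one named in the paragraph above):

```yaml
# dbt_project.yml — revert to named sprocs for all Snowpark Python models
models:
  my_project:
    +use_anonymous_sproc: False
```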
From 0c43c6d729c6787bb4f0688b932b72fce9bc5e83 Mon Sep 17 00:00:00 2001
From: mirnawong1
Date: Tue, 26 Nov 2024 10:16:03 +0000
Subject: [PATCH 07/19] add entry
---
website/docs/docs/dbt-versions/release-notes.md | 1 +
1 file changed, 1 insertion(+)
diff --git a/website/docs/docs/dbt-versions/release-notes.md b/website/docs/docs/dbt-versions/release-notes.md
index 536c45ea045..55116db68ba 100644
--- a/website/docs/docs/dbt-versions/release-notes.md
+++ b/website/docs/docs/dbt-versions/release-notes.md
@@ -19,6 +19,7 @@ Release notes are grouped by month for both multi-tenant and virtual private clo
\* The official release date for this new format of release notes is May 15th, 2024. Historical release notes for prior dates may not reflect all available features released earlier this year or their tenancy availability.
## November 2024
+- **Fix**: Job environment variable overrides in credentials are now respected for Exports. Previously, they were ignored.
- **Behavior change**: If you use a custom microbatch macro, set a [`require_batched_execution_for_custom_microbatch_strategy` behavior flag](/reference/global-configs/behavior-changes#custom-microbatch-strategy) in your `dbt_project.yml` to enable batched execution. If you don't have a custom microbatch macro, you don't need to set this flag as dbt will handle microbatching automatically for any model using the [microbatch strategy](/docs/build/incremental-microbatch#how-microbatch-compares-to-other-incremental-strategies).
- **Enhancement**: For users that have Advanced CI's [compare changes](/docs/deploy/advanced-ci#compare-changes) feature enabled, you can optimize performance when running comparisons by using custom dbt syntax to customize deferral usage, exclude specific large models (or groups of models with tags), and more. Refer to [Compare changes custom commands](/docs/deploy/job-commands#compare-changes-custom-commands) for examples of how to customize the comparison command.
- **New**: SQL linting in CI jobs is now generally available in dbt Cloud. You can enable SQL linting in your CI jobs, using [SQLFluff](https://sqlfluff.com/), to automatically lint all SQL files in your project as a run step before your CI job builds. SQLFluff linting is available on [dbt Cloud Versionless](/docs/dbt-versions/versionless-cloud) and to dbt Cloud [Team or Enterprise](https://www.getdbt.com/pricing/) accounts. Refer to [SQL linting](/docs/deploy/continuous-integration#sql-linting) for more information.
From 3f6c668047b91fcf9847a0c701199ec3d10afe23 Mon Sep 17 00:00:00 2001
From: Mirna Wong <89008547+mirnawong1@users.noreply.github.com>
Date: Tue, 26 Nov 2024 10:40:00 +0000
Subject: [PATCH 08/19] Update configure-auto-exposures.md
add clarifying info about connecting to single tableau site. raised in community slack
---
website/docs/docs/cloud-integrations/configure-auto-exposures.md | 1 +
1 file changed, 1 insertion(+)
diff --git a/website/docs/docs/cloud-integrations/configure-auto-exposures.md b/website/docs/docs/cloud-integrations/configure-auto-exposures.md
index 51a776ffe6b..746bef62e44 100644
--- a/website/docs/docs/cloud-integrations/configure-auto-exposures.md
+++ b/website/docs/docs/cloud-integrations/configure-auto-exposures.md
@@ -26,6 +26,7 @@ To access the features, you should meet the following:
4. You have [admin permissions](/docs/cloud/manage-access/enterprise-permissions) in dbt Cloud to edit project settings or production environment settings.
5. Use Tableau as your BI tool and enable metadata permissions or work with an admin to do so. Compatible with Tableau Cloud or Tableau Server with the Metadata API enabled.
- If you're using Tableau Server, you need to [allowlist dbt Cloud's IP addresses](/docs/cloud/about-cloud/access-regions-ip-addresses) for your dbt Cloud region.
+ - Currently, you can only connect to a single Tableau site on the same server.
## Set up in Tableau
From c850706723a3faff80198d352484e553bceaa1e7 Mon Sep 17 00:00:00 2001
From: Mirna Wong <89008547+mirnawong1@users.noreply.github.com>
Date: Tue, 26 Nov 2024 10:41:05 +0000
Subject: [PATCH 09/19] Update auto-exposures.md
---
website/docs/docs/collaborate/auto-exposures.md | 6 +++---
1 file changed, 3 insertions(+), 3 deletions(-)
diff --git a/website/docs/docs/collaborate/auto-exposures.md b/website/docs/docs/collaborate/auto-exposures.md
index 0e393c911cc..a4518a7cba1 100644
--- a/website/docs/docs/collaborate/auto-exposures.md
+++ b/website/docs/docs/collaborate/auto-exposures.md
@@ -9,12 +9,12 @@ image: /img/docs/cloud-integrations/auto-exposures/explorer-lineage.jpg
# Auto-exposures
-As a data team, it’s critical that you have context into the downstream use cases and users of your data products. Auto-exposures integrates natively with Tableau (Power BI coming soon) and auto-generates downstream lineage in dbt Explorer for a richer experience.
+As a data team, it’s critical that you have context into the downstream use cases and users of your data products. Auto-exposures integrate natively with Tableau (Power BI coming soon) and auto-generate downstream lineage in dbt Explorer for a richer experience.
-Auto-exposures helps users understand how their models are used in downstream analytics tools to inform investments and reduce incidents — ultimately building trust and confidence in data products. It imports and auto-generates exposures based on Tableau dashboards, with user-defined curation.
+Auto-exposures help users understand how their models are used in downstream analytics tools to inform investments and reduce incidents — ultimately building trust and confidence in data products. They import and auto-generate exposures based on Tableau dashboards, with user-defined curation.
## Supported plans
-Auto-exposures is available on [Versionless](/docs/dbt-versions/versionless-cloud) and for [dbt Cloud Enterprise](https://www.getdbt.com/pricing/) plans.
+Auto-exposures is available on [Versionless](/docs/dbt-versions/versionless-cloud) and for [dbt Cloud Enterprise](https://www.getdbt.com/pricing/) plans. Currently, you can only connect to a single Tableau site on the same server.
:::info Tableau Server
If you're using Tableau Server, you need to [allowlist dbt Cloud's IP addresses](/docs/cloud/about-cloud/access-regions-ip-addresses) for your dbt Cloud region.
From a93b4978a43e0de1660cf12734820966bd9c209b Mon Sep 17 00:00:00 2001
From: Mirna Wong <89008547+mirnawong1@users.noreply.github.com>
Date: Tue, 26 Nov 2024 11:20:33 +0000
Subject: [PATCH 10/19] Update
website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.9.md
Co-authored-by: Grace Goheen <53586774+graciegoheen@users.noreply.github.com>
---
.../docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.9.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.9.md b/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.9.md
index b9dbfc92d78..8b809877870 100644
--- a/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.9.md
+++ b/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.9.md
@@ -84,7 +84,7 @@ You can read more about each of these behavior changes in the following links:
- (Introduced, disabled by default) [`skip_nodes_if_on_run_start_fails` project config flag](/reference/global-configs/behavior-changes#behavior-change-flags). If the flag is set and **any** `on-run-start` hook fails, mark all selected nodes as skipped.
- `on-run-start/end` hooks are **always** run, regardless of whether they passed or failed last time.
- (Introduced, disabled by default) [[Redshift] `restrict_direct_pg_catalog_access`](/reference/global-configs/behavior-changes#redshift-restrict_direct_pg_catalog_access). If the flag is set the adapter will use the Redshift API (through the Python client) if available, or query Redshift's `information_schema` tables instead of using `pg_` tables.
-- (Introduced, disabled by default) [`require_batched_execution_for_custom_microbatch_strategy`](/reference/global-configs/behavior-changes#custom-microbatch-strategy). Set to `True` in your `dbt_project.yml` if you use a custom microbatch macro to enable batched execution. If you don't have a custom microbatch macro, you don't need to set this flag as dbt will handle microbatching automatically for any model using the microbatch strategy.
+- (Introduced, disabled by default) [`require_batched_execution_for_custom_microbatch_strategy`](/reference/global-configs/behavior-changes#custom-microbatch-strategy). Set to `True` if you use a custom microbatch macro to enable batched execution. If you don't have a custom microbatch macro, you don't need to set this flag as dbt will handle microbatching automatically for any model using the microbatch strategy.
## Adapter specific features and functionalities
From 2dafcfce982b9a478e55478167c4757dd189d5e4 Mon Sep 17 00:00:00 2001
From: Mirna Wong <89008547+mirnawong1@users.noreply.github.com>
Date: Tue, 26 Nov 2024 12:34:51 +0000
Subject: [PATCH 11/19] Update website/docs/docs/collaborate/auto-exposures.md
Co-authored-by: nataliefiann <120089939+nataliefiann@users.noreply.github.com>
---
website/docs/docs/collaborate/auto-exposures.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/website/docs/docs/collaborate/auto-exposures.md b/website/docs/docs/collaborate/auto-exposures.md
index a4518a7cba1..28bf5bd37b1 100644
--- a/website/docs/docs/collaborate/auto-exposures.md
+++ b/website/docs/docs/collaborate/auto-exposures.md
@@ -14,7 +14,7 @@ As a data team, it’s critical that you have context into the downstream use ca
Auto-exposures help users understand how their models are used in downstream analytics tools to inform investments and reduce incidents — ultimately building trust and confidence in data products. They import and auto-generate exposures based on Tableau dashboards, with user-defined curation.
## Supported plans
-Auto-exposures is available on [Versionless](/docs/dbt-versions/versionless-cloud) and for [dbt Cloud Enterprise](https://www.getdbt.com/pricing/) plans. Currently, you can only connect to a single Tableau site on the same server.
+Auto-exposures is available on [Versionless](/docs/dbt-versions/versionless-cloud) and [dbt Cloud Enterprise](https://www.getdbt.com/pricing/) plans. Currently, you can only connect to a single Tableau site on the same server.
:::info Tableau Server
If you're using Tableau Server, you need to [allowlist dbt Cloud's IP addresses](/docs/cloud/about-cloud/access-regions-ip-addresses) for your dbt Cloud region.
From e4a8ef115eabf42dbd2bb6aaf3fa6c9bc41b8e4d Mon Sep 17 00:00:00 2001
From: mirnawong1
Date: Tue, 26 Nov 2024 14:19:52 +0000
Subject: [PATCH 12/19] add full unified
---
website/docs/reference/dbt-classes.md | 9 +++++++--
1 file changed, 7 insertions(+), 2 deletions(-)
diff --git a/website/docs/reference/dbt-classes.md b/website/docs/reference/dbt-classes.md
index 13f9263e545..5c9f72209c1 100644
--- a/website/docs/reference/dbt-classes.md
+++ b/website/docs/reference/dbt-classes.md
@@ -98,9 +98,14 @@ col.numeric_type('numeric', 12, 4) # numeric(12,4)
### Properties
-- **name**: Returns the name of the column
+- **char_size**: Returns the maximum size for character varying columns
+- **column**: Returns the name of the column
+- **data_type**: Returns the data type of the column (with size/precision/scale included)
+- **dtype**: Returns the data type of the column (without any size/precision/scale included)
+- **name**: Returns the name of the column (alias for column)
+- **numeric_precision**: Returns the maximum precision for fixed decimal columns
+- **numeric_scale**: Returns the maximum scale for fixed decimal columns
- **quoted**: Returns the name of the column wrapped in quotes
-- **data_type**: Returns the data type of the column
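To make the relationship between these properties concrete, here is a toy sketch of a `Column`-like object — not dbt's actual implementation, just an illustration of how `name` aliases `column` and how `data_type` combines `dtype` with size/precision/scale:

```python
# Illustrative toy version of a Column object — not dbt's actual class.
class Column:
    def __init__(self, column, dtype, char_size=None,
                 numeric_precision=None, numeric_scale=None):
        self.column = column                      # name of the column
        self.dtype = dtype                        # type without size/precision/scale
        self.char_size = char_size                # max size for character varying columns
        self.numeric_precision = numeric_precision
        self.numeric_scale = numeric_scale

    @property
    def name(self):
        # identical to `column`, provided as an alias
        return self.column

    @property
    def quoted(self):
        # column name wrapped in quotes
        return f'"{self.column}"'

    @property
    def data_type(self):
        # dtype plus size/precision/scale, when applicable
        if self.char_size is not None:
            return f"{self.dtype}({self.char_size})"
        if self.numeric_precision is not None:
            return f"{self.dtype}({self.numeric_precision},{self.numeric_scale})"
        return self.dtype

col = Column("order_total", "numeric", numeric_precision=12, numeric_scale=4)
print(col.name)       # order_total
print(col.data_type)  # numeric(12,4)
```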
### Instance methods
From 2404d28dd631695fb6e717571db68ea6eedd50ef Mon Sep 17 00:00:00 2001
From: mirnawong1
Date: Tue, 26 Nov 2024 14:47:40 +0000
Subject: [PATCH 13/19] add log level file
---
website/docs/reference/global-configs/logs.md | 25 +++++++++++++------
1 file changed, 17 insertions(+), 8 deletions(-)
diff --git a/website/docs/reference/global-configs/logs.md b/website/docs/reference/global-configs/logs.md
index 682b9fc8393..e9363561339 100644
--- a/website/docs/reference/global-configs/logs.md
+++ b/website/docs/reference/global-configs/logs.md
@@ -66,19 +66,28 @@ See [structured logging](/reference/events-logging#structured-logging) for more
The `LOG_LEVEL` config sets the minimum severity of events captured in the console and file logs. This is a more flexible alternative to the `--debug` flag. The available options for the log levels are `debug`, `info`, `warn`, `error`, or `none`.
-Setting the `--log-level` will configure console and file logs.
+- Setting the `--log-level` will configure console and file logs.
+ ```text
+ dbt --log-level debug run
+ ```
-```text
-dbt --log-level debug run
-```
+- Setting the `LOG_LEVEL` to none will disable logs from being sent to either the console or file logs.
+
+ ```text
+  dbt --log-level none run
+ ```
-To set the file log level as a different value than the console, use the `--log-level-file` flag.
+- To set the file log level as a different value than the console, use the `--log-level-file` flag.
+ ```text
+ dbt --log-level-file error run
+ ```
-```text
-dbt --log-level-file error run
-```
+- To disable writing to the log file while keeping console logs, set the `LOG_LEVEL_FILE` config to `none`.
+ ```text
+  dbt --log-level-file none run
+ ```
### Debug-level logging
From 15e15dbf4e286d6a4da25c54a01c3f7d84aaba40 Mon Sep 17 00:00:00 2001
From: Mirna Wong <89008547+mirnawong1@users.noreply.github.com>
Date: Tue, 26 Nov 2024 14:55:06 +0000
Subject: [PATCH 14/19] Update website/docs/reference/global-configs/logs.md
Co-authored-by: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
---
website/docs/reference/global-configs/logs.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/website/docs/reference/global-configs/logs.md b/website/docs/reference/global-configs/logs.md
index e9363561339..85969a5bc02 100644
--- a/website/docs/reference/global-configs/logs.md
+++ b/website/docs/reference/global-configs/logs.md
@@ -72,7 +72,7 @@ The `LOG_LEVEL` config sets the minimum severity of events captured in the conso
dbt --log-level debug run
```
-- Setting the `LOG_LEVEL` to none will disable logs from being sent to either the console or file logs.
+- Setting the `LOG_LEVEL` to `none` will disable information from being sent to either the console or file logs.
```text
dbt --log-level none
From e1b0b438185fd73a0a7119101e565212df9924c9 Mon Sep 17 00:00:00 2001
From: Mirna Wong <89008547+mirnawong1@users.noreply.github.com>
Date: Tue, 26 Nov 2024 15:08:28 +0000
Subject: [PATCH 15/19] Update website/docs/reference/dbt-classes.md
---
website/docs/reference/dbt-classes.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/website/docs/reference/dbt-classes.md b/website/docs/reference/dbt-classes.md
index 5c9f72209c1..a6a8c2d4fa6 100644
--- a/website/docs/reference/dbt-classes.md
+++ b/website/docs/reference/dbt-classes.md
@@ -102,7 +102,7 @@ col.numeric_type('numeric', 12, 4) # numeric(12,4)
- **column**: Returns the name of the column
- **data_type**: Returns the data type of the column (with size/precision/scale included)
- **dtype**: Returns the data type of the column (without any size/precision/scale included)
-- **name**: Returns the name of the column (alias for column)
+- **name**: Returns the name of the column (identical to `column`, provided as an alias)
- **numeric_precision**: Returns the maximum precision for fixed decimal columns
- **numeric_scale**: Returns the maximum scale for fixed decimal columns
- **quoted**: Returns the name of the column wrapped in quotes
From f6444f3138dfd1a3ad5d59c0c997cae60a50cecb Mon Sep 17 00:00:00 2001
From: mirnawong1
Date: Tue, 26 Nov 2024 17:19:27 +0000
Subject: [PATCH 16/19] update gif
---
.../dbt-cloud/cloud-ide/dbt-assist-toggle.jpg | Bin 168050 -> 0 bytes
.../docs/dbt-cloud/cloud-ide/dbt-assist.gif | Bin 809991 -> 0 bytes
.../dbt-cloud/cloud-ide/dbt-copilot-doc.gif | Bin 994660 -> 902893 bytes
3 files changed, 0 insertions(+), 0 deletions(-)
delete mode 100644 website/static/img/docs/dbt-cloud/cloud-ide/dbt-assist-toggle.jpg
delete mode 100644 website/static/img/docs/dbt-cloud/cloud-ide/dbt-assist.gif
diff --git a/website/static/img/docs/dbt-cloud/cloud-ide/dbt-assist-toggle.jpg b/website/static/img/docs/dbt-cloud/cloud-ide/dbt-assist-toggle.jpg
deleted file mode 100644
index 50dfbe7f51a9635be96b8ea054c72ee5a0f9ee95..0000000000000000000000000000000000000000
zdZP|M-p{*XFT=LqJnIl@{+%+S?&ddIVt3_M6ru$Qq)BOqgfo&GU!mGZVh78tu
z^Qg7Q2Lx1+ExkMMrznuJ`W2qeIMa^tc-xZeruZn|6(1;u|1_8S-sND9!Tq9dBj%wk
zAsm}pI*Q=O$iuCE#bUMFiQ>-pwKhf3oOVv<-Gay)&oZw`9%lKihDN!+p1oopk=Q2M
zlJ*j57#yc1b>|H9vaCi;LLJuc|cFwKzW9ZpP{dP!hanhkBN-S$V8K
z`jr{k&|jbUnVzN|z~*~?u+!eu2!6?Qg|2PnAoI2B!JLYm^)Zqcwl5>J^3JSLlF
z`sEU+@31#Qt{AcHX8+XhU!5%z1iGTv7Cnz}gNg!D
z{kYx2@bpUoVyMD?)|QLS?!5}yEgsI~=pA>3wo9#Ia6+6f@h5<8%g}2BP!&YOm@pW;
zdf-mvAiN+d&+WX>;cIv@Kh`o2qff(tQ;d>ZU08J{nOtW$f;Vrz-1)
zJ%o8N!qLl-8#B#=#sLO^>GIpP^W2UZ1wHIKYeQY;J3Ga#xXCIpDu1e46>)EhLJSoE
zBYmbFRyo=ipaf2b4P=~f
z`MLv#cS%=Yo%i|?j@m3O)&~^pJ*d7n!IjK2OY=w4{T^V>bxhGql{3d
zPBsOWHYUby(@DE~pR-rQ@ZQG?3UWJ~r+)p)m>6)|GqcwPIXefyBd24WibXN9=<*lI
z_eu63Z(P7We)OAwx+3`TX`pJYFER|~`|c+u{uB`ej6wc~RVJP0)W#6bM(-wt#Zfp`3+N}sztDnKIAR$y-8xJPF
zW!;T|2ESL+`mTt)9h3Xxw(zUhjnvKlL`-BcjA|$bahzp^xpi``g8QGW4Zj&BXc5~{
z;Pl&Dn~oO!&dl?lHKkRwkTsnjvC+5jU50@urio=4ldM|=664`d@5c5%`&KY|HigbE
z7W-oGJC6J?;maV9>EnMY+kZGWtXbxzm)CMGznPrpg
z$!JN%iRUbyb8|O4%ShYmcqUtY@d0D_Mwt?al~SLe7ld^#!KxAjkjxO5U4?0Tz2k)B)B
zzyZr?SLTbVqm)qKc}|9Qu>VNQVoKiyspxsl=|cT|UqCI3FpwlO*Z0*N>tpbik$%k6
ztnALy@&oHlVU4`*9fs%%a^&jqbN}zez1CJdS1b6)sHeq&)^4%w4ZKGEfEu8F+VLd+
zs0d*lN#r(je3yEQ`Vx{O4V;_GQI`g7PtY1AW>U14L7%PBJ
zuS_fhp!*D+$gG(HqyU1$LEXVu%i;Gv&+8>Q?ZOa9T|1NxK?vY++Scnu>EgN-c}edNiyK)iz2#>L#1nH<0^L%T?qz03%G~27D92_43ztC3qbmV1Bs}W(ARG
zdF1@@?r10!=wsIXI_R~C53$ys;50D_LkDs<5Rs!nZFwO9a
zQ&{AU>34+;9ItfAN^q}xc02@oqR}rm!pav^r>`Lxpajt+`aIkY5ZX4AuJuN}T4Z>=
z==K5gAYg>O<;ZnjhO}C}Dc8~F*vfvo$G&K0wG3!QOcshp9^$M!!hQ!az~R>(iqV7~UQ61b;s*Q%-2Y0oso1yGHKCuV|6La%w90j2?{2jIH32GQEQ
zh0~!u;8`?zUdGeq9BhH0Lw%{$%#9M1A=q;XTao-lbvZj6hW&ZIxVbYOug7|^*Xe-i
zZQi_^+%BGEZK(=eqdLcK3Os*t?xCteA*W~b{w_hoJJ!9J9KEf^t@-Mkjb1vFgE?o1
zX}+t03u0TA-Kw#*&MV3oN0ZTX23My>P430X0U{=~y3D2$(RnxWW!b3LSc|!0NX`vz
zl-w&s2QzsrWN)jDFZlQ-tBoBOupU_N1#cg|>DJlI(z>6E5WiJ!aSV(
zuxjWW1CIXlIR;YV+Yi0{r9Ui(69`-vW_!RN_!hn9iE5tQI>zdPfpi!G_YJ(K+t`X9
zOq?HB65hZkD@x-pW3le((q18E*KjZ{7H*o$7X1$jioanjK-&N0NJGN0+JZ
z3Y=L2Pm0<1J9y;Ds8eCjU`7CafM>5x|4PWrUb5VV@_ac~v-
zHaSfw3_jhgjfoz#f?KnAXI353YHhNbIqhNvyZxY)M1eM64BeE0w0GunPVSyFb|qzE
z5@B42it2kJc3gSBp^HxkSJifFRE>2m4F1JIifpzdJ5G`!vmfNW1@e7gb<~)a^5dj1
zNSicXW!M_WhEVxd|47+h3NKbT6Q5g*saIqp>@*sUq*E>*>zBz(`crocJlz{c
z4zxWzHl4KleJ5V|2(~8o!U+Q4x2#CQbe%yq4aU5t%iaNYO2?oM?oOtTHTZcmlx&hZ
z&?J8370~P09|`vvA^h|E1#7&?V0)4SP|)*En~wfNMtnNd!~UU@tsLLM7=cH#$jy%U
zES@_tw-URxkRsUT=b&^!s{jP}W(x&9k*)?(Dt`pgRE=Ab9!_SRuooRtioMV$h{D*M
z0nnRa;Md#aiu&;RNWB*a^-Jlo@9%MUHl``89sZozGz}g-eSB
zmz3w?4GiLA!@ei^Jtk^Z2K=^ooKU@*S)^9EIOPKwbdE(*S#a)Wzzyaf
zG%B<|g}8rK+{lMJiVJ{8Nqx8v{i}r<^c4wpoOxzpUW<;8y|kTa!n+QRXRURu7v;HI
z3ApqvV)Un~+CYBXhm%xaqY{3F>DJMn(=W)tDB8r=Fr5hG;uXEeBUD1t>0#~vY{?e{$?h3qx^~G^@2J%YmrD9zZ*4xTqIZA7=H}q_3ae}VPJ|4wcy`az
zHLLYilj(!-e1!3PtSG|O&h*Vj4-Ws3^{cYMqw
zESpEt9i`AS`+8q`r5BA9K5qbGcs861={MMdk&loNv2W~4Otc4i;RYd}qj=&sP^lER
zU-WPR@RkRlJ75fgz1s;c`S$Dc+AE_2EK&R^Y2al*qJn?`^?JEr@H$S=j_?dCj+SO5
z1NfqP`q`FVc`7v=W6)*5$oEE!zT-uIP?-CwR*
zUv$5ig=+|u3io2dnwAToTTBG`CEIA=S6iAZyB51a7idpstV;g51@
zRQ;b%k6%ph=MaZ4A8>W(Ou+PUq6LRA3ApZD8mT?Sgwws_m~KeP8A9^T@6cB499wu^
z7w3q0byRIU^sp#8<9n~Z%3>h!Z`Y=R=a>y5Es~Gk7sl>qz0tR>)3ZfYaa%^m2{F@v3B5fE3x^;;PEC#dfQrUZ|RCsg=2qKO1Z*Dj8bUDI;!4jao#cqgkHaoU%HyR$J!noe~gcHb7^C0i>)vQK#JrlOb6BHF(L?;sI
zSQ=sG@6wCp>Ow^Oz`5HMMpD|2A6=)X>9_3f@-3vP9V!Oi`kfGh^Jmv*ogxqSXJfsb
zQI0D@P=>V=%Lg_SjZq#Fw~v?ewuJJ~6a;-s=hND*Spym{F)&4-Kp%a?P(i4-2#XBr
zWGq4AtTu=|uNRyWUbJ0Y3X-6Q@z<$SEkiS=ihx||TJPAwoOasw4|due6A@6`)QaVG
z@?eYxU67a%o1dP)=t;w)8QvvbnTVAjv~J|s=y&%8>x@G+HyR=3j+x`-idP7+O*XmK
z=g>m%0neM?Kx`w%or$Cr8V-lAv`}WS7kYHahk)~`^|za9IhU
z89#s&oJ~9G!O@-UsOvCA8Z@c1{hrt8OJ)D(*U-
ziL4Z>9P~NIf{cTzEesaL6A$n2~Dcd#zVxbm|F+${9!
z9+$bDjw{oaq?uYkPwod+MA}T(@V6|NR|0ErD}8oPWLqfR{U1FTbUZdxp70xm|7xuQ
zEWiOe_r|y!H*~zi7)JH|NTxo2T2~9yx-tEV(*4Y|JO+4?H;ln1bB6u7JaOp!C|n=K
zi-g8T(RX)=%k}j{8_Qhnk5Hnub)^SFf|bVQsFR)r*FjeAfELIRDa|LYrdvb!#k;Fd
zabMyeiH%V@P4gb_(fE(NngmKJjuHLc*o-$czW2`d-oFa}G=FugPF0fkLmP0p_qdrd
z0BgY{cw@EC>57Xq-3xQ@A@A`Zis;pG>#Xm3akKMTltU7KL!&BMjq1AOq#gHc)tic?miWJ3~Unoigc^pqU4IK5lPR9%x4gM>KQo+bTJ%4xqqwABkC^tV#Bzf}l
zKtn{#XC2Va-)wXDqSucb>JDM@o?Vf2#;ouoQ&=>Nb#Uyz>)#6~xHA4Bf^dZ`ioTvG
zWX5WT?%E=}V5%x%e--74b_Ey>j&(Vxolbv@+e5e_#@8?Zt{xd0I?APvghw`nh`Xze
zRW3LjCQ5JbS(U4uNmxGP1i-JT;jbfY$kq1#*g@ntIuJp_nGFUQt@N726n(A43b!*r
zz~0BT2Kjq4SKx-xHQ!d3@E>0ZU9urLFN!)-+Z2_LBT744ZZV#&
z4pj{LKAlAf3}EsX1HH-g8>qeXbqY@m-X25Wp4Tiqco&A^y3K-ic$;6wemSUpvRLK!
zR)pQCeQL2;1hZRC#mc4g0*nXZ3pal}CMYGHBr~LEjJTgY?}wPp4X4DT2`JGnB#GQ)
zm!;41LV*eL>@;LPGr1rExhp)~Q&0D0aokpromXd2_(wU9TLgh##@2>9-~OsM?c$-I&G
zbiJ9X{>>Xj3m=Z;@ev_G6D}Z>LBq*Alphsg3}YRICa<@XoyRMi5}-#_p>BQ}BM7z4
z