
Commit

Merge branch 'current' into sl-rn-dec
mirnawong1 authored Dec 20, 2023
2 parents 27cd555 + 3f5b288 commit ff29b5a
Showing 4 changed files with 6 additions and 9 deletions.
5 changes: 1 addition & 4 deletions website/docs/docs/core/connect-data-platform/spark-setup.md
Original file line number Diff line number Diff line change
@@ -20,10 +20,6 @@ meta:
<Snippet path="warehouse-setups-cloud-callout" />
<Snippet path="dbt-databricks-for-databricks" />

:::note
See [Databricks setup](#databricks-setup) for the Databricks version of this page.
:::

import SetUpPages from '/snippets/_setup-pages-intro.md';

<SetUpPages meta={frontMatter.meta} />
@@ -204,6 +200,7 @@ connect_retries: 3


<VersionBlock firstVersion="1.7">

### Server side configuration

Spark can be customized using [Application Properties](https://spark.apache.org/docs/latest/configuration.html). These properties let you customize execution, for example by allocating more memory to the driver process, and configure the Spark SQL runtime, for example to [set a Spark catalog](https://spark.apache.org/docs/latest/configuration.html#spark-sql).
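
As a hedged sketch of what this might look like: dbt-spark exposes a `server_side_parameters` key in the connection profile for passing Application Properties through to Spark. The property names below come from the Spark configuration docs, but the specific values and the catalog implementation class are illustrative assumptions, not recommendations.

```yaml
# profiles.yml fragment — illustrative values only
      server_side_parameters:
        "spark.driver.memory": "4g"  # allocate more memory to the driver process
        "spark.sql.catalog.my_catalog": "com.example.MyCatalogImpl"  # hypothetical catalog class
```

Any property listed in Spark's Application Properties reference can, in principle, be passed this way; whether it takes effect depends on the connection method and the server's own configuration.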
1 change: 0 additions & 1 deletion website/sidebars.js
@@ -135,7 +135,6 @@ const sidebarSettings = {
"docs/cloud/secure/redshift-privatelink",
"docs/cloud/secure/postgres-privatelink",
"docs/cloud/secure/vcs-privatelink",
"docs/cloud/secure/ip-restrictions",
],
}, // PrivateLink
"docs/cloud/billing",
5 changes: 3 additions & 2 deletions website/snippets/dbt-databricks-for-databricks.md
@@ -1,4 +1,5 @@
:::info If you're using Databricks, use `dbt-databricks`
If you're using Databricks, the `dbt-databricks` adapter is recommended over `dbt-spark`.
If you're still using dbt-spark with Databricks consider [migrating from the dbt-spark adapter to the dbt-databricks adapter](/guides/migrate-from-spark-to-databricks).
If you're using Databricks, the `dbt-databricks` adapter is recommended over `dbt-spark`. If you're still using dbt-spark with Databricks, consider [migrating from the dbt-spark adapter to the dbt-databricks adapter](/guides/migrate-from-spark-to-databricks).

For the Databricks version of this page, refer to [Databricks setup](#databricks-setup).
:::
4 changes: 2 additions & 2 deletions website/snippets/warehouse-setups-cloud-callout.md
@@ -1,3 +1,3 @@
:::info `profiles.yml` file is for CLI users only
If you're using dbt Cloud, you don't need to create a `profiles.yml` file. This file is only for CLI users. To connect your data platform to dbt Cloud, refer to [About data platforms](/docs/cloud/connect-data-platform/about-connections).
:::info `profiles.yml` file is for dbt Core users only
If you're using dbt Cloud, you don't need to create a `profiles.yml` file. This file is only for dbt Core users. To connect your data platform to dbt Cloud, refer to [About data platforms](/docs/cloud/connect-data-platform/about-connections).
:::
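
To make the callout concrete, here is a minimal `profiles.yml` sketch for a dbt Core user connecting to Spark. The `type`, `method`, and `connect_retries` keys are standard dbt-spark profile fields; the profile name, host, port, and schema are placeholder assumptions you would replace with your own values.

```yaml
# ~/.dbt/profiles.yml — minimal sketch; all values below are placeholders
my_spark_project:
  target: dev
  outputs:
    dev:
      type: spark
      method: thrift
      host: spark.example.com   # assumed hostname
      port: 10000
      schema: analytics         # assumed schema
      connect_retries: 3
```

dbt Cloud users skip this file entirely: the same connection details are entered in the dbt Cloud UI instead.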
