From 5877b54187ee857b1e938c8594ac5c9ae74d9372 Mon Sep 17 00:00:00 2001
From: Ben Cassell <98852248+benc-db@users.noreply.github.com>
Date: Wed, 23 Oct 2024 13:24:00 -0700
Subject: [PATCH] Update website/docs/reference/resource-configs/databricks-configs.md

Co-authored-by: Amy Chen <46451573+amychen1776@users.noreply.github.com>
---
 website/docs/reference/resource-configs/databricks-configs.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/reference/resource-configs/databricks-configs.md b/website/docs/reference/resource-configs/databricks-configs.md
index fb0de24724a..fbc3d3df324 100644
--- a/website/docs/reference/resource-configs/databricks-configs.md
+++ b/website/docs/reference/resource-configs/databricks-configs.md
@@ -69,7 +69,7 @@ We do not yet have a PySpark API to set tblproperties at table creation, so this
 
 ### Python submission methods
 
-As of 1.9, there are four options for `submission_method`:
+As of 1.9 (or versionless if you're on dbt Cloud), there are four options for `submission_method`:
 
 * `all_purpose_cluster`: execute the python model either directly using the [command api](https://docs.databricks.com/api/workspace/commandexecution) or by uploading a notebook and creating a one-off job run
 * `job_cluster`: creates a new job cluster to execute an uploaded notebook as a one-off job run
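
For illustration, here is a minimal sketch of how the `submission_method` option described in the patched section might be set from a dbt Python model. It assumes dbt's `dbt.config()` interface for Python models; the model name, the `job_cluster_config` key, and the cluster values are illustrative placeholders rather than part of this patch.

```python
# Sketch of a dbt Python model using the `submission_method` config discussed
# in the patched doc. The cluster spec below is a hypothetical placeholder.

def model(dbt, session):
    dbt.config(
        materialized="table",
        # `job_cluster`: create a new job cluster and execute the uploaded
        # notebook as a one-off job run (one of the options listed above).
        submission_method="job_cluster",
        # Assumed keys/values; adjust to match your workspace's cluster spec.
        job_cluster_config={
            "spark_version": "15.4.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 2,
        },
    )
    # Read an upstream dbt model and return it as this model's output.
    return dbt.ref("upstream_model")
```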