Commit

removed references to BigQuery from project
britt-allen committed Oct 7, 2024
1 parent f4d01a7 commit 3aeb3df
Showing 9 changed files with 3 additions and 120 deletions.
17 changes: 0 additions & 17 deletions README.md
@@ -53,23 +53,6 @@ git add poetry.lock
git commit -m "Add poetry.lock"
```

### BigQuery setup

Using the dbt project with BigQuery requires the following setup steps:
* Users must set up their dbt profile locally to develop as themselves.
Instructions for this can be found in the "Local Development" section of the dbt project README.
* Repository owners must set up a service account in GCP for continuous integration to
be able to run against a test GCP project:
* Go to the "Service Accounts" section in the Google Cloud Console.
* Click "Create Service Account"
* Make a new service account with a name like `<my-project-name>-dbt-ci-bot`.
Give that account the IAM role of "BigQuery Metadata Viewer".
* Go to the "Keys" tab and create a new JSON private key.
* In the GitHub page for your new repository, go to `Settings -> Secrets and Variables -> Actions`
* Create a new repository secret called `GOOGLE_CREDENTIALS` and paste the JSON key
into the value for the secret. This should allow CI to authenticate with your
test BigQuery project and view metadata for datasets and tables.
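For context on the setup being removed here: the service-account key stored in `GOOGLE_CREDENTIALS` would typically be consumed by a keyfile-based dbt profile in CI. A minimal sketch (the profile name, keyfile path, project id, and dataset below are illustrative placeholders, not taken from this repository):

```yml
ci_bigquery:  # hypothetical profile name
  target: ci
  outputs:
    ci:
      type: bigquery
      method: service-account
      # path where the CI job writes the JSON key from the GOOGLE_CREDENTIALS secret
      keyfile: /tmp/google_credentials.json
      project: <test-project-id>  # the test GCP project CI runs against
      dataset: dbt_ci             # placeholder dataset for CI runs
      threads: 4
```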

### Snowflake setup

The projects generated from our infrastructure template need read access to the
1 change: 0 additions & 1 deletion ci/environment.yml
@@ -3,7 +3,6 @@ channels:
- conda-forge
dependencies:
- dbt-core~=1.4
- dbt-bigquery~=1.4
- dbt-snowflake~=1.4
- mkdocs-material~=9.1.3
- pre-commit
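With the `dbt-bigquery` pin dropped, the CI environment's dependency list above would presumably read:

```yml
dependencies:
  - dbt-core~=1.4
  - dbt-snowflake~=1.4
  - mkdocs-material~=9.1.3
  - pre-commit
```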
5 changes: 1 addition & 4 deletions copier.yml
@@ -53,11 +53,8 @@ license:
dbt_target:
choices:
- BigQuery
- Snowflake
default: BigQuery
help: |
What data warehouse will you be using with dbt?
default: Snowflake

dbt_profile_name:
type: str
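Because the rendered diff omits add/remove markers, the net effect on the `dbt_target` prompt is easier to read as the resulting YAML; after this commit it would presumably reduce to:

```yml
dbt_target:
  choices:
    - Snowflake
  help: |
    What data warehouse will you be using with dbt?
  default: Snowflake
```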
16 changes: 0 additions & 16 deletions {{project_name}}/.github/workflows/docs.yml.jinja
@@ -26,22 +26,6 @@ jobs:
- uses: actions/setup-python@v4
with:
python-version: "3.10"
{% if dbt_target == 'BigQuery' %}
- id: auth
name: Authenticate to Google Cloud
uses: google-github-actions/auth@v1
with:
# To set this up you should create a GCP service account
# with the "BigQuery Metadata Viewer", "BigQuery Job User", and
# "BigQuery Data Viewer" roles. This will allow it to execute read-only
# queries against your data warehouse, which is needed to generate the docs
# Next, create a service account JSON file, and put it into a GitHub actions
# secret under the name GOOGLE_CREDENTIALS.
{% raw %}
credentials_json: ${{ secrets.GOOGLE_CREDENTIALS }}
{% endraw %}
export_environment_variables: true
{% endif %}
- uses: actions/cache@v2
with:
key: {% raw %}${{ github.ref }}
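With the conditional authentication block gone, the docs workflow above would presumably move straight from the Python setup step into the cache step, roughly:

```yml
- uses: actions/setup-python@v4
  with:
    python-version: "3.10"
- uses: actions/cache@v2
  # with: ... (cache key configuration as in the truncated lines above)
```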
17 changes: 1 addition & 16 deletions {{project_name}}/.github/workflows/pre-commit.yml.jinja
@@ -24,22 +24,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
{% if dbt_target == 'BigQuery' %}
- id: auth
name: Authenticate to Google Cloud
uses: google-github-actions/auth@v1
with:
# To set this up you should create a GCP service account
# with the "BigQuery Metadata Viewer", "BigQuery Job User", and
# "BigQuery Data Viewer" roles. This will allow it to execute read-only
# queries against your data warehouse, which is needed to generate the docs
# Next, create a service account JSON file, and put it into a GitHub actions
# secret under the name GOOGLE_CREDENTIALS.
{% raw %}
credentials_json: ${{ secrets.GOOGLE_CREDENTIALS }}
{% endraw %}
export_environment_variables: true
{% endif %}

- uses: actions/setup-python@v4
with:
python-version: "3.10"
27 changes: 1 addition & 26 deletions {{project_name}}/docs/setup.md.jinja
@@ -98,8 +98,7 @@ run `dbt debug` and inspect the output.
Instructions for writing a `profiles.yml` are documented
[here](https://docs.getdbt.com/docs/get-started/connection-profiles),
as well as specific instructions for
[Snowflake](https://docs.getdbt.com/reference/warehouse-setups/snowflake-setup)
and [BigQuery](https://docs.getdbt.com/reference/warehouse-setups/bigquery-setup).
[Snowflake](https://docs.getdbt.com/reference/warehouse-setups/snowflake-setup).

You can verify that your `profiles.yml` is configured properly by running

@@ -108,8 +107,6 @@ dbt debug
```

from the dbt project root directory (`transform`).

{% if dbt_target == 'Snowflake' %}
### Snowflake project

A minimal version of a `profiles.yml` for dbt development with is:
@@ -131,28 +128,6 @@ A minimal version of a `profiles.yml` for dbt development with is:
threads: 4
```
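The diff viewer collapses most of the Snowflake profile above, leaving only its closing lines. A minimal sketch of what such a profile typically contains (the profile name, account, and other values here are placeholders, not taken from this repository):

```yml
dse_snowflake:  # hypothetical profile name
  target: dev
  outputs:
    dev:
      type: snowflake
      account: <account-identifier>
      user: <your-username>
      authenticator: externalbrowser  # or configure password-based auth
      role: <your-role>
      database: <dev-database>
      warehouse: <dev-warehouse>
      schema: dbt_<your-name>  # test schema for development, don't use prod!
      threads: 4
```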


{% elif dbt_target == 'BigQuery' %}
### BigQuery project

A minimal version of a `profiles.yml` for dbt development with BigQuery is:

```yml
dse_bigquery:
target: dev
outputs:
dev:
type: bigquery
method: oauth
project: <project-id> # Project ID to use
dataset: dbt_<your-name> # Test schema for development, don't use prod!
threads: 4
```

This requires you to be authenticated using the `gcloud` CLI tool.

{% endif %}

## Installing `pre-commit` hooks

This project uses [pre-commit](https://pre-commit.com/) to lint, format,
4 changes: 0 additions & 4 deletions {{project_name}}/pyproject.toml.jinja
@@ -10,11 +10,7 @@ readme = "README.md"
python = "^3.10"
mkdocs-material = "~9.1.3"
dbt-core = "~1.8.0"
{% if dbt_target == 'BigQuery' %}
dbt-bigquery = "~1.8.0"
{% elif dbt_target == 'Snowflake' %}
dbt-snowflake = "~1.8.0"
{% endif %}

[tool.poetry.group.dev.dependencies]
pre-commit = "^3.3.1"
17 changes: 0 additions & 17 deletions {{project_name}}/transform/models/_models.yml.jinja
@@ -3,22 +3,6 @@ version: 2
models:
- name: sample_data
description: A sample model
{% if dbt_target == 'BigQuery' %}
columns:
- name: id
description: An integer id
tests:
- not_null
- name: date
description: A date column
tests:
- not_null
- name: value
description: A unique value
tests:
- not_null
- unique
{% elif dbt_target == 'Snowflake' %}
columns:
- name: name
description: A person's name
@@ -28,4 +12,3 @@ models:
description: Their license plate
- name: email
description: Their email
{% endif %}
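With the BigQuery branch and the wrapping Jinja conditional removed, the schema file would presumably reduce to the Snowflake columns alone, roughly (the diff viewer hides some of the column entries, as noted in the comment):

```yml
version: 2

models:
  - name: sample_data
    description: A sample model
    columns:
      - name: name
        description: A person's name
      # ... remaining columns (a license plate column and an email column),
      # as in the surviving lines of the diff above ...
```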
19 changes: 0 additions & 19 deletions {{project_name}}/transform/models/sample_data.sql.jinja
@@ -1,21 +1,3 @@
{% if dbt_target == 'BigQuery' %}
with
dates as (
select *
from
unnest(generate_date_array('2023-01-01', '2023-12-31')) as date
with
offset
as
offset
)

select
offset as id,
date,
generate_uuid() as value
from dates
{% elif dbt_target == 'Snowflake' %}
select
randstr(uniform(10, 30, random(1)), uniform(1, 100000, random(1)))::varchar(
30
@@ -28,4 +10,3 @@ select
30
) as email
from table(generator(rowcount => 1000))
{% endif %}
