
NEW: in case of column mismatch, the dataset will be created again #3

Closed
wants to merge 3 commits

Conversation

deepansh96
Member

Fixes #2

Test Plan

  • Tested locally
  • Tested on staging
  • Tested on production

@pep8speaks

pep8speaks commented Jul 16, 2021

Hello @deepansh96! Thanks for updating this PR. We checked the lines you've touched for PEP 8 issues, and found:

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2022-02-21 09:34:03 UTC

@deepansh96 added the bug label Jul 16, 2021
# download s3 file into lambda /tmp/ directory
file = s3_directory + table_name + ".csv"
local_file_name = "/tmp/" + table_name + ".csv"
s3.download_file(bucket_name, file, local_file_name)
Contributor
Is it possible to re-use these files in the process_tables function as well and avoid the download there? That could help reduce the Lambda run time.
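
A minimal sketch of what that reuse could look like, assuming both call sites go through a shared helper (the name download_table_csv is hypothetical):

import os

import boto3

s3 = boto3.client("s3")

def download_table_csv(bucket_name, s3_directory, table_name):
    # Lambda keeps /tmp warm within the same execution environment,
    # so a later call from process_tables can reuse the file instead
    # of downloading it from S3 again
    local_file_name = "/tmp/" + table_name + ".csv"
    if not os.path.exists(local_file_name):
        file = s3_directory + table_name + ".csv"
        s3.download_file(bucket_name, file, local_file_name)
    return local_file_name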

Comment on lines +43 to +45
if not are_public_columns_consistent or not are_schema_columns_consistent:
client.delete_dataset(dataset_id, delete_contents=True)
client.create_dataset(dataset_id)
Contributor
I have slightly different thoughts here. Instead of dropping the whole dataset, shouldn't we just drop and recreate at the table level?

If we do it at the table level, I think we can then easily call this function inside process_tables and avoid duplicating things like making the BigQuery connection and downloading the files from S3.
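
For reference, a table-level drop-and-recreate with the google-cloud-bigquery client could look like this sketch (recreate_table is a hypothetical helper; the schema is assumed to be supplied by the caller):

from google.cloud import bigquery

client = bigquery.Client()

def recreate_table(dataset_id, table_name, schema):
    # fully qualified table ID: project.dataset.table
    table_ref = f"{client.project}.{dataset_id}.{table_name}"
    # not_found_ok avoids an error if the table does not exist yet
    client.delete_table(table_ref, not_found_ok=True)
    client.create_table(bigquery.Table(table_ref, schema=schema))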

@deepansh96 closed this Apr 7, 2023