This is a collection of scripts which have been written to serve a one-time or infrequent purpose. They are not intended to be run in an automated fashion and commonly would be used from the command line when needed. They will each have different parameters and possible environment variables which need to be set for the script to function as intended.
Helper script to disable or enable all VZ users through the Auth0 API.
Various scripts used as part of launching the new data model for VZ v2.0. We want to keep these around for a few months after launch (August 2024), but they can be archived in the near future.
Node.js tool to load ArcGIS Online (AGOL) layers into the Vision Zero database.
Script which compares lookup tables between a CRIS extract and the VZ database and generates database migrations. We should run this script after CRIS software releases.
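The comparison at the heart of such a script can be sketched as a three-way diff of lookup rows, here modeled as `{id: label}` dicts. This is a simplification of the actual schema; the function and variable names are illustrative, not taken from the script.

```python
def diff_lookup_table(cris: dict, vz: dict):
    """Compare a lookup table from a CRIS extract against the matching
    VZ database table, both expressed as {id: label} mappings.

    Returns (added, changed, removed): the raw material for generating
    insert/update/delete migrations after a CRIS software release."""
    added = {k: cris[k] for k in cris.keys() - vz.keys()}
    removed = {k: vz[k] for k in vz.keys() - cris.keys()}
    changed = {
        k: {"old": vz[k], "new": cris[k]}
        for k in cris.keys() & vz.keys()
        if cris[k] != vz[k]
    }
    return added, changed, removed
```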
This is the Docker stack used to host the database read replica proxy server, which enables our read-only database users to connect to the read replica instance via the public internet.
These toolbox scripts were deleted from the repository. Most of them are incompatible with v2.0 of our database. These deletions were committed at hash `569511cc0d598120be4146920623f55c1a8501a3`.
This contains a Python script which will take a JSON file of crash IDs, then download and check the CR3 file stored in S3 for each crash. It will verify the type of each file using the `libmagic` library and will restore the most recent `application/pdf` from the S3 version history of the file.
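A minimal sketch of the core decision in that flow. The real script uses `libmagic`; here that check is approximated with the `%PDF-` magic bytes so the example stays self-contained, and the helper names and tuple shape are illustrative rather than taken from the script.

```python
from datetime import datetime


def looks_like_pdf(data: bytes) -> bool:
    """Crude stand-in for libmagic's application/pdf detection:
    PDF files begin with the bytes '%PDF-'."""
    return data.startswith(b"%PDF-")


def most_recent_pdf_version(versions):
    """Given (version_id, last_modified, first_bytes) tuples drawn from an
    object's S3 version history, return the id of the newest version whose
    content is a PDF, or None if no version qualifies."""
    pdfs = [v for v in versions if looks_like_pdf(v[2])]
    if not pdfs:
        return None
    return max(pdfs, key=lambda v: v[1])[0]
```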
This is a Python script which will use a database view to find the set of crashes which:
- Are on a level 5 centerline,
- Have CR3 data which indicates that the crash occurred on a service road,
- Have a directionality which gives a hint which way off the centerline the crash should be moved.
For these crashes, the script will add 0.00000001 decimal degrees to the latitude of the crash, which is an immaterial movement in geographic terms, but it will cause the lambda function tied to crash-updates to execute. It is this lambda function which will figure out what location to move the crash to and will do so.
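In code, the nudge amounts to something like the following sketch. Only the constant comes from the description above; the table and column names in the SQL template are assumptions for illustration.

```python
NUDGE_DEGREES = 0.00000001  # roughly 1 mm of latitude; geographically immaterial


def nudged_latitude(latitude: float) -> float:
    """Shift a crash's latitude just enough to register as an update,
    which fires the crash-update lambda that re-evaluates the location."""
    return latitude + NUDGE_DEGREES


# Hypothetical SQL equivalent of the per-crash update (table and
# column names are assumed, not taken from the script):
UPDATE_SQL = """
UPDATE atd_txdot_crashes
   SET latitude = latitude + 0.00000001
 WHERE crash_id = %s;
"""
```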
This is a Python script which can be used to download CR3 files en masse.
It requires two environment variables to be set:
- AWS_ACCESS_KEY_ID
- AWS_SECRET_ACCESS_KEY
These AWS credentials must have access to the S3 bucket `atd-vision-zero-editor`.
Usage example: `cat <CSV file> | python ./download_cr3s.py`
The program expects a CSV file, with headers, piped into `stdin`, which it will parse. It will look for a column named `crash_id`, but it is flexible in the naming, as it will use a regex to detect variations on that name. In particular, the program will accept the CSV generated by the Vision Zero Editor as input without modification.
The program will collect the unique set of crash IDs from this input and then download the CR3 files from S3 for those crashes. The files will be saved in a directory created for each invocation of the program, in the form `./downloaded_files/[TIMESTAMP]`.
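The flexible header matching might look like this sketch; the actual regex and helper names in `download_cr3s.py` may differ.

```python
import csv
import io
import re

# Accept "crash_id", "Crash ID", "crash-id", etc. (illustrative pattern)
CRASH_ID_PATTERN = re.compile(r"crash[\s_-]*id", re.IGNORECASE)


def find_crash_id_column(fieldnames):
    """Return the first header that looks like a crash ID column, or None."""
    for name in fieldnames:
        if CRASH_ID_PATTERN.search(name):
            return name
    return None


def unique_crash_ids(csv_text: str):
    """Collect the unique crash IDs from CSV text with headers,
    mirroring how stdin input would be reduced to a download list."""
    reader = csv.DictReader(io.StringIO(csv_text))
    column = find_crash_id_column(reader.fieldnames or [])
    return sorted({row[column] for row in reader}) if column else []
```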
This is a tool created to address a specific issue the Austin Transportation Department ran into as a result of CRIS updating certain geospatial resources and reprocessing the entire corpus of crashes. This CRIS process caused all crashes to be redelivered to ATD and run through the ETL which loads them into the Vision Zero Crash Database. Due to a particular configuration around the fields `movement_id` and `travel_direction`, changes to these fields which had been made by VZ stakeholders were reverted to the values provided by CRIS.
While it is unlikely that other installations of the Vision Zero crash system would need this toolchain as-is, it does provide an example implementation of restoring data from change log entries. As an aside, during the course of this work, certain modifications to the Vision Zero change log system were identified that would facilitate this type of data recovery in the future.
To use this toolchain:

- Spin up the postgres image noted in the `Dockerfile` via the `docker-compose.yml` configuration.
- Load a current copy of the VZDB and a copy of a backup from prior to the changes you wish to recover from. These go into the databases `current_vz` and `past_vz`, respectively.
- Create a scratch table to hold your results as shown in `result_table.sql`.
- Run `direct_comparison_change_finder.py` to populate the result table.
- Run `find_aux_data.py` to populate auxiliary data from the VZDB into your result scratch table.
- In the implementation shown here, this result set was presented to stakeholders, and further criteria were given, as found in the view created in `view_with_stakeholder_feedback_criteria.sql`.
- This view will include a column called `update_statement` which can be used as a series of updates to execute the data restoration.
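As an illustration of what an `update_statement` cell might carry, a restoration update can be assembled from a change-log diff roughly as follows. The function, the table name in the test, and the quoting strategy are simplifications for illustration; the real view builds its statements in SQL.

```python
def build_update_statement(table: str, field: str, crash_id: int, restored_value) -> str:
    """Assemble a single restoration UPDATE statement, quoting strings
    and passing numbers through as-is. This is a simplification: real
    code executing against a live database should use parameterized
    queries rather than string interpolation."""
    if isinstance(restored_value, str):
        value_sql = "'" + restored_value.replace("'", "''") + "'"
    else:
        value_sql = str(restored_value)
    return f"UPDATE {table} SET {field} = {value_sql} WHERE crash_id = {crash_id};"
```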
Gets a list of crash IDs we have CR3 PDFs for in our AWS S3 bucket and checks them against crashes in the `atd_txdot_crashes` table where `cr3_stored_flag = 'Y'`. Updates `cr3_stored_flag` to `'N'` if those crash IDs aren't in the list of CR3s we have.
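The reconciliation step reduces to a set difference; a sketch, with an illustrative function name:

```python
def flags_to_clear(flagged_crash_ids, s3_crash_ids):
    """Return the crash IDs marked cr3_stored_flag = 'Y' in the database
    that have no CR3 PDF present in the S3 bucket; these are the rows
    whose flag should be flipped to 'N'."""
    return sorted(set(flagged_crash_ids) - set(s3_crash_ids))
```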
An iteration of the lookup table helper script. See `get_lookup_table_changes` instead.
Tool to inspect CRIS extract CSVs and compare them against the database tables.
One-off script to help migrate recommendation coordination partners to a related table.