I often found myself needing a ready-to-go Postgres database, usually the latest supported version, to test indexing, partitioning, or whatever idea was scratching that part of my mind.
The pursuit of a quick docker-compose.yml that would let me run "docker compose up -d" and be ready to test whatever I needed was never straightforward. Sometimes I had conflicting ports, or the customization I needed in pg_hba.conf or postgresql.conf was missing. And even on my best day, when I did find that compose file, I would still end up with an empty database: no tables or data to play with.
Relevant information for each custom file provided:
- docker-compose.yml (a sketch follows below)
  - Uses the latest Postgres image available
  - Caps the container memory at 1G
  - Sets the Postgres user, password, and database to "postgres"
  - Exposes the database on port 15432 outside the container
- pg_hba.conf (a sketch follows below)
  - Adds an extra line to accept connections from outside the container
- postgresql.conf (a sketch follows below)
  - Increases shared_buffers to 128MB
  - Sets listen_addresses to all interfaces
  - Logs all database statements
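To make those points concrete, here is a minimal sketch of what such a compose file could look like. It is an illustration under assumptions, not the repo's exact file: the service name, volume paths, and the choice of mem_limit are mine.

```yaml
# Hypothetical docker-compose.yml matching the points above.
services:
  db:
    image: postgres:latest          # latest available Postgres image
    mem_limit: 1g                   # cap container memory at 1G
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: postgres
    ports:
      - "15432:5432"                # reachable on 15432 from the host
    volumes:
      - ./pg_hba.conf:/etc/postgresql/pg_hba.conf
      - ./postgresql.conf:/etc/postgresql/postgresql.conf
    command: >
      postgres
      -c config_file=/etc/postgresql/postgresql.conf
      -c hba_file=/etc/postgresql/pg_hba.conf
```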
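For pg_hba.conf, the "extra line" is presumably a host rule that accepts connections from any outside address. A typical form (the CIDR range and auth method here are assumptions):

```
# Assumed extra pg_hba.conf line: allow password-authenticated
# connections from any IPv4 address.
host    all    all    0.0.0.0/0    scram-sha-256
```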
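The postgresql.conf changes map to three settings; the values come straight from the list above:

```
shared_buffers = 128MB      # increased buffer cache
listen_addresses = '*'      # listen on all interfaces
log_statement = 'all'       # log every statement
```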
The database initializes with a table called user_, which looks like a user-information table from a production application, in the hope of simulating real-world scenarios.
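The actual columns are defined by the repo's init script; purely as an illustration, a table of that shape might look like this (every column name here is an assumption, not the real schema):

```sql
-- Hypothetical shape of the user_ table; the repo's init script
-- defines the real columns.
CREATE TABLE IF NOT EXISTS user_ (
    id         BIGSERIAL PRIMARY KEY,
    first_name TEXT        NOT NULL,
    last_name  TEXT        NOT NULL,
    email      TEXT        NOT NULL UNIQUE,
    created_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
```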
At this point the setup was ready; we just needed data in the table. We decided to write a Python script that uses the Faker library to generate random but usable data.
The repo provides a requirements file with Psycopg 3 and Faker, plus a Python script that loads however many records we feel are necessary to test our scenario.
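The real loader lives in load-db/populate-db.py; below is a minimal sketch of how such a script could work with psycopg 3 and Faker, assuming a requirements file along the lines of `psycopg[binary]` and `faker`, the connection settings from the compose file above, and the hypothetical user_ columns sketched earlier.

```python
"""Hypothetical populate script using psycopg 3 and Faker.

Connection details match the compose file (port 15432, user/password/db
"postgres"); the column list mirrors the assumed user_ schema above.
"""
import psycopg
from faker import Faker

NUM_RECORDS = 10_000  # adjust to whatever volume the scenario needs

fake = Faker()

# Generate NUM_RECORDS rows of plausible-looking user data.
rows = [
    (fake.first_name(), fake.last_name(), fake.unique.email())
    for _ in range(NUM_RECORDS)
]

with psycopg.connect(
    "host=localhost port=15432 dbname=postgres user=postgres password=postgres"
) as conn:
    with conn.cursor() as cur:
        cur.executemany(
            "INSERT INTO user_ (first_name, last_name, email)"
            " VALUES (%s, %s, %s)",
            rows,
        )
    # psycopg 3 commits automatically when the connection
    # context manager exits without an error.
```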
- Create a virtual environment: python3 -m venv venv
- Activate/source the new environment: source venv/bin/activate
- Install dependencies: pip install -r load-db/requirements.txt
- Run the populate script: python load-db/populate-db.py