2030 partitions (#304)
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
devsjc and pre-commit-ci[bot] authored Dec 9, 2024
1 parent 9b0efa2 commit 410aa51
Showing 4 changed files with 2,263 additions and 53 deletions.
13 changes: 0 additions & 13 deletions nowcasting_datamodel/connection.py
@@ -7,7 +7,6 @@
 from sqlalchemy.orm.session import Session

 from nowcasting_datamodel.models.base import Base_Forecast
-from nowcasting_datamodel.models.forecast import get_partitions

 logger = logging.getLogger(__name__)

@@ -38,18 +37,6 @@ def create_all(self):
         self.base.metadata.drop_all(self.engine)
         self.base.metadata.create_all(self.engine)

-    def make_partitions(self):
-        """Make partitions tables (useful for testing)"""
-        # get partitions
-        self.partitions = get_partitions(2019, 1, 2022, 7)
-
-        # make partitions
-        for partition in self.partitions:
-            if not self.engine.dialect.has_table(
-                connection=self.engine.connect(), table_name=partition.__table__.name
-            ):
-                partition.__table__.create(bind=self.engine)
-
     def drop_all(self):
         """Drop all partitions and tables"""
         # drop partitions
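For context, the removed `make_partitions` helper relied on `get_partitions(2019, 1, 2022, 7)` to enumerate one partition table per month between two dates. A minimal standalone sketch of that month-range enumeration (the function name and logic here are assumptions for illustration, not the library's actual implementation):

```python
from datetime import date


def month_range(
    start_year: int, start_month: int, end_year: int, end_month: int
) -> list[date]:
    """Enumerate the first day of each month from start to end, inclusive.

    Hypothetical helper approximating what a get_partitions-style function
    iterates over when creating one partition table per month.
    """
    months = []
    year, month = start_year, start_month
    while (year, month) <= (end_year, end_month):
        months.append(date(year, month, 1))
        month += 1
        if month > 12:
            month, year = 1, year + 1
    return months
```

With the arguments from the removed code, `month_range(2019, 1, 2022, 7)` yields 43 months, one per partition table.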
22 changes: 14 additions & 8 deletions nowcasting_datamodel/migrations/README.md
@@ -2,17 +2,23 @@

 When we change the datamodel of the PV or Forecast database we will need to run a database migration.

-## 1. old local database
-start local database of database before data model changes. This can be done by setting `DB_URL` and `DB_URL_PV` and running
-by running ```python nowcasting_datamodel/migrations/app.py --make-migrations```
+## 1. Connect to the database via SSH
+Open an SSH tunnel from the target database instance to your local machine. Then set the
+`DB_URL` environment variable to point to the local end of the tunnel, e.g.
+```bash
+$ export DB_URL=postgresql://<user>:<password>@localhost:<port>/forecastdevelopment
+$ python nowcasting_datamodel/migrations/app.py --make-migrations
+```

-## 2. run migrations scripts
-by running ```python nowcasting_datamodel/migrations/app.py --run-migrations```
+## 2. Commit the new migration file
+Commit the migrations revision to the repo using git.

-## 3. migrations revision
-commit migrations revision to repo
+## 3a. Run migrations manually
+```bash
+$ python nowcasting_datamodel/migrations/app.py --run-migrations
+```

-## 4. Start AWS task on ECS with docker container from this repo
+## 3b. Start an AWS task on ECS with the docker container from this repo
 This will run all the migrations and update the database in production / development.
 TODO: Set up the task definition using terraform.
 The current solution is to open an SSH tunnel and run the migrations from there.
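The README's step 1 assumes an SSH tunnel is already open but does not show the command. A minimal sketch of such a tunnel via local port forwarding (the bastion host, user, and ports below are illustrative placeholders, not values from this repo):

```shell
# Forward a local port to the database host's Postgres port via a bastion.
# <local-port>, <database-host>, <user>, and <bastion-host> are placeholders.
ssh -N -L <local-port>:<database-host>:5432 <user>@<bastion-host>
```

`-N` opens the tunnel without running a remote command; `-L` binds the local port that `DB_URL` in step 1 then points at.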
