
Commit

missed renames for MDPDatastore
leifdenby committed Sep 12, 2024
1 parent bf8172a commit 90ca400
Showing 4 changed files with 3 additions and 3 deletions.
4 changes: 2 additions & 2 deletions README.md
@@ -84,7 +84,7 @@ There are currently three different datastores implemented in the codebase:
files during train/val/test sampling, with the transformations to facilitate
this implemented within `neural_lam.datastore.MultizarrDatastore`.

-3. `neural_lam.datastore.MLLAMDatastore` which can combine multiple zarr
+3. `neural_lam.datastore.MDPDatastore` which can combine multiple zarr
    datasets either as a preprocessing step or during sampling, but
    offloads the implementation of the transformations to the
    [mllam-data-prep](https://github.com/mllam/mllam-data-prep) package.
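The combination step this datastore delegates to `mllam-data-prep` can be pictured with plain `xarray`, which the package builds on. This is only an illustrative sketch, not the actual `MDPDatastore` implementation: the variable names (`t2m`, `u10`) are made up, and in-memory datasets stand in for real zarr stores.

```python
import numpy as np
import xarray as xr

# Two stand-in datasets sharing a "time" coordinate, as two zarr
# sources might. Variable names here are hypothetical examples.
ds_a = xr.Dataset({"t2m": ("time", np.arange(3.0))}, coords={"time": [0, 1, 2]})
ds_b = xr.Dataset({"u10": ("time", np.arange(3.0) * 2)}, coords={"time": [0, 1, 2]})

# Combine them into a single dataset, aligned on the shared coordinate.
ds_combined = xr.merge([ds_a, ds_b])
print(sorted(ds_combined.data_vars))  # → ['t2m', 'u10']
```

The point of delegating this to a separate package is that the merge/align logic (and any derived-variable transformations) lives outside the training code.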
@@ -156,7 +156,7 @@ The amount of pre-processing required will depend on what kind of datastore you

#### NpyFiles Datastore

-#### MLLAM Datastore
+#### MDP (mllam-data-prep) Datastore

An overview of how the different pre-processing steps, training and files depend on each other is given in this figure:
<p align="middle">
2 changes: 1 addition & 1 deletion tests/conftest.py
@@ -59,7 +59,7 @@ def download_meps_example_reduced_dataset():


 DATASTORES_EXAMPLES = dict(
-    mllam=(DATASTORE_EXAMPLES_ROOT_PATH / "mllam" / "danra.example.yaml"),
+    mdp=(DATASTORE_EXAMPLES_ROOT_PATH / "mdp" / "danra.example.yaml"),
     npyfiles=download_meps_example_reduced_dataset(),
 )

File renamed without changes.
