Linting
Sukhil Patel authored and committed on Jul 1, 2024
2 parents 6df425c + 2b59d1c commit d7a2034
Showing 1 changed file with 4 additions and 2 deletions.
6 changes: 4 additions & 2 deletions scripts/save_concurrent_batches.py
@@ -2,7 +2,7 @@
 Constructs batches where each batch includes all GSPs and only a single timestamp.
 Currently a slightly hacky implementation due to the way the configs are done. This script will use
-the same config file currently set to train the model. In the datamodule config file it is possible
+the same config file currently set to train the model. In the datamodule config file it is possible
 to set the batch_output_dir and number of train/val batches, they can also be overriden in the
 command as shown in the example below.
@@ -164,7 +164,9 @@ def main(config: DictConfig):
     with open(f"{config_dm.batch_output_dir}/datamodule.yaml", "w") as f:
         f.write(OmegaConf.to_yaml(config.datamodule))

-    shutil.copyfile(config_dm.configuration, f"{config_dm.batch_output_dir}/data_configuration.yaml")
+    shutil.copyfile(
+        config_dm.configuration, f"{config_dm.batch_output_dir}/data_configuration.yaml"
+    )

     dataloader_kwargs = dict(
         shuffle=False,
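The change above is purely cosmetic: the wrapped `shutil.copyfile` call is the same call as the one-line version it replaces, split across lines to satisfy the linter's line-length limit. A minimal standalone sketch of that copy step (the function name and the temporary-file setup here are illustrative, not part of the commit):

```python
import shutil
import tempfile
from pathlib import Path


def copy_data_configuration(configuration: str, batch_output_dir: str) -> Path:
    """Copy the data configuration file into the batch output directory,
    mirroring the shutil.copyfile call in the diff above."""
    dest = Path(batch_output_dir) / "data_configuration.yaml"
    # Functionally identical whether the call is written on one line or
    # wrapped across three; only the formatting differs.
    shutil.copyfile(configuration, dest)
    return dest


# Usage sketch with a throwaway source file.
with tempfile.TemporaryDirectory() as tmp:
    src = Path(tmp) / "config.yaml"
    src.write_text("batch_output_dir: /tmp/batches\n")
    out_dir = Path(tmp) / "out"
    out_dir.mkdir()
    copied = copy_data_configuration(str(src), str(out_dir))
    print(copied.name)  # data_configuration.yaml
```

`shutil.copyfile` copies file contents only (no metadata), which is all the script needs here since the destination name is fixed.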
