The nature of the library is that we:
1. Collect the source data
2. Generate a loader configuration, applying additional metadata (arbitrary channels and masks) providing initial access to the collected data
3. Use this data loader to produce usable application datasets for downstream applications (testing with IceNet and another internal application)
This issue will capture everything to do with developing step (3) of this process. To do this, consider the original icenet command for creating a so-called "network dataset" from a data loader configuration containing multiple source datasets and generated "other" data.
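For reference, that entry point in the existing icenet library is `icenet_dataset_create`; an invocation shaped roughly like `icenet_dataset_create <loader-configuration> <dataset-name>` (the argument layout here is assumed purely for illustration, not checked against the current CLI) takes the loader configuration and produces the cached network dataset.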
To approach this, we'll get that working against the original library with the new data loader configuration structure. Then we can drag the processing framework (most of which is agnostic to the underlying application implementation) over where it makes sense. We'd then end up with a similar command of our own.
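A hypothetical shape for that command, borrowing the `preprocess_dataset_create` naming from the task list below (the arguments are placeholders only): `preprocess_dataset_create <loader-configuration> <dataset-name>`.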
This would perform the exact same in/out processing, but back-reference the structure of the dataset to `generate_and_write` (which is responsible, as it currently is, for constructing the `(x, y, sample_weight)` tuple for every element iterable out of the data loader). We need to consider output types, insofar as we don't want ML-specific logic in here (e.g. `tf.data`, PyTorch Lightning or other framework-specific logic), so it may be that an assessment leaves this out; but at least the first of these should be working to close this issue, with an assessment done on working towards the second.
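As a rough sketch of the framework-agnostic shape this could take (the function signature, the loader's `generate_sample` method and the on-disk format below are assumptions for illustration, not the current implementation):

```python
import numpy as np

def generate_and_write(output_path, loader, dates):
    """Write one (x, y, sample_weight) sample per date from the data loader.

    Deliberately free of tf.data / PyTorch Lightning logic: samples are
    serialised as plain arrays and any ML framework wrapping happens downstream.
    """
    for date in dates:
        # Assumed loader interface: plain numpy arrays for a single sample.
        x, y, sample_weight = loader.generate_sample(date)
        np.savez_compressed(
            f"{output_path}/{date:%Y%m%d}.npz",
            x=x, y=y, sample_weight=sample_weight,
        )
```

Any `tf.data.Dataset` or PyTorch-specific input pipeline would then be layered on top of these serialised samples in application code, which is where the assessment of output types comes in.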
- Refactor IceNet (under this issue) to accept the new loader configuration methodology
- Ensure extraneous parameter specification is rationalised (remove time and space configuration parameters within network dataset generation, as these are predetermined earlier in the processing chain, unless there is a good reason to override them); a sketch of this idea follows the list
- Assess the creation of the `preprocess_dataset_create` type commands
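On the parameter rationalisation point, a minimal sketch of the idea, assuming a JSON loader configuration and illustrative key names (the real schema is whatever the loader generation step writes out): dataset creation reads the predetermined temporal and spatial extents back from the configuration instead of taking them as parameters again.

```python
import json

def extents_from_loader_config(config_path):
    """Hypothetical helper: recover predetermined time/space settings from a
    loader configuration so dataset creation need not re-specify them."""
    with open(config_path) as fh:
        config = json.load(fh)
    # Key names are illustrative only, not the actual configuration schema.
    return config["dates"], config["shape"]
```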