Allow subgroup argument selection via Enum #37
Hey @janvainer! Thank you for the kind words! I'm glad this tiny library has been helpful; the objective (well, at least mine) was exactly to provide composable configuration for ML purposes. Off the top of my head, I think it might be feasible by considering a Union of types. I'll give it a try soon though! In the meantime, I was using something like this for this very purpose:

```python
from enum import Enum

from argdantic import ArgParser


# imagine using torch optimizers;
# there should be a base class for this
class Optimizer:
    def __init__(self, name: str, lr: float):
        self.name = name
        self.lr = lr


class SGD(Optimizer):
    def __init__(self, lr: float):
        super().__init__("SGD", lr)


class Adam(Optimizer):
    def __init__(self, lr: float):
        super().__init__("Adam", lr)


# define an Enum where the values are the classes,
# or a partial(Class, fixed_arguments)
class Optimizers(Enum):
    sgd = SGD
    adam = Adam


cli = ArgParser()


# make the user select the class, then use the other
# arguments to define its input parameters
@cli.command()
def main(
    optimizer: Optimizers = Optimizers.sgd,
    lr: float = 0.01,
    epochs: int = 10,
    batch_size: int = 32,
):
    print(optimizer.value(lr))
    print(lr)
    print(epochs)
    print(batch_size)


if __name__ == "__main__":
    cli()
```

I agree that this is a workaround, but it at least allows this sort of mechanism to an extent.
Thank you for your response and the code! Yes, it is a bit cumbersome. Please keep me in the loop! ;) I am curious how this unfolds.
Is your feature request related to a problem? Please describe.
Hi, first of all, this project has great potential for ML project configuration! Well done <3!!!
There is one use case that is quite common in ML: you have two different sub-configurations and you want to easily switch between them while still being able to set certain sub-options. For example, consider an ML training script where you want to be able to select different optimizers:
Now it would be awesome to somehow specify which optimizer to initialize in the config and also be able to set some of its parameters.
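As an illustration of that use case, here is a minimal sketch using plain pydantic models and a `Union` field; the model names and fields are made up for illustration and are not part of argdantic:

```python
# Hypothetical sketch: two optimizer sub-configurations as pydantic models,
# switched via a Union field on the parent config.
from typing import Union

from pydantic import BaseModel


class SGDConfig(BaseModel):
    lr: float = 0.01
    momentum: float = 0.9


class AdamConfig(BaseModel):
    lr: float = 0.001


class Config(BaseModel):
    # switching between sub-configurations means swapping this one field
    optimizer: Union[SGDConfig, AdamConfig] = SGDConfig()
    epochs: int = 10


cfg = Config(optimizer=AdamConfig(lr=0.0003))
print(type(cfg.optimizer).__name__)  # AdamConfig
```

This captures the data-model side; the open question in this issue is how a CLI should expose the choice between the Union members and their sub-options.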
Describe the solution you'd like
How the CLI should look:
There may be an issue regarding naming of the arguments based on the Union type. Instead, perhaps enums could be used:
The CLI would have to check that the enum value itself is a BaseModel and treat it as a nested configuration node to be displayed and made available in the terminal.
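That check could look roughly like this; a sketch only, with the enum and model names invented for illustration, assuming pydantic:

```python
# Sketch of the proposed mechanism: an Enum whose values are BaseModel
# subclasses, plus the introspection check a CLI would need to perform.
from enum import Enum

from pydantic import BaseModel


class SGDConfig(BaseModel):
    lr: float = 0.01


class AdamConfig(BaseModel):
    lr: float = 0.001


class Optimizers(Enum):
    sgd = SGDConfig
    adam = AdamConfig


def is_nested_config(member: Enum) -> bool:
    """Return True if the enum value is a BaseModel subclass, i.e. a
    nested configuration node the CLI should expand into sub-arguments."""
    return isinstance(member.value, type) and issubclass(member.value, BaseModel)


print(all(is_nested_config(m) for m in Optimizers))  # True
# selecting a member and instantiating its sub-configuration:
print(Optimizers.sgd.value(lr=0.1))
```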
Describe alternatives you've considered
Hydra allows this kind of sub-grouping via subfolders; simple-parsing solves it via a `subgroups` type. Unfortunately, neither of them is built on pydantic, so the user has to take care of validation themselves. WDYT about the feature? It would allow quite complex configurations, for example:
Edit: after a bit of thought, perhaps the Annotated type could be better suited 🤔 It would be something like
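A sketch of what that might look like; the `Subgroups` marker here is purely hypothetical, invented to show where CLI metadata could live inside the `Annotated` type, and assumes pydantic ignores `Annotated` metadata it does not recognize:

```python
# Hypothetical sketch of the Annotated idea: the Union carries the allowed
# sub-models, and extra metadata (the made-up Subgroups marker, not a real
# argdantic API) could tell the CLI to expose them as selectable subgroups.
from typing import Annotated, Union

from pydantic import BaseModel


class SGD(BaseModel):
    lr: float = 0.01


class Adam(BaseModel):
    lr: float = 0.001


class Subgroups:
    """Hypothetical CLI marker; exists only for this illustration."""


class Config(BaseModel):
    optimizer: Annotated[Union[SGD, Adam], Subgroups()] = SGD()
    epochs: int = 10


# plain model instances work directly, no enum import needed
cfg = Config(optimizer=SGD(lr=0.1))
print(cfg.optimizer.lr)  # 0.1
```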
The advantage is that this can be used outside of the CLI world without issues: one would be able to initialize `Config` without needing to import the enum class. It would simply be `Config(optimizer=SGD(0.1))` instead of `Config(optimizer=Optimizers.SGD)`. Another advantage is that it would be possible to pass an optimizer config that is not pre-defined in the Enum.