[FEATURE] validate & simplify hyperparameter space #60

Open
timruhkopf opened this issue Nov 19, 2021 · 4 comments

@timruhkopf

Is your feature request related to a problem? Please describe.
When writing models for the interface, declaring the hyperparameter space is tedious.

Describe the solution you'd like
To validate that the model works appropriately under the search space, it would be nice to be able to sanity-check the hyperparameter space.

Describe alternatives you've considered
A concise way of declaring complex hyperparameter spaces is the ConfigSpace package.

It also allows sampling from the space and declaring distributions explicitly.
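
For illustration, here is a minimal sketch of what that could look like with ConfigSpace (the hyperparameter names and ranges are made up; the construction, validation and sampling calls are standard ConfigSpace API):

import ConfigSpace as CS
import ConfigSpace.hyperparameters as CSH

# Declare the space once; names, types and bounds are checked on construction.
cs = CS.ConfigurationSpace(seed=0)
cs.add_hyperparameters([
    CSH.UniformIntegerHyperparameter("num_layers", lower=2, upper=4),
    CSH.UniformFloatHyperparameter("lr", lower=1e-4, upper=1e-1, log=True),
    CSH.CategoricalHyperparameter("act", choices=["relu", "tanh"]),
])

# Sampling gives quick configurations for smoke-testing the model ...
print(cs.sample_configuration().get_dictionary())

# ... and constructing a Configuration by hand validates it against the space.
CS.Configuration(cs, values={"num_layers": 3, "lr": 1e-3, "act": "relu"})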

Additional context

@timruhkopf timruhkopf added the enhancement New feature or request label Nov 19, 2021
@Frozenmad
Contributor

Thanks for your advice!
Indeed, we are also preparing to construct a unified search space for both neural architecture search and hyperparameter optimization.
We'll consider making the interface more concise (e.g., by using ConfigSpace) in a future release!

@timruhkopf
Author

Really nice! I look forward to it.

On that note, I have a related question about working with the current version:
The search space tutorial is rather short. Could you please elaborate on the numerical list parameter? I am not entirely sure about its purpose and usage.

@general502570
Contributor

general502570 commented Nov 20, 2021

Thanks for your advice!

A numerical list is a series of variables in a numerical search space, e.g., when you want to search for the dimensions of multiple layers.

You can either assign a fixed length to the list, in which case you do not need to provide cutPara and cutFunc, or you can let HPO cut the list to a length that depends on other parameters (e.g., when you also want to search the number of layers at the same time as their dimensions). In that case you provide those parameters' names in cutPara and the function that computes the cut length in cutFunc. You can assign minValue and maxValue as either a list or a single number. Here is an example:

# Wrapped in a list to make the snippet self-contained (the variable name is illustrative):
space = [
    {
        "parameterName": "layers",
        "type": "INTEGER",
        "minValue": 2,
        "maxValue": 4,
        "scalingType": "LINEAR"
    },
    {
        "parameterName": "dimension",
        "type": "NUMERICAL_LIST",
        "numericalType": "INTEGER",
        "length": 4,               # maximum length before cutting
        "cutPara": "layers",       # the cut length depends on the sampled "layers" value
        "cutFunc": lambda x: x,    # identity: keep exactly "layers" entries
        "minValue": 16,
        "maxValue": 128,
        "scalingType": "LOG"
    }
]
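
To make the cutting behaviour concrete, here is a small illustrative sketch (not AutoGL's internal code; the sampled values are made up) of how cutPara and cutFunc act on a sampled configuration:

# Suppose HPO sampled layers = 3 and a full-length dimension list of 4 entries.
sampled = {"layers": 3, "dimension": [64, 32, 128, 16]}

cut_func = lambda x: x                    # the cutFunc from the space above
cut_len = cut_func(sampled["layers"])     # -> 3
dimensions = sampled["dimension"][:cut_len]
print(dimensions)                         # [64, 32, 128]: one dimension per layer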

@timruhkopf
Author

Thinking a little further and investigating your code: going in the direction of ConfigSpace would also allow you to specify prior beliefs about the distribution of parameters, if you wanted to go Bayesian. This could alleviate specifying the parameters' distributions in the autogl.module.hpo.suggestion..get_new_suggestions method (e.g., the uniform sampling in random search).
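
As a hypothetical sketch of that idea (the parameter names are made up, and this is plain ConfigSpace rather than the autogl API), declaring the distributions in the space lets a random-search style suggestion step reduce to simple sampling:

import ConfigSpace as CS
import ConfigSpace.hyperparameters as CSH

cs = CS.ConfigurationSpace(seed=0)
cs.add_hyperparameters([
    # Log-uniform prior on the learning rate instead of hand-rolled uniform sampling.
    CSH.UniformFloatHyperparameter("lr", lower=1e-4, upper=1e-1, log=True),
    # Normal prior encoding the belief that dropout around 0.5 is most plausible.
    CSH.NormalFloatHyperparameter("dropout", mu=0.5, sigma=0.1),
])

# A batch of suggestions is then just a batch of samples from the declared priors.
suggestions = [c.get_dictionary() for c in cs.sample_configuration(size=5)]
print(suggestions[0])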

I do see, however, that it is convenient for you to adopt the Advisor syntax for hyperparameter spaces, as you provide a compatibility bridge and partially rely on their models.

Whichever way you go, it's an awesome package you've got there :)
