Feature/Tutorial Request: Hyperparameter tuning #257
Comments
Are there specific hyperparameter tuning methods you'd like to see covered?
Mostly just a gradient method for the continuous parameters. Grid search should be fine for the discrete hyperparameters, given there are only one or two.
Could you clarify the nature of the hyperparameter search you're envisioning? I'm not clear how a gradient method could be applied here, as an EvoTrees loss function isn't differentiable with respect to its hyperparameters. Perhaps you're referring to applying a gradient method to eval-metric outcomes to inform the next hyperparameter candidate to test?
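Something along these lines, treating the eval metric as a black-box objective in the continuous hyperparameter? A rough sketch, assuming EvoTrees' current `fit_evotree` keyword API and synthetic placeholder data; since the objective has no analytic gradient, I'm substituting a derivative-free univariate method (Brent's, via Optim.jl) where the "gradient method" would sit:

```julia
using EvoTrees, Optim, Random, Statistics

Random.seed!(123)
x_train, y_train = randn(800, 5), randn(800)   # placeholder data
x_eval,  y_eval  = randn(200, 5), randn(200)

# Held-out MSE as a black-box function of the learning rate `eta`,
# searched on a log scale since eta spans orders of magnitude.
function objective(log_eta)
    config = EvoTreeRegressor(eta=exp(log_eta), max_depth=5, nrounds=100)
    model = fit_evotree(config; x_train, y_train)
    return mean((model(x_eval) .- y_eval) .^ 2)   # fitted models are callable for prediction
end

# Brent's method: derivative-free univariate minimization over a bracket
res = optimize(objective, log(1e-3), log(0.5))
best_eta = exp(Optim.minimizer(res))
```

In practice that objective is noisy (it shifts with the data split and seed), which is presumably why most tooling sticks to black-box strategies like random search or Bayesian optimization rather than anything gradient-flavored.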
Whoops, this is supposed to be in EvoLinear.jl 😅 (Although, I thought the loss was differentiable with respect to …)
Even in the context of EvoLinear, I'm not seeing how a gradient method would apply to hyperparameter tuning.
Grad student descent is definitely not fun, so it would be very nice to have a way to tune hyperparameters efficiently, and a tutorial on how to do this. (MLJTuning.jl lets you do it in theory, but only provides a handful of black-box optimizers like random or grid search.)
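For concreteness, the MLJTuning route currently looks something like this. The data, ranges, and grid resolution below are placeholders, and I'm assuming the standard `TunedModel` API:

```julia
using MLJ

# Pull the EvoTrees regressor from MLJ's model registry
EvoTree = @load EvoTreeRegressor pkg=EvoTrees verbosity=0
model = EvoTree()

# One continuous and one discrete hyperparameter to tune
r_eta   = range(model, :eta, lower=1e-3, upper=0.5, scale=:log)
r_depth = range(model, :max_depth, values=[4, 6, 8])

tuned = TunedModel(
    model      = model,
    tuning     = Grid(resolution=8),   # one of the black-box strategies
    resampling = CV(nfolds=5),
    range      = [r_eta, r_depth],
    measure    = rmse,
)

X, y = make_regression(500, 5)   # synthetic stand-in data

mach = machine(tuned, X, y)
fit!(mach, verbosity=0)
report(mach).best_model
```

A gradient-aware strategy would presumably slot in as just another `tuning=` backend, which is exactly the part that's missing today.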