Hello! I am trying to find a way to tune the "optimizer" and "learning_rate" hyperparameters in the same hypermodel. I have not found a solution for this case. Could you please tell me whether it is possible at all and, if so, how to do it? I tried to solve it as shown below, but it does not work. Thanks in advance!
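A minimal sketch of how this is commonly handled inside the model-building function, assuming the standard keras_tuner API (the build_model name, the tiny architecture, and the optimizer list are illustrative placeholders, not the attempt referred to above). Both values come from the same hp object, and the chosen optimizer class is instantiated with the sampled learning rate:

import tensorflow as tf

def build_model(hp):
    # Illustrative two-layer model; the tuned parts are the optimizer and the learning rate.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(hp.Int("units", 32, 256, step=32), activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

    # Both hyperparameters are registered on the same hp object.
    optimizer_name = hp.Choice("optimizer", ["adam", "sgd", "rmsprop"])
    learning_rate = hp.Float("learning_rate", 1e-4, 1e-2, sampling="log")

    # Instantiate the chosen optimizer class with the sampled learning rate.
    if optimizer_name == "adam":
        optimizer = tf.keras.optimizers.Adam(learning_rate=learning_rate)
    elif optimizer_name == "sgd":
        optimizer = tf.keras.optimizers.SGD(learning_rate=learning_rate)
    else:
        optimizer = tf.keras.optimizers.RMSprop(learning_rate=learning_rate)

    model.compile(optimizer=optimizer,
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

The point of the if/elif is that hp.Choice returns a plain string, so the learning rate can be passed to whichever optimizer class is selected for that trial; passing the string straight to model.compile would not let you set the learning rate.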
For now, you will have to override the run_trial method in the Tuner class to tune that.
Later we may add a shortcut for tuning that.
We may also need to add a tutorial for this.
How do you do that?
I have this code from #122 that tunes the batch size, but I also want to tune the learning rate:
import keras_tuner as kt

class MyTuner(kt.tuners.Hyperband):
    def run_trial(self, trial, *args, **kwargs):
        # Additional HyperParameters for preprocessing and custom training loops
        # can be added by overriding `run_trial`.
        hp = trial.hyperparameters
        kwargs["batch_size"] = hp.Int("batch_size", 32, 256, step=32)
        # kwargs["epochs"] = hp.Int("epochs", 10, 30)
        if "tuner/epochs" in hp.values:
            kwargs["epochs"] = hp.values["tuner/epochs"]
            kwargs["initial_epoch"] = hp.values["tuner/initial_epoch"]
        return super(MyTuner, self).run_trial(trial, *args, **kwargs)
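One way to combine the two (a sketch, not an official recipe): keep the batch_size logic in the run_trial override above and tune the learning rate, and optionally the optimizer, in the model-building function passed to the tuner, since both end up in the same HyperParameters object for each trial. The build_model name, directory, project name, and the commented-out search call below are illustrative:

import keras_tuner as kt

# batch_size (and the Hyperband bracket epochs) come from the run_trial
# override above; learning_rate and the optimizer choice come from the
# model-building function, e.g. the build_model sketch earlier in this thread.
tuner = MyTuner(
    build_model,                    # hypothetical model-building function
    objective="val_accuracy",
    max_epochs=30,
    directory="tuning_dir",         # illustrative paths and names
    project_name="lr_and_batch_size",
)
# tuner.search(x_train, y_train, validation_split=0.2)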