Tuning optimizers and learning rate simultaneously #553

Open
LenarNizamov opened this issue Jun 16, 2021 · 3 comments
Labels: documentation (Improvements or additions to documentation)

Comments

@LenarNizamov

LenarNizamov commented Jun 16, 2021

Hello! I am trying to find a way to tune the "optimizer" and "learning_rate" hyperparameters in the same hypermodel, but I could not find a solution for this case. Could you please tell me whether it is possible at all and, if so, how to do it? I tried the approach shown below, but it does not work. Thanks in advance!

from kerastuner import HyperModel
import kerastuner as kt
import keras as kr

class MyHyperModel(HyperModel):

    def __init__(self, num_classes, hp):
        self.num_classes = num_classes

    def build(self, hp):
        model = kr.Sequential()                
        model.add(kr.layers.Dense(units=hp.Int('units_',
                                               min_value=30,
                                               max_value=70,
                                               step=10),
                                  kernel_initializer=hp.Choice('kernel_initializer_', ['random_normal', 'random_uniform', 'zeros']),
                                  bias_initializer=hp.Choice('bias_initializer_', ['random_normal', 'random_uniform', 'zeros']),
                                  activation=hp.Choice('activation_', ['relu', 'sigmoid', 'tanh'])))
                                                       
        model.add(kr.layers.Dense(self.num_classes,
                                  kernel_initializer=hp.Choice('kernel_initializer_', ['random_normal', 'random_uniform', 'zeros']),
                                  bias_initializer=hp.Choice('bias_initializer_', ['random_normal', 'random_uniform', 'zeros'])))
                   
        model.compile(loss = hp.Choice('loss_', ['mse', 'mae', 'msle']), metrics=['mae'])
        
        return model


hps = kt.HyperParameters()

hps.learning_rate = hps.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])

hps.optimizer = hps.Choice('optimizers_',['adam','RMSprop','SGD'])

hypermodel = MyHyperModel(num_classes=1, hp = hps)
   
tuner = kt.RandomSearch(
    hypermodel,
    objective='mae',
    executions_per_trial = 5,
    max_trials = 1000,
    directory='my_dir',
    project_name='HeatPrediction', overwrite=True)

stop_early = kr.callbacks.EarlyStopping(monitor='mae', patience=5)
tuner.search_space_summary()
@haifeng-jin
Collaborator

For now, you will have to override the run_trial method in the Tuner class to tune that.
Later we may add a shortcut for tuning that.
We may also need to add a tutorial for this.
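
(For reference, a minimal sketch of the alternative pattern of declaring both choices directly inside build(hp) and constructing the optimizer there, assuming the goal is simply to tune the optimizer type and learning rate rather than to reuse a HyperParameters object defined outside the model. The build_model name, the value ranges, and the tf.keras import below are illustrative assumptions, not code from this thread.)

from tensorflow import keras

def build_model(hp):
    # Sketch only: hyperparameter names and ranges are illustrative.
    model = keras.Sequential()
    model.add(keras.layers.Dense(units=hp.Int('units', min_value=30, max_value=70, step=10),
                                 activation='relu'))
    model.add(keras.layers.Dense(1))

    # Tune the learning rate and the optimizer type together.
    lr = hp.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])
    optimizer_name = hp.Choice('optimizer', values=['adam', 'rmsprop', 'sgd'])
    if optimizer_name == 'adam':
        optimizer = keras.optimizers.Adam(learning_rate=lr)
    elif optimizer_name == 'rmsprop':
        optimizer = keras.optimizers.RMSprop(learning_rate=lr)
    else:
        optimizer = keras.optimizers.SGD(learning_rate=lr)

    model.compile(optimizer=optimizer, loss='mse', metrics=['mae'])
    return model

With this pattern the optimizer and learning rate appear in the search space automatically, because every hp.Choice call inside build(hp) registers a hyperparameter for the trial.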

@haifeng-jin added the documentation label Aug 12, 2021
@LenarNizamov
Author

Ok, thank you!

@EnriqueGautoSand

How do you do that?
I have the code below from #122 that tunes the batch size, but I also want to tune the learning rate.

class MyTuner(kt.tuners.Hyperband):
  def run_trial(self, trial, *args, **kwargs):
    # You can add additional HyperParameters for preprocessing and custom training loops
    # via overriding `run_trial`
    hp = trial.hyperparameters
    kwargs['batch_size'] = trial.hyperparameters.Int('batch_size', 32, 256, step=32)
    #kwargs['epochs'] = trial.hyperparameters.Int('epochs', 10, 30)
    if "tuner/epochs" in hp.values:
        kwargs["epochs"] = hp.values["tuner/epochs"]
        kwargs["initial_epoch"] = hp.values["tuner/initial_epoch"]
    return super(MyTuner, self).run_trial(trial, *args, **kwargs)

> For now, you will have to override the run_trial method in the Tuner class to tune that. Later we may add a shortcut for tuning that. We may also need to add a tutorial for this.
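
(A hedged sketch of how the two could be combined: keep the MyTuner.run_trial override above for batch_size, and let the model-building function own the learning-rate choice, as in the earlier sketch in this thread. The build_model name, value ranges, and the commented-out search call are illustrative assumptions, not code from this thread.)

from tensorflow import keras
import kerastuner as kt

def build_model(hp):
    # The learning rate is tuned here, where the optimizer is created;
    # the batch size is handled by MyTuner.run_trial above.
    model = keras.Sequential()
    model.add(keras.layers.Dense(hp.Int('units', 32, 256, step=32), activation='relu'))
    model.add(keras.layers.Dense(1))
    lr = hp.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])
    model.compile(optimizer=keras.optimizers.Adam(learning_rate=lr),
                  loss='mse', metrics=['mae'])
    return model

tuner = MyTuner(build_model,
                objective='val_mae',
                max_epochs=30,
                directory='my_dir',
                project_name='batch_size_and_lr')
# tuner.search(x_train, y_train, validation_data=(x_val, y_val))  # training data assumed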
