How to tune the number of epochs and batch_size? #122
@ogreyesp Thanks for the issue! This comment was updated by @haifeng-jin because it was out of date. This is barebones code for tuning the batch size:

```python
import keras_tuner as kt
from tensorflow import keras
from tensorflow.keras import layers


class MyHyperModel(kt.HyperModel):
    def build(self, hp):
        model = keras.Sequential()
        model.add(layers.Flatten())
        model.add(
            layers.Dense(
                units=hp.Int("units", min_value=32, max_value=512, step=32),
                activation="relu",
            )
        )
        model.add(layers.Dense(10, activation="softmax"))
        model.compile(
            optimizer="adam",
            loss="categorical_crossentropy",
            metrics=["accuracy"],
        )
        return model

    def fit(self, hp, model, *args, **kwargs):
        return model.fit(
            *args,
            batch_size=hp.Choice("batch_size", [16, 32]),
            **kwargs,
        )


tuner = kt.RandomSearch(
    MyHyperModel(),
    objective="val_accuracy",
    max_trials=3,
    overwrite=True,
    directory="my_dir",
    project_name="tune_hypermodel",
)
```

For epochs specifically, I'd alternatively recommend using early stopping during training by passing in an `EarlyStopping` callback:

```python
# Will stop training if the "val_loss" hasn't improved in 3 epochs.
tuner.search(x, y, epochs=30, callbacks=[tf.keras.callbacks.EarlyStopping('val_loss', patience=3)])
```

For n-fold cross validation, you can also just do it in `HyperModel.fit`.
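For the cross-validation route, something along these lines should work. This is a sketch only: it assumes scikit-learn's `KFold` and NumPy arrays for `x` and `y`, and the class name `CVHyperModel` is just illustrative.

```python
import numpy as np
import keras_tuner as kt
from sklearn.model_selection import KFold
from tensorflow import keras
from tensorflow.keras import layers


class CVHyperModel(kt.HyperModel):
    def build(self, hp):
        model = keras.Sequential([
            layers.Flatten(),
            layers.Dense(hp.Int("units", 32, 512, step=32), activation="relu"),
            layers.Dense(10, activation="softmax"),
        ])
        model.compile(
            optimizer="adam",
            loss="categorical_crossentropy",
            metrics=["accuracy"],
        )
        return model

    def fit(self, hp, model, x, y, **kwargs):
        # Train a fresh model per fold and return the mean validation
        # accuracy; the tuner reads the objective from the returned dict.
        scores = []
        for train_idx, val_idx in KFold(n_splits=5, shuffle=True).split(x):
            fold_model = self.build(hp)
            fold_model.fit(
                x[train_idx],
                y[train_idx],
                batch_size=hp.Choice("batch_size", [16, 32]),
                **kwargs,
            )
            _, acc = fold_model.evaluate(x[val_idx], y[val_idx], verbose=0)
            scores.append(acc)
        return {"val_accuracy": float(np.mean(scores))}
```

With this, `objective="val_accuracy"` in the tuner picks up the averaged score, so trials are ranked by cross-validated accuracy rather than a single split.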
Thanks @omalleyt12. Your response is very helpful.
This project is very important and useful for me. However, the lack of documentation and tutorials is hampering its use. For example, how can I determine the best subset of hyperparameters by conducting cross-validation?
Please see the pending PR here with a tutorial: #136
Is it possible to do tuning without creating a class?
Thanks for the explanation on batch size. However, when I retrieve the parameters of the best model with `tuner.get_best_hyperparameters()[0]` and look at the values through `.get_config()["values"]`, the `batch_size` is not listed there.
@omalleyt12 @VincBar Was this issue resolved? I'm using KerasTuner for epochs and batch_size right now, too, and I'm not very keen to have invisible results after 10 hours of running.
@tolandwehr Hey, I don't know if the direct way is solved, but I worked around it by including the batch size hyperparameter in the hypermodel, saving it to `self.batch_size` (or, in my case, actually a dictionary with some other stuff), and defining a fit function in my hypermodel that then takes this (and whatever else the fit might need); see the sketch below.
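Roughly like this. It is only a sketch of the workaround described above; the layer sizes and the regression head are placeholders, so adapt `build()` to your actual model.

```python
import keras_tuner as kt
from tensorflow import keras
from tensorflow.keras import layers


class BatchSizeHyperModel(kt.HyperModel):
    def build(self, hp):
        # Register batch_size during build and stash it on the instance
        # so the custom fit method can pick it up later.
        self.batch_size = hp.Choice("batch_size", [16, 32, 64])
        model = keras.Sequential([
            layers.Dense(hp.Int("units", 32, 128, step=32), activation="relu"),
            layers.Dense(1),
        ])
        model.compile(optimizer="adam", loss="mse")
        return model

    def fit(self, hp, model, x, y, **kwargs):
        # Use the batch size chosen for this trial during build().
        return model.fit(x, y, batch_size=self.batch_size, **kwargs)
```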
@VincBar Sounds interesting. Could you share the code, if it's still available?
Other issue: I got a NaN/Inf error after some hours of iterations, which is strange, because I double-checked the dataset and there were no NaNs.
I would like to use the Bayesian optimization tuner to tune epochs and batch size for a BLSTM model. My data is passed in using a custom data generator, which takes the batch size as input. How do I use the Keras Tuner in this case?
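One way that should work is to build the generators inside a `HyperModel.fit` override. This is a sketch only: `MyDataGenerator` is a placeholder for your own generator class that accepts `batch_size`, and the BLSTM here is deliberately minimal.

```python
import keras_tuner as kt
from tensorflow import keras
from tensorflow.keras import layers


class GeneratorHyperModel(kt.HyperModel):
    def build(self, hp):
        model = keras.Sequential([
            layers.Bidirectional(layers.LSTM(hp.Int("units", 32, 128, step=32))),
            layers.Dense(1),
        ])
        model.compile(optimizer="adam", loss="mse")
        return model

    def fit(self, hp, model, train_files, val_files, **kwargs):
        batch_size = hp.Choice("batch_size", [32, 64, 128])
        # MyDataGenerator is a placeholder for your own generator/Sequence
        # class that takes batch_size in its constructor.
        train_gen = MyDataGenerator(train_files, batch_size=batch_size)
        val_gen = MyDataGenerator(val_files, batch_size=batch_size)
        return model.fit(train_gen, validation_data=val_gen, **kwargs)


tuner = kt.BayesianOptimization(
    GeneratorHyperModel(),
    objective="val_loss",
    max_trials=10,
)
tuner.search(train_files, val_files, epochs=30)
```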
hello @ogreyesp, |
I used the following code to optimise the number of epochs and batch size:
Now I want to save the number of epochs and batch size for the best trial that the tuner found. I tried using the following code suggested by @fredshu, but I could not get it working:
How is 'best_trial' defined? And how do I save the number of epochs and batch size of the best trial to separate variables? If possible, I would also like to save the other optimised hyperparameters.
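If `batch_size` and `epochs` were registered as hyperparameters (e.g. via `hp.Choice`/`hp.Int` inside a `fit` or `run_trial` override), something like this should read them back after the search. A sketch, not tested against your exact setup:

```python
# Best hyperparameter set found by the search.
best_hp = tuner.get_best_hyperparameters(num_trials=1)[0]
best_batch_size = best_hp.get("batch_size")
best_epochs = best_hp.get("epochs")

# Alternatively, go through the oracle's best trial.
best_trial = tuner.oracle.get_best_trials(num_trials=1)[0]
print(best_trial.hyperparameters.values)  # dict of all tuned values
```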
I am new to Keras and TensorFlow. I want to simultaneously explore the number of epochs and the CV for my project. Can you please help me write a custom Tuner?
@saranyaprakash2012 Can anyone give a code snippet that does that?

Regarding the above example, @omalleyt12: I mean that in the log the Keras Tuner printed as if the batch size was taken into consideration, but the actual log also showed that the training generator ignored the Keras Tuner batch_size and just took a predefined value. Examples:

```
2/2 [==============================]
```

but in the hyperparameters of the tuner it showed:

```
Hyperparameter | Value | Best Value So Far
learning_rate  | 0.5   | 0.5
decay          | 0.01  | 0.01
momentum       | 0     | 0
batch_size     | 2     | 1
```
Try something like this:

```python
# create a model-building function
def create_hypermodel(hp):
    ...

# Tuner class
class MyTuner2(BayesianOptimization):
    ...
```
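One common version of this pattern injects `batch_size` and `epochs` in `run_trial` before delegating to the base class. A sketch only: the ranges are arbitrary, and `create_hypermodel` is the build function mentioned above.

```python
import keras_tuner as kt


class MyTuner2(kt.BayesianOptimization):
    def run_trial(self, trial, *args, **kwargs):
        # Pick trial-specific values and forward them to model.fit()
        # through the base class's run_trial.
        kwargs["batch_size"] = trial.hyperparameters.Choice("batch_size", [16, 32, 64])
        kwargs["epochs"] = trial.hyperparameters.Int("epochs", 10, 30, step=10)
        return super().run_trial(trial, *args, **kwargs)


tuner = MyTuner2(
    hypermodel=create_hypermodel,
    objective="val_loss",
    max_trials=10,
)
```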
Could you make it clearer what code goes into what block? Thanks!
This guide is out of date. |
I had some problems with the version below. Namely, I couldn't make it run with a custom objective. I added the return statement and that fixed it.
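For reference, the fix amounts to making sure the `fit` override actually returns what `model.fit()` produces, since the tuner reads the (custom) objective from that return value. A minimal sketch:

```python
import keras_tuner as kt


class MyHyperModel(kt.HyperModel):
    def fit(self, hp, model, *args, **kwargs):
        # The `return` matters: the tuner reads the objective from the
        # returned History object (or a metrics dict / float).
        return model.fit(
            *args,
            batch_size=hp.Choice("batch_size", [16, 32]),
            **kwargs,
        )
```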
Curious - is this considered the proper approach for tuning `batch_size`?
Yes, this is the official recommended approach. Thanks
I am also trying to tune the batch size.
But in the search space I got this:
Here is the first trial:
Shouldn't the batch size appear both in the search space and in the trial report? How do I know which batch size is being used?
Hi,
How can I tune the number of epochs and batch size?
The provided examples always assume fixed values for these two hyperparameters.