I am using the Keras Tuner package and trying to do hyperparameter tuning following the example explained here: https://www.tensorflow.org/tutorials/keras/keras_tuner. The code works fine the first time, but when I try to run it a second or third time I run into problems.
tuner.search(X_train, Y_train, epochs=50, validation_split=0.2, callbacks=[stop_early])
# Get the optimal hyperparameters
best_hps=tuner.get_best_hyperparameters(num_trials=1)[0]
print(f"""
The hyperparameter search is complete. The optimal number of units in the first densely-connected
layer is {best_hps.get('units')} and the optimal learning rate for the optimizer
is {best_hps.get('learning_rate')}.
""")
After the second execution the search does not start again; it just shows me the result from the previous run.
INFO:tensorflow:Oracle triggered exit
The hyperparameter search is complete. The optimal number of units in the first densely-connected
layer is 128 and the optimal learning rate for the optimizer
is 0.001.
So any idea how to solve this problem?
Keras Tuner saves checkpoints in a directory on GCS or on your local disk. This is meant to let you resume the search later. Since your search already completed in the previous run, running the search again will not do anything. You have to delete that directory first to restart the search.
In your example, before the tuner search you will have the following:
tuner = kt.Hyperband(model_builder,
                     objective='val_accuracy',
                     max_epochs=10,
                     factor=3,
                     directory='my_dir',
                     project_name='intro_to_kt')
That is the directory to delete.
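If you want to clear it from the script itself rather than by hand, a minimal sketch (assuming the directory and project_name from the example above) could look like this:

import os
import shutil

# Path created by the tuner above: directory='my_dir', project_name='intro_to_kt'
tuner_dir = os.path.join('my_dir', 'intro_to_kt')

# Remove the saved trials so the next tuner.search() starts from scratch
if os.path.isdir(tuner_dir):
    shutil.rmtree(tuner_dir)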
Next time, to delete it automatically before you start, you can change that code to:
tuner = kt.Hyperband(model_builder,
                     objective='val_accuracy',
                     max_epochs=10,
                     factor=3,
                     directory='my_dir',
                     project_name='intro_to_kt',
                     # if True, overwrite the above directory when search is run again - i.e. don't resume
                     overwrite=True)
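Conversely, if you only want to read back the results of the already completed search without re-running it, you can leave overwrite at its default (False) and skip the search call entirely; a sketch along those lines, reusing the same directory and project_name:

# Re-create the tuner pointing at the existing directory; the finished trials are reloaded
tuner = kt.Hyperband(model_builder,
                     objective='val_accuracy',
                     max_epochs=10,
                     factor=3,
                     directory='my_dir',
                     project_name='intro_to_kt')

# No tuner.search() needed - the best hyperparameters come from the saved trials
best_hps = tuner.get_best_hyperparameters(num_trials=1)[0]
print(best_hps.get('units'), best_hps.get('learning_rate'))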