How can I tune the choice of optimizer with Keras Tuner? I want to try SGD, Adam, and RMSprop.
I tried:
hp_lr = hp.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])
hp_optimizer = hp.Choice('optimizer', values=[SGD(learning_rate=hp_lr), RMSprop(learning_rate=hp_lr), Adam(learning_rate=hp_lr)])
model.compile(optimizer=hp_optimizer,
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
but this doesn't work, failing with "A Choice can contain only one type of value".
Probably the best way is to do something like this:
from tensorflow.keras.optimizers import SGD, RMSprop, Adam

hp_optimizer = hp.Choice('optimizer', values=['sgd', 'rmsprop', 'adam'])
if hp_optimizer == 'sgd':
    optimizer = SGD(learning_rate=hp_lr)
elif hp_optimizer == 'rmsprop':
    optimizer = RMSprop(learning_rate=hp_lr)
elif hp_optimizer == 'adam':
    optimizer = Adam(learning_rate=hp_lr)
else:
    raise
model.compile(optimizer=optimizer,
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
Obviously you'd want a more descriptive exception (or just leave the bare raise, since that branch should never be reached anyway). Even if the different optimizers were instances of the same class, IIRC hp.Choice only accepts ints, floats, bools, and strings, so I don't see a way around doing it like this.
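If you don't like the if/elif chain, the same string-to-optimizer selection can be written as a dictionary dispatch. A minimal sketch of that pattern, using plain stand-in classes (hypothetical, so it runs without TensorFlow; in real code the dict would map to tf.keras.optimizers.SGD / RMSprop / Adam):

```python
# Stand-in optimizer classes so this sketch runs without TensorFlow.
# In real code, import SGD, RMSprop, Adam from tensorflow.keras.optimizers.
class SGD:
    def __init__(self, learning_rate):
        self.learning_rate = learning_rate

class RMSprop(SGD):
    pass

class Adam(SGD):
    pass

# Map each hp.Choice string to its optimizer class.
OPTIMIZERS = {'sgd': SGD, 'rmsprop': RMSprop, 'adam': Adam}

def make_optimizer(name, learning_rate):
    """Look up the optimizer class for `name` and instantiate it."""
    try:
        cls = OPTIMIZERS[name]
    except KeyError:
        raise ValueError(f"unknown optimizer {name!r}")
    return cls(learning_rate=learning_rate)

opt = make_optimizer('adam', 1e-3)
print(type(opt).__name__)  # Adam
```

The dict version also gives you a natural place for the descriptive error: a KeyError on an unknown name becomes a ValueError that names the bad choice.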