python, keras, neural-network, tf.keras

Training multiple models in series in Keras for hyperparameter optimization


The idea is to train multiple models on the same training dataset, changing some parameter each time in order to see which setting works best. For this to be meaningful, every model needs to be trained from scratch each time.

My current code (simplified) is:

scores = []

for i in range(n):
    model = Sequential()
    model.add(...)
    model.compile(...)
    model.fit(...)
    scores.append([i, model.score(...)])

for score in scores:
    print(score)

It runs as expected, printing:

[0, 0.89712456798]
[1, 0.76652347349]
[2, 0.83178943210]
...

but I can't tell whether the code does what is described above or whether, on the contrary, each model's training depends on the previous one.


Solution

  • Each time you call

    model = Sequential()
    

    your model is re-initialized, so the code sketch above does indeed do what you want: it fits a new model from scratch on each loop iteration.
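
    If you want to make the varied hyperparameter explicit, here is a minimal self-contained sketch. The toy data, the hidden-layer width being varied, and the use of evaluate() in place of the score(...) placeholder are all assumptions for illustration, not part of your original code.

    import numpy as np
    from tensorflow import keras

    # Hypothetical toy data standing in for your training/validation sets.
    x_train = np.random.rand(1000, 20)
    y_train = np.random.randint(0, 2, size=(1000,))
    x_val = np.random.rand(200, 20)
    y_val = np.random.randint(0, 2, size=(200,))

    scores = []

    # Vary one hyperparameter per run, e.g. the hidden-layer width.
    for i, units in enumerate([16, 32, 64]):
        # Optional: release memory/graph state held by previous models.
        keras.backend.clear_session()

        # A fresh Sequential() here means fresh, randomly initialized
        # weights, so every run trains completely from scratch.
        model = keras.Sequential([
            keras.Input(shape=(20,)),
            keras.layers.Dense(units, activation="relu"),
            keras.layers.Dense(1, activation="sigmoid"),
        ])
        model.compile(optimizer="adam", loss="binary_crossentropy",
                      metrics=["accuracy"])
        model.fit(x_train, y_train, epochs=5, batch_size=32, verbose=0)

        # Keras models expose evaluate() rather than score(); it returns
        # [loss, accuracy] given the metrics configured above.
        loss, acc = model.evaluate(x_val, y_val, verbose=0)
        scores.append([i, acc])

    for score in scores:
        print(score)

    The clear_session() call is not required for correctness, since re-creating the model already gives you fresh weights; it just keeps memory usage flat when you run many iterations in a single process.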