Tags: python, keras, adam

Why is Adam's get_updates executed only once?


Why is get_updates() not executed every epoch?

My custom Adam subclass, myAdam:

    @interfaces.legacy_get_updates_support
    def get_updates(self, loss, params):
        print("update!")
        # the rest is identical to the built-in Adam

and this is the compile and fit code:

    model.compile(optimizer=Adam(lr=2e-5), loss='binary_crossentropy', metrics=['acc'])
    his = model.fit_generator(train_genertor, steps_per_epoch=100, epochs=2, validation_data=val_genertor, validation_steps=50)

Output:

update!
Epoch 1/2
Epoch 2/2

Why is the output not:

update!
Epoch 1/2
update!
Epoch 2/2

Solution

  • One important detail that is easy to forget is that TensorFlow's computational model (in TensorFlow 1.x, which Keras used here) differs from the usual imperative model. In TensorFlow you first build a graph of operations, and only then evaluate that graph with a session, feeding it actual inputs to produce outputs.

    This is unlike ordinary code, where you call functions over and over.

    In the case of get_updates on any Optimizer, the idea is that get_updates builds the operations that perform one step of the optimizer; that graph is then evaluated iteratively through a session. So get_updates runs exactly once, at graph-construction time, which is why the print statement fires only once rather than once per epoch.
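    The build-once, run-many pattern can be sketched in plain Python with a closure (no TensorFlow required; all names here are illustrative, standing in for get_updates and the session loop):

        # build_update_step plays the role of get_updates: it runs once to
        # construct the "graph" (here, a closure over mutable state), and the
        # returned step function is then evaluated repeatedly, like session runs.
        def build_update_step(lr):
            print("update!")          # executes only once, at build time
            state = {"w": 0.0}

            def step(grad):
                # this is what runs on every training iteration
                state["w"] -= lr * grad
                return state["w"]

            return step

        step = build_update_step(lr=0.1)   # prints "update!" exactly once
        for epoch in range(2):
            step(grad=1.0)                 # no further prints, just evaluation

    Any side effect placed in the build function (like the print above) is observed once, no matter how many epochs the returned step is run for.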