I want to build a loss in a "pythonic" way using TF2's eager execution, but even in eager mode, Keras passes non-eager (symbolic) tensors to my loss function.
Code:
def conditional_loss(self, y_true, y_pred):
    print(y_true)
    return 0

def define_model(self):
    self.model = keras.Sequential([
        keras.layers.Dense(units=768),
        keras.layers.BatchNormalization(),
        keras.layers.ReLU(),
        keras.layers.Dropout(0.2),
        keras.layers.Dense(units=128),
        keras.layers.BatchNormalization(),
        keras.layers.ReLU(),
        keras.layers.Dropout(0.2),
        keras.layers.Dense(units=5, activation='softmax')
    ])
    self.model.compile(optimizer='adam',
                       loss=self.conditional_loss,
                       metrics=[self.conditional_loss,
                                keras.metrics.sparse_categorical_accuracy])
    self.model.fit(
        self.train_dataset,
        epochs=10,
        validation_data=self.test_dataset,
        callbacks=[tensorboard_callback, model_callback],
    )
If I print y_true inside conditional_loss, TF prints a non-eager tensor:
Tensor("metrics/conditional_loss/Cast:0", shape=(None, 1), dtype=float32)
If I build my own keras.Model() subclass, I can construct it with the argument dynamic=True to enable eager execution (Reference). Is there a way to do this with keras.Sequential()?
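For context, this is a minimal sketch of the dynamic=True approach I mean (the class and layer are illustrative, not my real model):

import tensorflow as tf
from tensorflow import keras

class EagerModel(keras.Model):
    def __init__(self):
        # dynamic=True marks the model as eager-only, so Keras never
        # compiles call() into a graph.
        super().__init__(dynamic=True)
        self.out = keras.layers.Dense(units=5, activation='softmax')

    def call(self, inputs):
        # Runs eagerly, so ordinary Python (print, .numpy(), if/else) works here.
        return self.out(inputs)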
To do that, call model.compile() with the argument run_eagerly=True. Following the question's example:
self.model.compile(optimizer='adam',
                   loss=self.conditional_loss,
                   metrics=[self.conditional_loss,
                            keras.metrics.sparse_categorical_accuracy],
                   run_eagerly=True)
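With run_eagerly=True, y_true arrives in conditional_loss as an EagerTensor, so ordinary Python (print, .numpy(), if/else) works inside the loss. A minimal, self-contained sketch with made-up data and an illustrative loss (not the question's actual model):

import numpy as np
import tensorflow as tf
from tensorflow import keras

def conditional_loss(y_true, y_pred):
    # Under run_eagerly=True this prints an EagerTensor, e.g.
    # tf.Tensor(..., shape=(32, 1), dtype=float32), and y_true.numpy()
    # is available for Python-side logic.
    print(y_true)
    return tf.reduce_mean(
        keras.losses.sparse_categorical_crossentropy(y_true, y_pred))

x = np.random.rand(64, 10).astype('float32')
y = np.random.randint(0, 5, size=(64, 1))

model = keras.Sequential([keras.layers.Dense(units=5, activation='softmax')])
model.compile(optimizer='adam', loss=conditional_loss, run_eagerly=True)
model.fit(x, y, epochs=1, batch_size=32)

Note that running eagerly bypasses graph compilation, so it is mainly useful for debugging and will slow training down.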