tensorflow, keras, protocol-buffers, tensorflow2.0, keras-2

How to properly save a loaded h5 model to pb with TF2


I load a saved h5 model and want to save the model as pb. The model was saved during training with the tf.keras.callbacks.ModelCheckpoint callback.

TF version: 2.0.0a
Edit: the same issue also occurs with 2.0.0-beta1

My steps to save a pb:

  1. I first set K.set_learning_phase(0)
  2. Then I load the model with tf.keras.models.load_model
  3. Then I define the freeze_session() function (see the sketch after this list)
  4. (Optionally, I compile the model)
  5. Then I call the freeze_session() function with tf.keras.backend.get_session
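
For reference, my freeze_session() is essentially the usual TF1-style helper that bakes the variables of a session's graph into constants (a sketch, not my exact code):

    import tensorflow as tf
    from tensorflow.compat.v1 import graph_util

    def freeze_session(session, output_names, clear_devices=True):
        # Turn all variables in the session's graph into constants so the
        # result can be serialized as a single frozen GraphDef (.pb).
        graph = session.graph
        with graph.as_default():
            input_graph_def = graph.as_graph_def()
            if clear_devices:
                # drop device assignments so the graph loads anywhere
                for node in input_graph_def.node:
                    node.device = ""
            frozen_graph = graph_util.convert_variables_to_constants(
                session, input_graph_def, output_names)
        return frozen_graph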

The error I get, with and without compiling:

AttributeError: module 'tensorflow.python.keras.api._v2.keras.backend' has no attribute 'get_session'

My Question:

  1. Does TF2 no longer have get_session? (I know that tf.contrib.saved_model.save_keras_model does not exist anymore, and I also tried tf.saved_model.save, which did not really work; see the sketch after this list.)

  2. Or does get_session only work when I actually train the model, so that just loading the h5 does not work? Edit: even with a freshly trained model, no get_session is available.

    • If so, how would I go about converting the h5 to pb without training? Is there a good tutorial?
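
For context, calling tf.saved_model.save on a loaded h5 model looks roughly like this (paths are placeholders); note that it exports a SavedModel directory, not a single frozen .pb file:

    import tensorflow as tf
    from tensorflow import keras

    model = keras.models.load_model('/path/to/model.h5')
    # Writes a directory containing saved_model.pb plus a variables/ folder,
    # rather than one self-contained frozen graph file.
    tf.saved_model.save(model, '/path/to/exported_savedmodel')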

Thank you for your help


Update:

Since the official release of TF 2.x, the graph/session concept has changed, and the SavedModel API should be used instead. You can still call tf.compat.v1.disable_eager_execution() with TF 2.x, and the session-based approach will then produce a pb file. However, I am not sure what kind of pb file it is, as the saved model composition changed from TF1 to TF2. I will keep digging.
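
A quick way to tell the two pb flavors apart (a sketch; paths are placeholders): a SavedModel's saved_model.pb lives in a directory and keeps its weights in a separate variables/ folder, while a frozen GraphDef pb is a single file with the weights baked in as constants.

    import tensorflow as tf

    # SavedModel: load the whole export directory.
    loaded = tf.saved_model.load('/path/to/exported_savedmodel')

    # Frozen GraphDef: parse the single .pb file and import it into a graph.
    graph_def = tf.compat.v1.GraphDef()
    with open('/path/to/pb/model.pb', 'rb') as f:
        graph_def.ParseFromString(f.read())
    with tf.Graph().as_default() as graph:
        tf.compat.v1.import_graph_def(graph_def, name='')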


Solution

  • I save the model to pb from the h5 model like this:

    import logging
    import tensorflow as tf
    from tensorflow.compat.v1 import graph_util
    from tensorflow.python.keras import backend as K
    from tensorflow import keras
    
    # necessary !!! otherwise K.get_session() does not exist in TF2
    tf.compat.v1.disable_eager_execution()
    
    h5_path = '/path/to/model.h5'
    model = keras.models.load_model(h5_path)
    model.summary()
    # save pb
    with K.get_session() as sess:
        output_names = [out.op.name for out in model.outputs]
        input_graph_def = sess.graph.as_graph_def()
        # clear device assignments so the frozen graph is portable
        for node in input_graph_def.node:
            node.device = ""
        graph = graph_util.remove_training_nodes(input_graph_def)
        graph_frozen = graph_util.convert_variables_to_constants(sess, graph, output_names)
        # write_graph expects the directory and file name as separate arguments
        tf.io.write_graph(graph_frozen, '/path/to/pb', 'model.pb', as_text=False)
    logging.info("save pb successfully!")
    

    I use TF2 to convert the model like this (see the sketch after the list):

    1. Pass keras.callbacks.ModelCheckpoint(save_weights_only=True) to model.fit and save checkpoints while training;
    2. After training, load the checkpoint with self.model.load_weights(self.checkpoint_path);
    3. Save it as h5 with self.model.save(h5_path, overwrite=True, include_optimizer=False);
    4. Convert the h5 to pb just like above.
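
    A minimal sketch of steps 1-3; the tiny model, dummy data, and paths are placeholders I added for illustration, not the code from my project:

    import numpy as np
    import tensorflow as tf
    from tensorflow import keras
    
    checkpoint_path = '/path/to/checkpoints/weights.ckpt'
    h5_path = '/path/to/model.h5'
    
    # placeholder model and data, just to make the sketch runnable
    x_train = np.random.rand(100, 784).astype('float32')
    y_train = np.random.randint(0, 10, size=(100,))
    model = keras.Sequential([
        keras.layers.Dense(10, activation='softmax', input_shape=(784,)),
    ])
    model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')
    
    # 1. save only the weights while training
    ckpt_cb = keras.callbacks.ModelCheckpoint(checkpoint_path, save_weights_only=True)
    model.fit(x_train, y_train, epochs=2, callbacks=[ckpt_cb])
    
    # 2. after training, restore the checkpoint
    model.load_weights(checkpoint_path)
    
    # 3. save the full model as h5 (architecture + weights, no optimizer state)
    model.save(h5_path, overwrite=True, include_optimizer=False)
    
    # 4. convert the h5 to pb with the freeze code shown above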