Tags: python, tensorflow, tensorflow-serving, tensorflow2.0, tensorflow-estimator

TensorFlow v2: Replacement for tf.contrib.predictor.from_saved_model


Until now, I have been using tf.contrib.predictor.from_saved_model to load a SavedModel exported from a tf.estimator model. Unfortunately, this function has been removed in TensorFlow v2. In TensorFlow v1, my code looked like this:

 predict_fn = predictor.from_saved_model(model_dir + '/' + model, signature_def_key='predict')

 prediction_feed_dict = dict()

 for key in predict_fn._feed_tensors.keys():

     #forec_data is a DataFrame holding the data to be fed in 
     for index in forec_data.index:
         prediction_feed_dict[key] = [ [ forec_data.loc[index][key] ] ]

 prediction_complete = predict_fn(prediction_feed_dict)

Using tf.saved_model.load, I unsuccessfully tried the following in TensorFlow v2:

 model = tf.saved_model.load(model_dir + '/' + latest_model)
 model_fn = model.signatures['predict']

 prediction_feed_dict = dict()

 for key in model_fn._feed_tensors.keys(): #<-- no replacement for _feed_tensors.keys() found

     #forec_data is a DataFrame holding the data to be fed in 
     for index in forec_data.index:
         prediction_feed_dict[key] = [ [ forec_data.loc[index][key] ] ]

 prediction_complete = model_fn(prediction_feed_dict) #<-- no idea if this is anyhow close to correct

So my questions are (both in the context of TensorFlow v2):

  1. How can I replace _feed_tensors.keys()?
  2. How can I run inference in a straightforward way on a tf.estimator model loaded with tf.saved_model.load?

Thanks a lot, any help is appreciated.

Note: This question is not a duplicate of the one posted here as the answers provided there all rely on features of TensorFlow v1 that have been removed in TensorFlow v2.

EDIT: The question posted here seems to ask basically the same thing, but as of now (2020-01-22) it is also unanswered.


Solution

  • Hopefully you have saved the Estimator model using code similar to the following:

    import tensorflow as tf

    # Train a simple Estimator on a single numeric feature "x"
    input_column = tf.feature_column.numeric_column("x")
    estimator = tf.estimator.LinearClassifier(feature_columns=[input_column])

    def input_fn():
      return tf.data.Dataset.from_tensor_slices(
        ({"x": [1., 2., 3., 4.]}, [1, 1, 0, 0])).repeat(200).shuffle(64).batch(16)
    estimator.train(input_fn)

    # Export with a parsing serving input receiver: the resulting 'predict'
    # signature expects serialized tf.train.Example protos
    serving_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(
      tf.feature_column.make_parse_example_spec([input_column]))
    export_path = estimator.export_saved_model(
      "/tmp/from_estimator/", serving_input_fn)
    

    You can load the model with the code below:

    imported = tf.saved_model.load(export_path)
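
    Regarding your first question: the loaded signature is a ConcreteFunction, so you can inspect its expected inputs and outputs directly instead of using _feed_tensors.keys(). A minimal sketch (note that with the parsing serving input receiver used above, the signature's only input is the serialized examples tensor, not the individual feature keys):

    predict_fn = imported.signatures["predict"]

    # All exported signatures by name
    print(list(imported.signatures.keys()))

    # (args, kwargs) of TensorSpecs describing the expected inputs --
    # roughly the information _feed_tensors.keys() used to provide
    print(predict_fn.structured_input_signature)

    # Structure, dtypes and shapes of the outputs
    print(predict_fn.structured_outputs)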
    

    To predict with your model by passing input features, you can use the code below:

    def predict(x):
      # The 'predict' signature expects a batch of serialized tf.train.Example
      # protos, so wrap the raw feature value accordingly
      example = tf.train.Example()
      example.features.feature["x"].float_list.value.extend([x])
      return imported.signatures["predict"](examples=tf.constant([example.SerializeToString()]))

    print(predict(1.5))
    print(predict(3.5))
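
    If, as in your original code, the feature values come from a DataFrame such as forec_data, you can build one tf.train.Example per row and send the whole batch through the signature in a single call. This is only a sketch under the assumption that the DataFrame has an "x" column matching the feature column used at training time; forec_data below is a hypothetical stand-in for your data:

    import pandas as pd

    def predict_dataframe(df):
      # Serialize one tf.train.Example per DataFrame row, then run the whole
      # batch through the 'predict' signature in a single call
      serialized = []
      for _, row in df.iterrows():
        example = tf.train.Example()
        example.features.feature["x"].float_list.value.append(float(row["x"]))
        serialized.append(example.SerializeToString())
      return imported.signatures["predict"](examples=tf.constant(serialized))

    forec_data = pd.DataFrame({"x": [1.5, 3.5]})
    print(predict_dataframe(forec_data))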
    

    For more details, please refer to this link, which explains how SavedModels are exported from and used with TF Estimators.