python, tensorflow, tensorflow-serving, tensorflow-estimator

Use `tf.contrib.predictor` to predict on batches from `tf.estimator.Estimator.export_savedmodel` for TF 1.13


I found several examples of loading a saved estimator model (exported via `my_estimator.export_savedmodel(export_dir, export_input_fn)`) as a predictor, like so: `predictor = tf.contrib.predictor.from_saved_model(export_dir)`. This works great when my `tf.train.Example` holds a single instance. How can I make it work for a batch in TF 1.13?

model_input = tf.train.Example(features=tf.train.Features(feature={
    'browser_name': tf.train.Feature(bytes_list=tf.train.BytesList(value=[b"chrome", b"ie"])),
    'version': tf.train.Feature(float_list=tf.train.FloatList(value=[8.0, 11.0])),
})).SerializeToString()
predictor({"inputs": [model_input]})

The call fails when a feature contains multiple values; the serialized-`Example` input expects one `Example` per instance, not one `Example` holding the whole batch.
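For reference, if the model was exported with the default `tf.estimator.export.build_parsing_serving_input_receiver_fn`, a batch is a list of serialized `Example`s, one per record. A minimal sketch under that assumption, reusing the feature spec above:

# One tf.train.Example per record; the batch is the list of serialized strings.
examples = []
for name, version in [(b"chrome", 8.0), (b"ie", 11.0)]:
    example = tf.train.Example(features=tf.train.Features(feature={
        'browser_name': tf.train.Feature(bytes_list=tf.train.BytesList(value=[name])),
        'version': tf.train.Feature(float_list=tf.train.FloatList(value=[version])),
    }))
    examples.append(example.SerializeToString())

predictions = predictor({"inputs": examples})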


Solution

  • Here's a working example with TensorFlow 1.13.1:

    import tensorflow as tf
    import pandas as pd
    import numpy as np

    prod_export_dir = 'my_model_dir'
    data = pd.read_csv('my_data.csv')

    predictor = tf.contrib.predictor.from_saved_model(prod_export_dir)

    # Build the feed dict from the SavedModel's own input signature:
    # predictor.feed_tensors maps each input name to its placeholder tensor,
    # so every CSV column is converted to the dtype the model expects.
    model_input = {}
    for k, v in predictor.feed_tensors.items():
        model_input[k] = np.array(data[k].tolist(), dtype=v.dtype.as_numpy_dtype)

    # Run the whole batch in one call; the result is a dict mapping output
    # names to NumPy arrays with a leading batch dimension.
    prediction = predictor(model_input)
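
Note that this assumes the model was exported with a serving input receiver that takes the raw features directly (one feed tensor per CSV column), rather than a single serialized-`Example` string tensor. You can check which case you have from the predictor's public attributes; the names printed below depend entirely on your `export_input_fn`:

    # Inspect the exported signature's inputs and outputs. The printed names
    # depend on the export_input_fn used at export time; none are guaranteed.
    for name, tensor in predictor.feed_tensors.items():
        print('input: ', name, tensor.dtype)
    for name, tensor in predictor.fetch_tensors.items():
        print('output:', name, tensor.dtype)

Each array in `prediction` then has one row per row of `my_data.csv`.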