tensorflow, tensorflow-serving, inference

How to prepare warmup request file for tensorflow serving?


The current version of tensorflow-serving tries to load a warmup request file from assets.extra/tf_serving_warmup_requests:

2018-08-16 16:05:28.513085: I tensorflow_serving/servables/tensorflow/saved_model_warmup.cc:83] No warmup data file found at /tmp/faster_rcnn_inception_v2_coco_2018_01_28_string_input_version-export/1/assets.extra/tf_serving_warmup_requests

Does tensorflow provide a common API to export requests to that location, or should we write the requests there manually?


Solution

  • At this point there is no common API for exporting warmup data into assets.extra. It is relatively simple to write a script similar to the one below:

    import tensorflow as tf
    from tensorflow_serving.apis import model_pb2
    from tensorflow_serving.apis import predict_pb2
    from tensorflow_serving.apis import prediction_log_pb2
    
    def main():
        # TensorFlow Serving expects this file under the model version's
        # assets.extra/ directory.
        with tf.python_io.TFRecordWriter("tf_serving_warmup_requests") as writer:
            # Build the request to be replayed at model load time.
            request = predict_pb2.PredictRequest(
                model_spec=model_pb2.ModelSpec(name="<add here>"),
                inputs={"examples": tf.make_tensor_proto([<add here>])}
            )
            # Wrap the request in a PredictionLog and serialize it into the
            # TFRecord; the write must happen while the writer is still open.
            log = prediction_log_pb2.PredictionLog(
                predict_log=prediction_log_pb2.PredictLog(request=request))
            writer.write(log.SerializeToString())
    
    if __name__ == "__main__":
        main()
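
    Once generated, the file has to end up at `<model_dir>/<version>/assets.extra/tf_serving_warmup_requests`, matching the path in the log message above. A minimal sketch of that placement step, using only the standard library (the directory names here are hypothetical examples, not produced by any TensorFlow API):

    ```python
    import os
    import shutil
    import tempfile

    # Hypothetical export layout for illustration: <model_dir>/1/ is the
    # SavedModel version directory that tensorflow_model_server loads.
    model_dir = tempfile.mkdtemp()
    version_dir = os.path.join(model_dir, "1")
    os.makedirs(version_dir)

    # Stand-in for the file produced by the warmup script above.
    warmup_src = os.path.join(tempfile.mkdtemp(), "tf_serving_warmup_requests")
    with open(warmup_src, "wb") as f:
        f.write(b"")  # placeholder; the real file holds serialized PredictionLogs

    # TensorFlow Serving looks for assets.extra/tf_serving_warmup_requests
    # inside the version directory, so copy the file there.
    assets_extra = os.path.join(version_dir, "assets.extra")
    os.makedirs(assets_extra)
    shutil.copy(warmup_src, os.path.join(assets_extra, "tf_serving_warmup_requests"))

    print(os.path.exists(
        os.path.join(assets_extra, "tf_serving_warmup_requests")))
    ```

    With the file in place, the server replays the recorded requests on model load instead of logging "No warmup data file found".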