tensorflow, tensorflow-serving

TensorFlow Serving error serving a TensorFlow 1.5 model


I get the following error message when running TensorFlow Serving in a Docker container:

2019-12-12 03:25:13.947401: I tensorflow_serving/model_servers/server.cc:85] Building single TensorFlow model file config:  model_name: mymodel model_base_path: /models/mymodel 
2019-12-12 03:25:13.947870: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
2019-12-12 03:25:13.947891: I tensorflow_serving/model_servers/server_core.cc:573]  (Re-)adding model: mymodel
2019-12-12 03:25:14.058166: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: mymodel version: 1}
2019-12-12 03:25:14.058430: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: mymodel version: 1}
2019-12-12 03:25:14.059106: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: mymodel version: 1}
2019-12-12 03:25:14.064459: E tensorflow_serving/util/retrier.cc:37] Loading servable: {name: mymodel  version: 1} failed: Not found: Specified file path does not appear to contain a SavedModel bundle (should have a file called `saved_model.pb`)
Specified file path: /models/mymodel/1

The model was built with TensorFlow v1.5 and does not have a *.pb file. Is it possible to serve a model from this version of TensorFlow? Any ideas are appreciated. Thanks in advance.


Solution

  • Yes, you can deploy a model trained with TensorFlow v1.5 on TensorFlow Serving.

    TensorFlow Serving requires the model to be exported in the SavedModel format.

    Your training script probably has a configuration issue, but it is hard to pinpoint without seeing the code. (Always include the relevant code in your question on SO so others can understand the problem better.)


    To answer your question: to get the SavedModel format, export the model after training with the v1.x SavedModel API (e.g. `SavedModelBuilder`), or train it via the official scripts, which export it for you.

    After exporting, you will find the following directory structure in the model directory you specified:

    <model_dir>
    |
    |----- variables
    |         |------- variables.data-00000-of-00001
    |         |------- variables.index
    |
    |----- saved_model.pb
    
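    The export step can be sketched as below for a v1.x graph. This is a minimal sketch: the placeholder graph (`x`, `w`, `y`) is a hypothetical stand-in for your real model, and the temporary export path stands in for your model directory; substitute your own tensors and path. It uses the `SavedModelBuilder` API, which is available in TF 1.5 (the `tf.compat.v1` import lets it also run under TF 2.x).

    ```python
    import os
    import tempfile

    import tensorflow.compat.v1 as tf

    tf.disable_eager_execution()

    # Hypothetical toy graph standing in for the real TF 1.5 model;
    # replace with your own input and output tensors.
    x = tf.placeholder(tf.float32, shape=[None, 3], name="x")
    w = tf.Variable(tf.ones([3, 1]), name="w")
    y = tf.matmul(x, w, name="y")

    # Export to <base_path>/<numeric version>, as tfserving expects.
    export_dir = os.path.join(tempfile.mkdtemp(), "mymodel", "1")

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())

        builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
        # Declare which tensors clients feed and fetch at serving time.
        signature = tf.saved_model.signature_def_utils.predict_signature_def(
            inputs={"x": x}, outputs={"y": y}
        )
        builder.add_meta_graph_and_variables(
            sess,
            [tf.saved_model.tag_constants.SERVING],
            signature_def_map={
                tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: signature
            },
        )
        # Writes saved_model.pb plus the variables/ subdirectory.
        builder.save()

    print(os.listdir(export_dir))
    ```

    Running this produces exactly the layout shown above: `saved_model.pb` next to a `variables/` directory, which is what TensorFlow Serving looks for.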

    Then you can point TensorFlow Serving at the parent of <model_dir> and it will load this model. Note that the model_base_path must contain numeric version subdirectories: your error shows tfserving looking for saved_model.pb inside /models/mymodel/1, so the exported files must sit inside that version folder, not directly under /models/mymodel.
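    A typical deployment might look like the following. This is a sketch: the host path /path/to/mymodel and the example request payload are assumptions, and the model name mymodel is taken from your log; the docker flags and REST endpoint are the standard tensorflow/serving ones.

    ```shell
    # Mount the directory that contains the numeric version folders,
    # e.g. /path/to/mymodel/1/saved_model.pb on the host.
    docker run -p 8501:8501 \
      --mount type=bind,source=/path/to/mymodel,target=/models/mymodel \
      -e MODEL_NAME=mymodel -t tensorflow/serving

    # Query the REST endpoint once the model has loaded; the shape of
    # "instances" must match your model's input signature.
    curl -d '{"instances": [[1.0, 2.0, 3.0]]}' \
      http://localhost:8501/v1/models/mymodel:predict
    ```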