c, tensorflow, machine-learning, c-api

TensorFlow C API: decision-forest SavedModel loads, but status is not TF_OK


I am using TensorFlow Decision Forests. I trained my model in Python and saved it in the SavedModel format. For inference, I am trying to load the model in C using the TensorFlow C API. I found that for this to work I need to load the inference.so file that ships with the Python package.

On Debian 10, the Python package can be installed with:

pip3 install tensorflow-decision-forests

After that, my program loads the inference.so file with TF_LoadLibrary and then loads the model with TF_LoadSessionFromSavedModel.

Here is the code:

#include <stdio.h>
#include <tensorflow/c/c_api.h>

int main() {
  TF_Graph *Graph = TF_NewGraph();
  TF_Status *Status = TF_NewStatus();
  TF_SessionOptions *SessionOpts = TF_NewSessionOptions();
  TF_Buffer *RunOpts = NULL;
  TF_Library *library;

  library = TF_LoadLibrary("/home/user/.local/lib/python3.7/site-packages/tensorflow_decision_forests/tensorflow/ops/inference/inference.so",
                              Status);

  const char *saved_model_dir = "randomforests-model/";
  const char *tags = "serve";
  int ntags = 1;

  TF_Session *Session = TF_LoadSessionFromSavedModel(
      SessionOpts, RunOpts, saved_model_dir, &tags, ntags, Graph, NULL, Status);

  printf("status: %s\n", TF_Message(Status));

  if (TF_GetCode(Status) == TF_OK) {
    printf("loaded\n");
  } else {
    printf("not loaded\n");
  }

  return 0;
}

Output

$ gcc -g main.c -ltensorflow -o main.out
user@debian:/home/code/tftest$ ./main.out 
Hello from TensorFlow C library version 2.7.0-dev20211101
2022-01-22 19:39:28.539621: I tensorflow/cc/saved_model/reader.cc:43] Reading SavedModel from: randomforests-mtproto-model-001028/
2022-01-22 19:39:28.547223: I tensorflow/cc/saved_model/reader.cc:107] Reading meta graph with tags { serve }
2022-01-22 19:39:28.547792: I tensorflow/cc/saved_model/reader.cc:148] Reading SavedModel debug info (if present) from: randomforests-mtproto-model-001028/
2022-01-22 19:39:28.548298: I tensorflow/core/platform/cpu_feature_guard.cc:151] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2022-01-22 19:39:28.598841: I tensorflow/cc/saved_model/loader.cc:210] Restoring SavedModel bundle.
2022-01-22 19:39:28.743885: I tensorflow/cc/saved_model/loader.cc:194] Running initialization op on SavedModel bundle at path: randomforests-mtproto-model-001028/
[INFO kernel.cc:1153] Loading model from path
[INFO decision_forest.cc:617] Model loaded with 300 root(s), 618972 node(s), and 28 input feature(s).
[INFO abstract_model.cc:1063] Engine "RandomForestOptPred" built
[INFO kernel.cc:1001] Use fast generic engine
2022-01-22 19:39:30.922861: I tensorflow/cc/saved_model/loader.cc:283] SavedModel load for tags { serve }; Status: success: OK. Took 2383248 microseconds.

status: No shape inference function exists for op 'SimpleMLLoadModelFromPathWithHandle', did you forget to define it?
not loaded

The problem is that the loader's own log reports the model was loaded (Status: success: OK), yet the Status variable is not equal to TF_OK, and the associated message is: No shape inference function exists for op 'SimpleMLLoadModelFromPathWithHandle', did you forget to define it?
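One thing worth verifying is whether TF_LoadLibrary itself succeeded: the program above never checks Status after that call, so a failure to load the custom ops would only surface later as the session-load error. A minimal sketch of checking the library load first (the .so path is the same one used above and is machine-specific):

```c
#include <stdio.h>
#include <stdlib.h>
#include <tensorflow/c/c_api.h>

/* Machine-specific path; adjust to your site-packages location. */
#define OPS_LIB "/home/user/.local/lib/python3.7/site-packages/" \
                "tensorflow_decision_forests/tensorflow/ops/inference/inference.so"

int main(void) {
  TF_Status *status = TF_NewStatus();

  /* Load the custom-op library and check the status immediately,
     before attempting to load the SavedModel. */
  TF_Library *library = TF_LoadLibrary(OPS_LIB, status);
  if (TF_GetCode(status) != TF_OK) {
    fprintf(stderr, "inference.so failed to load: %s\n", TF_Message(status));
    TF_DeleteStatus(status);
    return EXIT_FAILURE;
  }
  printf("custom ops loaded\n");

  /* ... proceed with TF_LoadSessionFromSavedModel as before ... */

  TF_DeleteLibraryHandle(library);
  TF_DeleteStatus(status);
  return EXIT_SUCCESS;
}
```

This does not fix the shape-inference error by itself, but it rules out a silent failure of the op library and narrows the problem to the SavedModel loading step.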

So how can I load the model correctly?


Solution

  • After a long time without an answer here, I asked the question on the TensorFlow forum and got one. It seems the current version of TensorFlow has a problem loading decision-forest models through the C API, so the Yggdrasil Decision Forests library can be used instead, as discussed in that answer.
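For illustration, a rough sketch of running inference directly with the Yggdrasil Decision Forests C++ library. This is an assumption-laden outline based on my reading of the Yggdrasil serving API, not the forum answer verbatim: the model path (a TF-DF SavedModel keeps the underlying Yggdrasil model in its assets/ subdirectory), the feature name, and all calls should be checked against the library's own serving examples.

```cpp
#include <iostream>
#include <memory>
#include <vector>

#include "yggdrasil_decision_forests/model/model_library.h"
#include "yggdrasil_decision_forests/serving/example_set.h"

namespace ydf = yggdrasil_decision_forests;

int main() {
  // Assumption: the raw Yggdrasil model lives under the SavedModel's
  // "assets/" directory; adjust to your model layout.
  std::unique_ptr<ydf::model::AbstractModel> model;
  const auto load_status =
      ydf::model::LoadModel("randomforests-model/assets", &model);
  if (!load_status.ok()) {
    std::cerr << load_status << std::endl;
    return 1;
  }

  // Compile the model into a fast inference engine.
  auto engine = model->BuildFastEngine().value();
  const auto &features = engine->features();

  // Allocate a batch of one example, default-fill it, then set the
  // features your model expects ("age" is a hypothetical feature name).
  auto examples = engine->AllocateExamples(1);
  examples->FillMissing(features);
  // const auto age = features.GetNumericalFeatureId("age").value();
  // examples->SetNumerical(/*example_idx=*/0, age, 30.f, features);

  std::vector<float> predictions;
  engine->Predict(*examples, /*num_examples=*/1, &predictions);
  std::cout << "prediction: " << predictions.front() << std::endl;
  return 0;
}
```

This bypasses the TensorFlow runtime entirely, so the SimpleML custom ops (and their missing shape-inference functions) are never involved.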