c++ tensorflow bazel tflite

Bazel build tflite-micro: Install third-party packages to build my application


I am currently trying to understand how to initialize third-party software in tflite-micro: https://github.com/tensorflow/tflite-micro/tree/main/third_party

I installed the correct Bazel version: sudo apt update && sudo apt install bazel-7.0.0.

But how do I initialize the third-party libraries? I tried it like this:

tflite-micro$ bazel build third_party/flatbuffers
WARNING: Target pattern parsing failed.
ERROR: Skipping 'third_party/flatbuffers': no such target '//third_party/flatbuffers:flatbuffers': target 'flatbuffers' not declared in package 'third_party/flatbuffers' defined by /home/user/tflite-micro/third_party/flatbuffers/BUILD (Tip: use `query "//third_party/flatbuffers:*"` to see all the targets in that package)
ERROR: no such target '//third_party/flatbuffers:flatbuffers': target 'flatbuffers' not declared in package 'third_party/flatbuffers' defined by /home/user/tflite-micro/third_party/flatbuffers/BUILD (Tip: use `query "//third_party/flatbuffers:*"` to see all the targets in that package)

which didn't work. Ultimately, I want to build my minimal tflite-micro example:

#include <math.h>

#include "modelData.h"
#include "tensorflow/lite/core/c/common.h"
#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_log.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/micro/micro_profiler.h"
#include "tensorflow/lite/micro/recording_micro_interpreter.h"
#include "tensorflow/lite/micro/system_setup.h"
#include "tensorflow/lite/schema/schema_generated.h"

// Assumed to match the tflite-micro hello_world example: an op resolver type
// and a helper that registers the ops the model uses (here FullyConnected).
using HelloWorldOpResolver = tflite::MicroMutableOpResolver<1>;

TfLiteStatus RegisterOps( HelloWorldOpResolver& op_resolver )
{
  TF_LITE_ENSURE_STATUS( op_resolver.AddFullyConnected() );
  return kTfLiteOk;
}

TfLiteStatus LoadFloatModelAndPerformInference()
{
  // The model byte array is assumed to come from modelData.h (name may differ).
  const tflite::Model* model = ::tflite::GetModel( modelData );
  TFLITE_CHECK_EQ( model->version(), TFLITE_SCHEMA_VERSION );

  HelloWorldOpResolver op_resolver;
  TF_LITE_ENSURE_STATUS( RegisterOps( op_resolver ) );

  // Arena size just a round number. The exact arena usage can be determined
  // using the RecordingMicroInterpreter.
  constexpr int kTensorArenaSize = 3000;
  uint8_t tensor_arena[ kTensorArenaSize ];

  tflite::MicroInterpreter interpreter( model, op_resolver, tensor_arena, kTensorArenaSize );
  TF_LITE_ENSURE_STATUS( interpreter.AllocateTensors() );

  constexpr int kNumTestValues = 2;
  float inputs[ kNumTestValues ] = { 1.0f, 0.0f };

  for (int i = 0; i < kNumTestValues; ++i) {
    interpreter.input(0)->data.f[0] = inputs[i];
    TF_LITE_ENSURE_STATUS( interpreter.Invoke() );
    float y_pred = interpreter.output(0)->data.f[0];
    (void) y_pred;  // Use or log the prediction here.
  }

  return kTfLiteOk;
}

int main( int argc, char* argv[] )
{
    tflite::InitializeTarget();
    TF_LITE_ENSURE_STATUS( LoadFloatModelAndPerformInference() );
    return kTfLiteOk;
}

using my CMake script:

cmake_minimum_required( VERSION 3.5 FATAL_ERROR )
project( Net )

set( TARGET tensorflowLoader )
set( CMAKE_CXX_STANDARD 17 )  # Must be set before add_executable() to apply to the target.

add_executable( ${TARGET} src/tensorflowLoader.cpp )
target_include_directories( ${TARGET} PRIVATE ${CMAKE_CURRENT_LIST_DIR}/tflite-micro/ )
target_include_directories( ${TARGET} PRIVATE ${CMAKE_CURRENT_LIST_DIR}/tflite-micro/third_party/ )


Solution

  • If you're trying to build everything under third_party/flatbuffers, the command the error message suggests - bazel query "//third_party/flatbuffers:*" - is a good idea, but it won't get you far because of the odd setup this repo has: the actual flatbuffers code is pulled in as an external Bazel repository (@flatbuffers) rather than living under third_party/.

    Running bazel query @flatbuffers//... --keep_going, I get this output:

    @flatbuffers//:flatbuffer_py_strip_prefix_python_flatbuffers___init___py
    @flatbuffers//:flatbuffer_py_strip_prefix_python_flatbuffers__version_py
    @flatbuffers//:flatbuffer_py_strip_prefix_python_flatbuffers_builder_py
    @flatbuffers//:flatbuffer_py_strip_prefix_python_flatbuffers_compat_py
    @flatbuffers//:flatbuffer_py_strip_prefix_python_flatbuffers_encode_py
    @flatbuffers//:flatbuffer_py_strip_prefix_python_flatbuffers_flexbuffers_py
    @flatbuffers//:flatbuffer_py_strip_prefix_python_flatbuffers_number_types_py
    @flatbuffers//:flatbuffer_py_strip_prefix_python_flatbuffers_packer_py
    @flatbuffers//:flatbuffer_py_strip_prefix_python_flatbuffers_table_py
    @flatbuffers//:flatbuffer_py_strip_prefix_python_flatbuffers_util_py
    @flatbuffers//:flatbuffers
    @flatbuffers//:flatc
    @flatbuffers//:flatc_headers
    @flatbuffers//:flatc_library
    @flatbuffers//:platform_freebsd
    @flatbuffers//:platform_openbsd
    @flatbuffers//:public_headers
    @flatbuffers//:runtime_cc
    @flatbuffers//:runtime_py
    @flatbuffers//:runtime_py_srcs
    @flatbuffers//:windows
    @flatbuffers//grpc/src/compiler:common_headers
    @flatbuffers//grpc/src/compiler:cpp_generator
    @flatbuffers//grpc/src/compiler:distribution
    @flatbuffers//grpc/src/compiler:go_generator
    @flatbuffers//grpc/src/compiler:java_generator
    @flatbuffers//grpc/src/compiler:python_generator
    @flatbuffers//grpc/src/compiler:python_generator_private
    @flatbuffers//grpc/src/compiler:swift_generator
    @flatbuffers//grpc/src/compiler:ts_generator
    @flatbuffers//grpc/tests:grpc_test
    @flatbuffers//reflection:distribution
    @flatbuffers//reflection:reflection_fbs_schema
    @flatbuffers//src:code_generators
    @flatbuffers//src:distribution
    @flatbuffers//src:flatbuffers
    @flatbuffers//src:flatc
    @flatbuffers//src:flatc_library
    @flatbuffers//src:generate_fbs
    

    Go into third_party/flatbuffers/BUILD.oss and see which targets you need. I suspect it's @flatbuffers//:flatbuffers.
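
    If it helps to narrow that list down, a kind() filter should show only the C++ library targets; this assumes the external repository is named @flatbuffers, as in the output above:

    # List only cc_library targets in the external flatbuffers repository.
    $ bazel query 'kind(cc_library, @flatbuffers//...)' --keep_going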

    Based on what you're saying, I am guessing this doesn't have to be reproducible, so we can hack away without worrying about integrating this with Bazel?

    Try something like this:

    # Outputs are built into a tree you can find at the path `$(bazel info execution_root)`
    $ bazel build //third_party/flatbuffers/...
    
    # You'll also need to build everything else you're depending on...
    

    Then, add that execution_root path as an include directory in your CMake script. That might get you unblocked.
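
    A rough sketch of wiring that up (the exact subdirectories under the execution root, e.g. external/flatbuffers/include, are assumptions - check where the headers actually land):

    # Capture the Bazel execution root and pass it to CMake as extra include paths.
    $ EXEC_ROOT=$(cd tflite-micro && bazel info execution_root)
    $ cmake -S . -B build -DCMAKE_CXX_FLAGS="-I${EXEC_ROOT} -I${EXEC_ROOT}/external/flatbuffers/include"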

    That path contains a hash that won't be the same across machines though; you'll need to integrate with Bazel properly to get much further.

    Something like this:

    # In the root BUILD file
    cc_binary(
        name = "your_binary_name",  # Placeholder target name.
        srcs = ["your_file_name.cc"],
        deps = [
            "@flatbuffers//:flatbuffers", # Or whichever one you find you need
            ... # Other deps you need here.
        ],
        copts = ["-std=c++17"],
    )
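
    Then, assuming the placeholder target name above, building and running it from the repo root would look like:

    # Build and run the cc_binary defined in the root BUILD file.
    $ bazel run //:your_binary_name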