I need to build TensorFlow as a static library to include in a product. At the moment it seems bazel only supports building a shared/dynamic library. My current objective is to build a library for macOS (darwin-arm64), but I'm also going to build one for x86.
Has anyone solved this before?
I've gotten some things to work thanks to this thread: https://github.com/tensorflow/rust/pull/351
What I've done is compile and keep all of the cached .a and .lo files:
bazel build --config=monolithic --macos_minimum_os=11.0 --cpu=darwin_arm64 -j 1 //tensorflow:libtensorflow_cc.so
And then tried to link them together with libtool, using the .params file generated by bazel to collect the needed files, stripping unwanted lines and filtering out duplicates:
libtool -static -o libtensorflow_arm64_libtool_SO3.a $(cat bazel-bin/tensorflow/libtensorflow_cc.so.*.params | sed -e 's!-Wl,-force_load,!!' | grep -e '\.a$' -e '\.lo$' | sort -t: -u -k1,1)
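To sanity-check what that pipeline actually selects, here is a miniature of the same filtering run against a made-up params file (the paths are fake, not real bazel output, and I've simplified the final step to a plain sort -u):

```shell
# Fake .params file standing in for bazel-bin/tensorflow/libtensorflow_cc.so.*.params
cat > params.txt <<'EOF'
-Wl,-force_load,bazel-out/a/libfoo.lo
bazel-out/b/libbar.a
-lpthread
bazel-out/a/libfoo.lo
EOF

# Strip the -Wl,-force_load, prefix, keep only .a/.lo entries, drop duplicates
sed -e 's!-Wl,-force_load,!!' params.txt | grep -e '\.a$' -e '\.lo$' | sort -u
# prints:
#   bazel-out/a/libfoo.lo
#   bazel-out/b/libbar.a
```

Note that this deliberately discards linker flags such as -lpthread, which you may still need to pass when linking the final binary.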
Some simple things work with this approach, but I can, for instance, run into the following error when running code that interfaces with the C API:
F tensorflow/core/framework/allocator_registry.cc:85] No registered CPU AllocatorFactory
Indeed, there is currently no support for building the TensorFlow C API as a static library. This is because bazel is the build tool, and at the time of writing bazel has no support for producing static libraries:
https://github.com/bazelbuild/bazel/issues/1920
This has been an open issue for quite some time, and it is also the reason the entire C API can't currently be built as a static library.
But there's a way around this: you can build TensorFlow Lite as a static library with CMake, as described here in the TensorFlow git repository:
https://github.com/tensorflow/tensorflow/tree/master/tensorflow/lite
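As a sketch of how a consuming project can pull in the static TFLite target (this follows the minimal CMake example in the TFLite tree; TENSORFLOW_SOURCE_DIR is assumed to point at a checkout of the tensorflow repository, and my_app/main.cc are placeholder names):

```cmake
cmake_minimum_required(VERSION 3.16)
project(my_app C CXX)

# Assumes TENSORFLOW_SOURCE_DIR points at a tensorflow checkout, e.g.
#   cmake -DTENSORFLOW_SOURCE_DIR=/path/to/tensorflow ..
add_subdirectory(
  "${TENSORFLOW_SOURCE_DIR}/tensorflow/lite"
  "${CMAKE_CURRENT_BINARY_DIR}/tensorflow-lite"
  EXCLUDE_FROM_ALL)

add_executable(my_app main.cc)
# tensorflow-lite is produced as a static library by this build
target_link_libraries(my_app tensorflow-lite)
```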
I also found this thread very helpful: TensorFlow static C API library - how to link with 10 sub-dependencies?
After building this you will also need to include the Google FlatBuffers library in your project (which you can of course fold into your static library as well): https://github.com/google/flatbuffers
TFLite can run most models, and it works even for the most complex models I've built, so it is currently the best way to get TensorFlow working as a static library. For more information on TFLite see: https://www.tensorflow.org/lite