I would like to know if it is possible to use OpenVINO in a .NET application.
I have converted a YOLO network to an ONNX model to use with ML.NET. What I would like to do next is to use OpenVINO to see if it speeds up inference. I have already converted my ONNX model with the OpenVINO Model Optimizer, but so far I could not find any way to integrate OpenVINO into a .NET app.
Thank you
At this moment, there is no official support for OpenVINO integration with .NET applications. Instead, OpenVINO provides its own Inference Engine API, which supports both C++ and Python. You may refer here for more info.
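If you are able to call a separate Python (or C++) process from your .NET pipeline, a minimal sketch of loading the IR produced by Model Optimizer and running inference with the classic Inference Engine Python API could look like this. The file names (model.xml/model.bin), device, and the 640x640 input shape are assumptions; use the values from your own conversion:

```python
# Minimal sketch using the classic Inference Engine Python API.
# "model.xml"/"model.bin" and the 1x3x640x640 input are assumptions --
# substitute the IR files and input shape from your own Model Optimizer run.
import numpy as np
from openvino.inference_engine import IECore

ie = IECore()
net = ie.read_network(model="model.xml", weights="model.bin")
exec_net = ie.load_network(network=net, device_name="CPU")  # or "GPU", "MYRIAD", ...

input_blob = next(iter(net.input_info))
output_blob = next(iter(net.outputs))

# Dummy NCHW input just to exercise/time the inference path;
# replace with a real preprocessed image for actual detections.
dummy_image = np.random.rand(1, 3, 640, 640).astype(np.float32)
result = exec_net.infer(inputs={input_blob: dummy_image})
print(result[output_blob].shape)
```

Timing this script against your ML.NET pipeline should give you a rough idea of whether OpenVINO speeds things up for your model before you invest in any interop work.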
Performance-wise, since you mention speeding things up, you could try the OpenVINO Post-Training Optimization Tool to accelerate the inference of deep learning models.
Also, make sure to choose the right precision for your deep learning model according to the hardware you are going to use for inference (for example, FP16 for the GPU and VPU plugins, FP32 for the CPU plugin).
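For instance, a sketch of converting the ONNX model to an FP16 IR with Model Optimizer (the model and output directory names are placeholders, and depending on your OpenVINO version the entry point may be `mo` or `python mo.py`):

```python
# Hypothetical invocation of Model Optimizer from Python; substitute your own ONNX model.
# Equivalent to running on the command line:
#   mo --input_model yolo.onnx --data_type FP16 --output_dir ir_fp16
import subprocess

subprocess.run(
    ["mo", "--input_model", "yolo.onnx", "--data_type", "FP16", "--output_dir", "ir_fp16"],
    check=True,
)
```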