As a serving solution for production environments, does TensorFlow Serving include any options for securing its REST API?
I can bring up a REST API endpoint by following these instructions https://github.com/tensorflow/serving#serve-a-tensorflow-model-in-60-seconds
But how can I now restrict access to that API endpoint?
Unfortunately, TF Serving does not have a ready-made solution for this. But if you are serving your models via Kubernetes, you can restrict access with network policies, as in the sketch below.
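For example, here is a minimal NetworkPolicy sketch. The label names, namespace, and client selector are placeholders you would adapt to your cluster, and it assumes your cluster runs a CNI plugin that actually enforces NetworkPolicies (e.g. Calico or Cilium). It only allows pods labelled `role: api-client` to reach the serving pods on TF Serving's default REST port 8501:

```yaml
# Hypothetical example: restrict ingress to TF Serving's REST port.
# Label values below are assumptions; match them to your own deployment.
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: restrict-tf-serving-rest
  namespace: default
spec:
  podSelector:
    matchLabels:
      app: tf-serving          # assumes your serving pods carry this label
  policyTypes:
    - Ingress
  ingress:
    - from:
        - podSelector:
            matchLabels:
              role: api-client # only pods with this label may connect
      ports:
        - protocol: TCP
          port: 8501           # TF Serving's default REST API port
```

Note that this controls network-level reachability inside the cluster; it does not add authentication to the API itself.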