azure, azure-web-app-service, azure-machine-learning-service

Deploying models to a web service in a private network on Azure Machine Learning


When my Azure Machine Learning workspace is configured in a private network and isolated from the public network, what happens when deploying models to a web service?

I understand that online (real-time) endpoints and batch endpoints inherit the workspace's network configuration, but I can't find any official Microsoft documentation about web service deployments.

Is deployment to a web service also the same?


References

  • real time endpoint: https://learn.microsoft.com/en-us/azure/machine-learning/concept-secure-online-endpoint?view=azureml-api-2&tabs=azure-studio
  • batch endpoint: https://learn.microsoft.com/en-us/azure/machine-learning/how-to-secure-batch-endpoint?view=azureml-api-2


Solution

  • When you deploy a model to a web service, you are given two options for the compute type:

    1. AksCompute
    2. Azure Container Instance


    This is where you configure your private network settings. When you create an AksCompute target, it asks for the Azure Kubernetes Service cluster, which you need to configure for a secure Kubernetes inferencing environment (see the provisioning sketch below).

    Refer to the documentation on securing the inferencing environment, which explains the two inference options that can be secured using a VNet:

    1. Azure Machine Learning managed online endpoints
    2. Azure Kubernetes Service

    Refer to the Azure Kubernetes Service option, which is the one used for web service deployment (see the sketches below).
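
As a rough illustration of the AksCompute route, the sketch below provisions an AKS inferencing cluster inside an existing virtual network using the Azure ML Python SDK v1 (azureml-core). The resource group, VNet, subnet, and address ranges are placeholder values; adjust them to your own network layout.

```python
from azureml.core import Workspace
from azureml.core.compute import AksCompute, ComputeTarget

ws = Workspace.from_config()

# Provision an AKS cluster inside an existing virtual network so that
# inferencing traffic stays on the private network.
# Names and CIDR ranges below are hypothetical examples.
prov_config = AksCompute.provisioning_configuration(
    vm_size="Standard_D3_v2",
    agent_count=3,
    vnet_resourcegroup_name="my-network-rg",    # hypothetical resource group
    vnet_name="my-vnet",                        # hypothetical VNet
    subnet_name="inference-subnet",             # hypothetical subnet
    service_cidr="10.0.0.0/16",
    dns_service_ip="10.0.0.10",
    docker_bridge_cidr="172.17.0.1/16",
    load_balancer_type="InternalLoadBalancer",  # keep the scoring IP private
)

aks_target = ComputeTarget.create(
    workspace=ws,
    name="private-aks",
    provisioning_configuration=prov_config,
)
aks_target.wait_for_completion(show_output=True)
```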
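
Once the cluster exists, deploying a registered model as a web service to that private AKS target looks roughly like the sketch below. The model name, entry script, and environment file are assumptions for illustration, not values from your workspace.

```python
from azureml.core import Environment, Model
from azureml.core.model import InferenceConfig
from azureml.core.webservice import AksWebservice

# Hypothetical registered model and scoring assets.
model = Model(ws, name="my-model")
env = Environment.from_conda_specification("inference-env", "environment.yml")
inference_config = InferenceConfig(entry_script="score.py", environment=env)

# Web service deployment configuration; requests are served through the
# AKS cluster provisioned above.
deployment_config = AksWebservice.deploy_configuration(cpu_cores=1, memory_gb=2)

service = Model.deploy(
    workspace=ws,
    name="private-web-service",
    models=[model],
    inference_config=inference_config,
    deployment_config=deployment_config,
    deployment_target=aks_target,
)
service.wait_for_deployment(show_output=True)

# With an internal load balancer on the cluster, this URI resolves to a
# private IP that is only reachable from inside the VNet.
print(service.scoring_uri)
```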