I am currently migrating from the Azure Logic Apps Consumption tier to the Standard plan. In the new environment, we are using Service Provider-based connections to access Azure Storage accounts, replacing the API Connection resources we used previously.
During our time on the Consumption tier, we encountered connection limit issues when reusing the same API Connection object multiple times within a single workflow run. This often resulted in HTTP 429 (Too Many Requests) responses from the storage account.
As part of the migration, we have eliminated API Connections and consolidated multiple connection references into a single Service Provider connection for interacting with Azure Storage and other Azure services.
We would like to understand whether this new approach introduces any rate limiting or concurrency constraints, particularly around the number of simultaneous connections to the Azure Storage account. The workflow triggers every second and connects to the storage account 5 times per trigger execution. We have 3 more workflows that follow a similar pattern.
Following are the example workflow actions:

*workflow.json:*
"Get_FHIR_Bundle_from_Blob": {
"type": "ServiceProvider",
"inputs": {
"parameters": {
"blobUri": "test.json",
"inferContentType": false
},
"serviceProviderConfiguration": {
"connectionName": "azureblob-1",
"operationId": "readBlobFromUri",
"serviceProviderId": "/serviceProviders/AzureBlob"
},
"retryPolicy": {
"type": "exponential",
"count": 5,
"interval": "PT10S"
}
}
},
"GET_TKE_Bundle_from_Blob": {
"type": "ServiceProvider",
"inputs": {
"parameters": {
"blobUri": "@{uriPath(body('Parse_Event_Data')?['data']?['url'])}",
"inferContentType": false
},
"serviceProviderConfiguration": {
"connectionName": "azureblob-1",
"operationId": "readBlobFromUri",
"serviceProviderId": "/serviceProviders/AzureBlob"
},
"retryPolicy": {
"type": "exponential",
"count": 5,
"interval": "PT10S"
}
}
},
"GET_AKE_Bundle_from_Blob": {
"type": "ServiceProvider",
"inputs": {
"parameters": {
"blobUri": "@{uriPath(body('Parse_Event_Data')?['data']?['url'])}",
"inferContentType": false
},
"serviceProviderConfiguration": {
"connectionName": "azureblob-1",
"operationId": "readBlobFromUri",
"serviceProviderId": "/serviceProviders/AzureBlob"
},
"retryPolicy": {
"type": "exponential",
"count": 5,
"interval": "PT10S"
}
}
},
*connections.json:*
{
    "serviceProviderConnections": {
        "azureblob-1": {
            "displayName": "AzureBlob-MI",
            "parameterSetName": "ManagedServiceIdentity",
            "parameterValues": {
                "authProvider": {
                    "Type": "ManagedServiceIdentity"
                },
                "blobStorageEndpoint": "@appsetting('AzureBlob_blobStorageEndpoint')"
            },
            "serviceProvider": {
                "id": "/serviceProviders/AzureBlob"
            }
        }
    }
}
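For reference, the `blobStorageEndpoint` value above is resolved from app settings at runtime. A minimal sketch of the corresponding entry in local.settings.json for local development (in Azure, the same key goes into the app's application settings; the account name is a placeholder):

```json
{
    "IsEncrypted": false,
    "Values": {
        "AzureBlob_blobStorageEndpoint": "https://<yourstorageaccount>.blob.core.windows.net"
    }
}
```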
You should now be working against a much higher limit of 20,000, which is the default maximum request rate per second for a standard storage account:

https://learn.microsoft.com/en-us/answers/questions/1301863/is-there-any-limitation-to-concurrent-connections

Shifting the plan type moves you away from the managed API connection throttling limits and into Blob Storage's own concurrency limits instead, which are much higher. At roughly one trigger per second with 5 blob reads per run across your 4 similar workflows, you are generating on the order of 20 storage requests per second, far below that target, and your exponential retry policy will absorb any transient 429s.

I have done this type of thing successfully at a higher connection frequency than you describe, so you should be alright.
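If you still want to smooth out bursts, you can also cap how many workflow runs execute in parallel with trigger-level concurrency control. A minimal sketch, assuming this fragment is added inside your existing trigger definition in workflow.json (25 is the documented default, shown here only as an illustrative value):

```json
"runtimeConfiguration": {
    "concurrency": {
        "runs": 25
    }
}
```

Once 25 runs are in flight, the runtime queues further trigger results instead of starting new runs, which keeps the request rate against the storage account bounded.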