azure · azure-functions · azure-cosmosdb

Azure Functions CosmosDB Trigger Input Batching


I have an Azure Function (isolated) with a CosmosDB trigger listening for changes in a container with thousands of documents.

Changes to the documents are made by another Azure Function running on a timer trigger, which fetches changes made in our CRM, processes them, and applies the necessary updates to the CosmosDB documents.

Most of the time, we only have to handle 10-20 changes per timer cycle, which means that our CosmosDB triggered function receives 10-20 input items to process. This works wonderfully and isn't causing any issues.

But once in a while, our business needs require processing 1000-2000 changes in a given timer cycle. This means the CosmosDB-triggered function receives around 100 input items per instance, which causes timeouts, as execution time exceeds the 10-minute limit on the Consumption plan.

I'm aware that we could switch to a Premium plan, but due to the variable load on the functions, this doesn't make financial sense.

Is there a way to configure a limit to the number of documents received from our CosmosDB change trigger that are processed by each instance of the function?

For example:


Solution

  • In the isolated worker model, the CosmosDBTrigger attribute has a MaxItemsPerInvocation property that should work for you (see the documentation):

    (Optional) When set, this property sets the maximum number of items received per Function call. If operations in the monitored container are performed through stored procedures, transaction scope is preserved when reading items from the change feed. As a result, the number of items received could be higher than the specified value so that the items changed by the same transaction are returned as part of one atomic batch.

    The transaction-scope caveat could be an issue in your case, though: batches may occasionally exceed the limit if the changes were made through a stored procedure.
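
    A minimal sketch of what this looks like in an isolated-worker function — the database, container, connection-setting, and document-type names are placeholders for your own:

    ```csharp
    using System.Collections.Generic;
    using Microsoft.Azure.Functions.Worker;
    using Microsoft.Extensions.Logging;

    public class CrmChangeProcessor
    {
        private readonly ILogger<CrmChangeProcessor> _logger;

        public CrmChangeProcessor(ILogger<CrmChangeProcessor> logger) => _logger = logger;

        [Function("ProcessCrmChanges")]
        public void Run(
            [CosmosDBTrigger(
                databaseName: "MyDatabase",          // placeholder
                containerName: "MyContainer",        // placeholder
                Connection = "CosmosDBConnection",   // app-setting name, placeholder
                LeaseContainerName = "leases",
                MaxItemsPerInvocation = 25)]         // cap items per function call
            IReadOnlyList<MyDocument> changes)
        {
            // Each invocation now receives at most ~25 documents (barring the
            // stored-procedure transaction-scope exception noted above).
            _logger.LogInformation("Processing {Count} changed documents", changes.Count);
        }
    }

    public class MyDocument
    {
        public string id { get; set; } = string.Empty;
    }
    ```

    With a smaller batch per invocation, the change feed processor spreads the 1000-2000 pending changes across more, shorter invocations, so each stays well under the Consumption-plan timeout; unprocessed changes remain on the feed and are picked up by subsequent invocations.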