I've been working a lot with microservices recently, and the common pattern is that every service is responsible for its own data. Thus, service "A" cannot access service "B"'s data directly; it has to go through service "B" via some HTTP API or message queue.
Now I've started to pick up some work with Azure Functions for the first time. I've looked at a fair few examples, and they all seem to have any old function dabbling with data in a shared data store (which feels like going back to the old style of having a massive monolithic database).
I was just wondering if there is a common pattern to follow for data storage when using Functions as a Service? And where do the responsibilities lie?
The following screen snippet shows an example of an event-driven distributed model of business processors in a cloud-based solution, without a monolithic database. More details about this concept and technique can be found in my article Using Azure Lease Blob.
Note that each Business Context has its own Lease Blob for holding the state of the processing, with references to other resources such as metadata, config, data, results, etc. This concept allows you to create a matrix (multi-dimensional) business processing model, where each sub-nested process can have its own Lease Blob.
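To make that concrete, here is a minimal sketch (Python, azure-storage-blob v12) of what holding a Business Context's state in a lease blob might look like. The container name, blob path, and state fields are my own hypothetical examples, not taken from the article; the point is that a processor must acquire the blob's lease before it may read and update that context's state.

```python
import json

from azure.storage.blob import BlobClient

blob = BlobClient.from_connection_string(
    "<storage-connection-string>",           # assumed to come from app settings
    container_name="contexts",               # hypothetical container for all contexts
    blob_name="order-processing/state.json"  # one state blob per Business Context
)

# Only the lease holder may update this context's state
# (lease duration is 15-60 seconds, or -1 for infinite).
lease = blob.acquire_lease(lease_duration=60)
try:
    # Read the current processing state held in the blob.
    state = json.loads(blob.download_blob().readall())

    # Advance the state; references to metadata/config/results live here,
    # not in a shared monolithic database.
    state["step"] = "completed"
    state["results_ref"] = "contexts/order-processing/results"  # hypothetical reference

    # Write back under the lease: a writer without the lease id is rejected.
    blob.upload_blob(json.dumps(state), overwrite=True, lease=lease)
finally:
    lease.release()  # hand the context back for the next processor
```

The lease effectively acts as a distributed lock scoped to exactly one Business Context, so functions never contend over a shared database; each sub-nested process would simply point at its own blob.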