I have a very simple pipeline in Azure Data Factory:
It consists of two consecutive Copy Data activities; the first one writes a file whose name is generated with this expression:
@concat('parking_', formatDateTime(utcnow(), 'yyyy-MM-ddTHH:mm'), '.json')
How can I save or output the filename so that it can be specified in and used by the last Copy Data activity? If I simply repeat the same expression in the second activity, it might produce a different name, because utcnow() would be evaluated at a later time.
The easiest way to achieve this is to use an event-based trigger that fires when a file is created in Blob Storage. Inside the triggered pipeline, the name of the file that raised the event is available as:
@triggerBody().fileName
After the first pipeline run stores the file in Blob Storage, this trigger starts a second pipeline, which copies the newly added JSON file's data into the SQL database.
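As a rough sketch, a storage event trigger definition for this setup might look like the following (the trigger name, scope IDs, container name, pipeline name, and the `sourceFileName` parameter are all assumptions for illustration — adjust them to your factory):

```json
{
    "name": "BlobCreatedTrigger",
    "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
            "scope": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>",
            "blobPathBeginsWith": "/mycontainer/blobs/parking_",
            "blobPathEndsWith": ".json",
            "ignoreEmptyBlobs": true,
            "events": [ "Microsoft.Storage.BlobCreated" ]
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "CopyJsonToSql",
                    "type": "PipelineReference"
                },
                "parameters": {
                    "sourceFileName": "@triggerBody().fileName"
                }
            }
        ]
    }
}
```

The trigger passes `@triggerBody().fileName` into a pipeline parameter (here called `sourceFileName`), so the second pipeline's source dataset can reference `@pipeline().parameters.sourceFileName` as the file name instead of recomputing it with utcnow().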