azure, azure-data-factory

How to pass a dynamic file name from one activity to another?


I have a very simple pipeline in Azure Data Factory:

[screenshot of the pipeline]

It consists of 2 consecutive Copy data activities:

  1. Fetch a JSON file from an API and store it in a Data Lake with a file name containing the current date and time. The sink file name for the first activity is @concat('parking_', formatDateTime(utcnow(), 'yyyy-MM-ddTHH:mm'), '.json') (see the sketch after this list).
  2. Load the recently fetched JSON file as the source and insert its contents into a SQL database.
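
For context, here is a minimal sketch of how the first activity's sink file name might be parameterized in a JSON dataset definition. The dataset name ParkingJsonSink, linked service DataLakeLS, container raw, and parameter fileName are assumed names for illustration only:

```json
{
  "name": "ParkingJsonSink",
  "properties": {
    "type": "Json",
    "linkedServiceName": {
      "referenceName": "DataLakeLS",
      "type": "LinkedServiceReference"
    },
    "parameters": {
      "fileName": { "type": "string" }
    },
    "typeProperties": {
      "location": {
        "type": "AzureBlobFSLocation",
        "fileSystem": "raw",
        "fileName": {
          "value": "@dataset().fileName",
          "type": "Expression"
        }
      }
    }
  }
}
```

The first Copy data activity would pass the @concat(...) expression above as the value of fileName. Because utcnow() is evaluated whenever the expression is resolved, reusing the same expression in a later activity can produce a different name, which is exactly the problem described below.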

How can I save or output the file name so that it can be specified in and used by the last Copy data activity? If I reuse the same expression in the second activity, it might fail, because utcnow() is evaluated again at a later time and could produce a different file name.


Solution

  • How can I save or output the filename so that it can be specified in and used by the last Copy data activity?

    The easiest way to achieve this is to use a storage event trigger that fires when a file is created in Blob storage.

    After the first pipeline runs and the file is stored in Blob storage, the trigger starts a second pipeline that loads the newly added JSON file's data into the SQL database.
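
    As a rough sketch of how the trigger could hand the file name to the second pipeline: a storage event trigger exposes @triggerBody().fileName and @triggerBody().folderPath, which can be mapped to pipeline parameters. The trigger, pipeline, parameter, and container names below (ParkingFileCreatedTrigger, LoadParkingToSql, fileName, folderPath, raw) are illustrative assumptions, not taken from the original pipeline:

```json
{
  "name": "ParkingFileCreatedTrigger",
  "properties": {
    "type": "BlobEventsTrigger",
    "typeProperties": {
      "blobPathBeginsWith": "/raw/blobs/parking_",
      "blobPathEndsWith": ".json",
      "ignoreEmptyBlobs": true,
      "events": [ "Microsoft.Storage.BlobCreated" ],
      "scope": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "LoadParkingToSql",
          "type": "PipelineReference"
        },
        "parameters": {
          "fileName": "@triggerBody().fileName",
          "folderPath": "@triggerBody().folderPath"
        }
      }
    ]
  }
}
```

    Inside the second pipeline, the source dataset's file name would then be set to @pipeline().parameters.fileName, so the second Copy data activity always reads exactly the file that raised the event, regardless of when the expression in the first pipeline was evaluated.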