I followed the instructions here https://learn.microsoft.com/en-us/azure/data-factory/how-to-create-event-trigger to create a pipeline and trigger, but when I add a new file, the body of the trigger (which @triggerBody().folderPath and @triggerBody().fileName should read from) looks like this:
{
    "outputs": {
        "body": {
            "RunToken": "AAEAFO....."
        },
        "headers": {
            ...
        }
    },
    "originHistoryName": "08585080678291936867780979968CU85",
    "endTime": "2023-09-01T03:30:56.2837211Z",
    "status": "Succeeded",
    "startTime": "2023-09-01T03:30:56.2837211Z",
    "clientTrackingId": ".....",
    "name": "Trigger_421B8CAF-BE66-42CF-83DA-E3028693F304",
    "trackingId": "....."
}
What am I doing wrong?
@triggerBody().folderPath and @triggerBody().fileName are the properties that fetch the folder path and file name respectively. There is no way to read these values from the trigger run body itself; it only contains the run token and a few other properties (refer to this similar SO question).
When you assign these properties as values for your pipeline parameters, ADF resolves them from the trigger run and passes them to your pipeline parameters.
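As a rough sketch, the parameter mapping corresponds to trigger JSON like the following (the pipeline name MyPipeline and the parameter names folderPath and fileName are assumptions for illustration; the trigger name is taken from the question above):

```json
{
    "name": "Trigger_421B8CAF-BE66-42CF-83DA-E3028693F304",
    "properties": {
        "type": "BlobEventsTrigger",
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "MyPipeline",
                    "type": "PipelineReference"
                },
                "parameters": {
                    "folderPath": "@triggerBody().folderPath",
                    "fileName": "@triggerBody().fileName"
                }
            }
        ]
    }
}
```

The expressions are evaluated per trigger run, so each pipeline run receives the path and name of the blob that fired it.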
There is an event payload for the trigger which provides all of this information, such as the blob metadata. To get it, you can use the trigger run property @trigger().outputs.body.event.
It will give you output like below:
{
    "TriggerTime": "09/04/2023 12:26:53",
    "EventPayload": {
        "topic": "/subscriptions/Id/resourceGroups/resource group name/providers/Microsoft.Storage/storageAccounts/dlsg2p",
        "subject": "/blobServices/default/containers/pratik/blobs/ParentFolder/SubFolder01/csv1.csv",
        "eventType": "Microsoft.Storage.BlobCreated",
        "id": "d3c52f27-601f-0028-442b-dfa9de06fe0c",
        "data": {
            "api": "FlushWithClose",
            "requestId": "d3c52f27-601f-0028-442b-dfa9de000000",
            "eTag": "0x8DBAD4237409006",
            "contentType": "application/octet-stream",
            "contentLength": 33,
            "contentOffset": 0,
            "blobType": "BlockBlob",
            "blobUrl": "https://dlsg2p.blob.core.windows.net/pratik/ParentFolder/SubFolder01/csv1.csv",
            "url": "https://dlsg2p.dfs.core.windows.net/pratik/ParentFolder/SubFolder01/csv1.csv",
            "sequencer": "0000000000000000000000000001e5460000000000010361",
            "identity": "$superuser",
            "storageDiagnostics": {
                "batchId": "e672ea68-1006-0068-002b-df8030000000"
            }
        },
        "dataVersion": "3",
        "metadataVersion": "1",
        "eventTime": "2023-09-04T12:26:50.9512037Z"
    }
}
Here we get the file name and folder path, for example from the subject or blobUrl properties.
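If you ever need to derive the folder path and file name yourself (e.g. when processing the raw event outside ADF), they can be parsed from the subject field of the payload above. A minimal Python sketch; note that ADF's own folderPath also includes the container name, so this split is illustrative rather than an exact match of ADF's behavior:

```python
def parse_subject(subject: str) -> tuple[str, str]:
    """Split an Event Grid blob subject into (folder path, file name).

    The subject has the shape:
    /blobServices/default/containers/<container>/blobs/<folder>/<file>
    """
    # Everything after ".../blobs/" is the blob path inside the container.
    blob_path = subject.split("/blobs/", 1)[1]
    # rpartition splits on the last "/", separating folder from file name.
    folder_path, _, file_name = blob_path.rpartition("/")
    return folder_path, file_name


# Subject taken from the event payload above.
subject = "/blobServices/default/containers/pratik/blobs/ParentFolder/SubFolder01/csv1.csv"
print(parse_subject(subject))  # → ('ParentFolder/SubFolder01', 'csv1.csv')
```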