I'm trying to ingest some Microsoft Flow API data into Azure Log Analytics. The goal is for a Power Automate flow to send a JSON payload with the flow run details to Log Analytics.
Here is the sample JSON:
{
"body": [
{
"NAME": "XXXXX",
"ID": "/providers/Microsoft.ProcessSimple/environments/XXXXXXX/flows/XXXXXXX/runs/XXXXX",
"TYPE": "Microsoft.ProcessSimple/environments/flows/runs",
"START": "2024-04-23T21:59:59.8317555Z",
"END": "2024-04-23T22:23:08.8817048Z",
"STATUS": "Succeeded"
},
{
"NAME": "XXXXX",
"ID": "/providers/Microsoft.ProcessSimple/environments/XXXXXXX/flows/XXXXXXX/runs/XXXXX",
"TYPE": "Microsoft.ProcessSimple/environments/flows/runs",
"START": "2024-04-22T21:59:59.6368987Z",
"END": "2024-04-22T22:25:59.2561963Z",
"STATUS": "Succeeded"
},
{
"NAME": "XXXXX",
"ID": "/providers/Microsoft.ProcessSimple/environments/XXXXXXX/flows/XXXXXXX/runs/XXXXX",
"TYPE": "Microsoft.ProcessSimple/environments/flows/runs",
"START": "2024-04-21T22:00:00.4246672Z",
"END": "2024-04-21T22:24:54.7721214Z",
"STATUS": "Succeeded"
},
{
"NAME": "XXXXX",
"ID": "/providers/Microsoft.ProcessSimple/environments/XXXXXXX/flows/XXXXXXX/runs/XXXXXX",
"TYPE": "Microsoft.ProcessSimple/environments/flows/runs",
"START": "2024-04-17T09:49:45.8327243Z",
"END": "2024-04-17T09:50:46.3459275Z",
"STATUS": "Succeeded"
}
]
}
This is my first time using KQL. I asked GPT a lot, but nothing really worked. My last attempt was to go with mv-apply instead of mv-expand:
source
| extend parsedJson = parse_json(body)
| mv-apply parsedItem = parsedJson on
(
project
TimeGenerated = todatetime(parsedItem['START']), // Convert 'START' to DateTime
Name = tostring(parsedItem['NAME']),
ID = tostring(parsedItem['ID']),
Type = tostring(parsedItem['TYPE']),
StartTime = tostring(parsedItem['START']),
EndTime = tostring(parsedItem['END']),
Status = tostring(parsedItem['STATUS'])
)
Still no luck; it throws a mismatch error:
Error occurred while compiling query in query: SyntaxError:0x00000003 at 3:11 : mismatched input 'parsedItem' expecting {<EOF>, ';', '|', '.', '*', '[', '=~', '!~', 'notcontains', 'containscs', 'notcontainscs', '!contains', 'contains_cs', '!contains_cs', 'nothas', 'hascs', 'nothascs', '!has', 'has_cs', '!has_cs', 'startswith', '!startswith', 'startswith_cs', '!startswith_cs', 'endswith', '!endswith', 'endswith_cs', '!endswith_cs', 'matches regex', '/', '%', '+', '-', '<', '>', '<=', '>=', '==', '<>', '!=', 'and', 'between', 'contains', 'has', 'in', '!between', '!in', 'or'}
It seems that inside the 'body' element, each segment is a number, and I believe this is why it's hurting me so much!
You can use the below design in Logic Apps to send the data and create a table from the custom JSON.
First, take your input JSON in a Compose action.
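In code view, the Compose step corresponds roughly to the following action. The action name and exact layout here are assumptions rather than the author's screenshots; the inputs are simply the sample payload from the question (shortened to one item):
"Compose": {
    "type": "Compose",
    "inputs": {
        "body": [
            {
                "NAME": "XXXXX",
                "ID": "/providers/Microsoft.ProcessSimple/environments/XXXXXXX/flows/XXXXXXX/runs/XXXXX",
                "TYPE": "Microsoft.ProcessSimple/environments/flows/runs",
                "START": "2024-04-23T21:59:59.8317555Z",
                "END": "2024-04-23T22:23:08.8817048Z",
                "STATUS": "Succeeded"
            }
        ]
    },
    "runAfter": {}
}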
Then add a Parse JSON action with the following schema:
{
"type": "object",
"properties": {
"body": {
"type": "array",
"items": {
"type": "object",
"properties": {
"NAME": {
"type": "string"
},
"ID": {
"type": "string"
},
"TYPE": {
"type": "string"
},
"START": {
"type": "string"
},
"END": {
"type": "string"
},
"STATUS": {
"type": "string"
}
},
"required": [
"NAME",
"ID",
"TYPE",
"START",
"END",
"STATUS"
]
}
}
}
}
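In code view, the Parse JSON action just takes the Compose output as its content; the "schema" property holds the full schema shown above (abbreviated here to keep the sketch short), and the action names are assumptions:
"Parse_JSON": {
    "type": "ParseJson",
    "inputs": {
        "content": "@outputs('Compose')",
        "schema": { "type": "object" }
    },
    "runAfter": {
        "Compose": [ "Succeeded" ]
    }
}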
Next, create a connection for the Azure Log Analytics Data Collector connector and add its Send Data action, taking the values from the Parse JSON output.
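In code view, the Send Data action looks roughly like this; the custom log name (FlowRuns), the connection reference, and the exact body expression are assumptions for illustration, not copied from the original Logic App:
"Send_Data": {
    "type": "ApiConnection",
    "inputs": {
        "host": {
            "connection": {
                "name": "@parameters('$connections')['azureloganalyticsdatacollector']['connectionId']"
            }
        },
        "method": "post",
        "path": "/api/logs",
        "headers": {
            "Log-Type": "FlowRuns"
        },
        "body": "@{body('Parse_JSON')?['body']}"
    },
    "runAfter": {
        "Parse_JSON": [ "Succeeded" ]
    }
}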
Output: after a successful run of the Logic App, the custom table got created in Log Analytics. Custom logs sent through the Data Collector API get a _CL suffix, so with the log name assumed above the records would land in a table such as FlowRuns_CL, which you can then query with KQL.