power-automate, powerapps, powerapps-canvas, power-platform, powerapps-modeldriven

How to prepare complex JSON payload with dynamic attributes in Power Automate Flow


I have a complex problem in Power Automate. Please help!

I'm using two tables, 'new_demotable001' and 'new_metadatatable': the first triggers the flow and the second serves as a mapping sheet.

Using both, I want to build a JSON payload that I'll use to call an external API. Please help me create this payload.

Details / Prerequisites

  1. When a new row is added to new_demotable001, the flow triggers.
  2. The data/attributes we receive from this table are shown below (I've removed unnecessary attributes):
{
  "statecode": 0,
  "_statecode_label": "Active",
  "statuscode": 1,
  "_statuscode_label": "Active",
  "new_empstatus": 100000001,
  "new_empname": "EmpName909",
  "new_empcontact": "9090990909090909",
  "new_empexperience": "100000000,100000001,100000002"
}
  3. The flow then fetches details from the 'new_metadatatable' table using "List rows".
  4. The data/attributes we receive from this table are shown below (I've removed unnecessary attributes):
{
  "@Microsoft.Dynamics.CRM.totalrecordcount": -1,
  "@Microsoft.Dynamics.CRM.totalrecordcountlimitexceeded": false,
  "@Microsoft.Dynamics.CRM.globalmetadataversion": "11353407",
  "value": [
    {
      "new_sirionattributetype": "TEXT",
      "new_sirionattributename": "title",
      "new_dynamicsattributename": "new_empname"
    },
    {
      "new_sirionattributetype": "NUMERIC",
      "new_sirionattributename": "test1",
      "new_dynamicsattributename": "new_empcontact"
    },
    {
      "new_sirionattributetype": "SINGLE_REF",
      "new_sirionattributename": "orgId",
      "new_dynamicsattributename": "new_empstatus"
    },
    {
      "new_sirionattributetype": "MULTI_REF",
      "new_sirionattributename": "sirionExp",
      "new_dynamicsattributename": "new_empexperience"
    }
  ]
}
  5. Now I want to create a JSON payload from the attributes above. The payload should only use attributes that are present in 'new_metadatatable' and also come in from 'new_demotable001'. The condition I used: contains(triggerOutputs()?['body'], items('Apply_to_each')?['new_dynamicsattributename'])

  6. The 'new_sirionattributetype' value defines the structure of the expected output payload: for TEXT or NUMERIC the value is written as a plain JSON property, for SINGLE_REF the value becomes an object {}, and for MULTI_REF the value becomes an array of objects [].

  7. For the above set of data, the expected output payload is:

{
    "data": {
        "orgId": {
            "id": 100000001
        },
        "test1": "9090990909090909",
        "title": "EmpName909",
        "sirionExp": [
            {
                "id": 100000000
            },
            {
                "id": 100000001
            },
            {
                "id": 100000002
            }
        ]
    }
}

Please help me with this complex problem.

Also, please suggest whether this can be achieved by any other method, for example with a script.


Solution

  • Here's a simple architecture that works for any payload, irrespective of the fields, values, or number of values coming in.

    You can merge these into a single loop and use conditions for branching; however, I have kept them separate to make it easier to follow what's going on.

    What I did:

    1. Filter out the metadata rows that have a corresponding value in the 'data' (trigger) table (see the condition sketch after this list).
    2. Create three groups: text & numbers, objects, and arrays.
    3. Merge text and number values as they are.
    4. Convert objects into the required schema using a Compose action.
    5. Convert arrays into arrays of the required schema by splitting the items on ','.
    6. Merge everything into one object with the addProperty() function. Make sure the loop's concurrency is set to 1, otherwise properties will be overwritten.
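
    As a reference, here is a minimal sketch of the step 1 check, assuming the loop over the metadata rows is called 'Apply_to_each' (substitute your own action name). contains() on the trigger body checks whether the Dataverse record actually carries that column, so metadata rows without a matching attribute are skipped:

    contains(triggerOutputs()?['body'], items('Apply_to_each')?['new_dynamicsattributename'])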

    Here's the final output:

    (screenshot of the final output)

    Here are the detailed steps for each group.

    Extracting data from text and number datatypes
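
    A minimal sketch of the merge expression for this group, assuming an Object variable named 'payload' (initialised to {}) and a loop named 'Apply_to_each_TEXT' over the TEXT/NUMERIC metadata rows; both names are placeholders, not the exact actions from the screenshots. The expression is used inside the loop's Compose, whose output is then written back to 'payload' (see the note after the group sections):

    addProperty(variables('payload'), items('Apply_to_each_TEXT')?['new_sirionattributename'], triggerOutputs()?['body']?[items('Apply_to_each_TEXT')?['new_dynamicsattributename']])

    This copies the trigger value as-is under the Sirion attribute name, e.g. "title": "EmpName909".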

    Extracting data from object datatypes
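
    For SINGLE_REF attributes, a Compose can wrap the value in an object before it is merged. A sketch, again with a placeholder loop name 'Apply_to_each_REF':

    json(concat('{ "id": ', triggerOutputs()?['body']?[items('Apply_to_each_REF')?['new_dynamicsattributename']], ' }'))

    The result is then merged into 'payload' with addProperty(), just like the text/number group, producing e.g. "orgId": { "id": 100000001 }.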

    Extracting data from array datatypes
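
    For MULTI_REF attributes, one way (a sketch, with placeholder action names) is to split the comma-separated string, map each piece to an object with a 'Select' action, and then merge the resulting array:

    split(triggerOutputs()?['body']?[items('Apply_to_each_MULTI')?['new_dynamicsattributename']], ',')

    In the 'Select' action, map each item with: json(concat('{ "id": ', item(), ' }'))

    Then merge the result with: addProperty(variables('payload'), items('Apply_to_each_MULTI')?['new_sirionattributename'], body('Select'))

    This turns "100000000,100000001,100000002" into [ { "id": 100000000 }, { "id": 100000001 }, { "id": 100000002 } ].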

    Don't get confused by the "Compose" inside each loop: it acts as temporary storage for the JSON variable, because Power Automate does not allow a "Set variable" action to reference the variable it is setting (so you can't append to the variable directly).
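
    The pattern inside each loop therefore looks like this (a sketch; 'Compose_merge' is a placeholder name):

    Compose 'Compose_merge': addProperty(variables('payload'), items('Apply_to_each')?['new_sirionattributename'], <value built for this group>)
    Set variable 'payload': outputs('Compose_merge')

    With the loop concurrency set to 1, each iteration sees the properties added by the previous one. To get the final shape from the question, you can wrap the variable at the end, e.g. setProperty(json('{}'), 'data', variables('payload')).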