I am trying to create a data pipeline in ADF, where a copy activity extracts data from a REST API and stores it in an Azure SQL database.
The REST API I am trying to extract from has the column names and the values in separate arrays. Following is a sample of what the API response looks like:
{
  "dataset": {
    "id": num_id,
    "dataset_code": "CodeX",
    "database_code": "BSE",
    "name": "company",
    "description": "company description",
    "refreshed_at": "2023-09-01 14:25:40 UTC",
    "newest_available_date": "2020-03-20",
    "oldest_available_date": "2010-03-18",
    "column_names": [
      "Date",
      "Open",
      "High",
      "Low",
      "Close"
    ],
    "frequency": "daily",
    "type": "Time Series",
    "premium": false,
    "limit": null,
    "transform": null,
    "column_index": null,
    "start_date": "2010-03-18",
    "end_date": "2020-03-20",
    "data": [
      [
        "2020-03-20",
        4.5,
        4.7,
        4.45,
        4.48
      ],
      [
        "2020-03-19",
        4.55,
        4.67,
        4.15,
        4.5
      ]
    ],
    "collapse": null,
    "order": null,
    "database_id": 3287
  }
}
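In other words, the reshape I need is to pair each inner array in dataset.data with the names in column_names. A minimal Python sketch of that pairing, outside ADF, just to show the target shape (the payload below is trimmed to the two relevant fields):

    # Trimmed to the two fields that matter for the reshape
    payload = {
        "dataset": {
            "column_names": ["Date", "Open", "High", "Low", "Close"],
            "data": [
                ["2020-03-20", 4.5, 4.7, 4.45, 4.48],
                ["2020-03-19", 4.55, 4.67, 4.15, 4.5],
            ],
        }
    }

    names = payload["dataset"]["column_names"]
    # One dict per inner array of dataset.data, keyed by column name
    rows = [dict(zip(names, values)) for values in payload["dataset"]["data"]]
    # rows[0] -> {'Date': '2020-03-20', 'Open': 4.5, 'High': 4.7, 'Low': 4.45, 'Close': 4.48}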
When I click on Import schema, it gives me all the fields, including the dataset description fields. Following is a snapshot of the imported schema.
So in Collection reference I selected the "$['dataset']['data']" option to get only the values. But I am getting an import schema error saying "Failed" with the description "couldn't understand path 0".
EDIT:
I want the output saved to my Azure SQL sink to look as follows:

Date        Open  High  Low   Close
2020-03-20  4.5   4.7   4.45  4.48
2020-03-19  4.55  4.67  4.15  4.5
And the desired schema is:

Date   type: date
Open   type: decimal
Close  type: decimal
High   type: decimal
Low    type: decimal
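In Python terms, the casts to those sink types would look like this (a sketch only; to_typed is a hypothetical helper, and rows comes from the snippet above):

    from datetime import date
    from decimal import Decimal

    def to_typed(row):
        # Cast to the desired sink schema: Date -> date, the four prices -> decimal
        return {
            "Date":  date.fromisoformat(row["Date"]),
            "Open":  Decimal(str(row["Open"])),
            "Close": Decimal(str(row["Close"])),
            "High":  Decimal(str(row["High"])),
            "Low":   Decimal(str(row["Low"])),
        }

    typed_rows = [to_typed(r) for r in rows]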
I have attached a snapshot of the error message I am getting while importing the schema.
EDIT 2: Here is a snapshot of the projection.
Looking at the data you provided: the data array contains nested arrays holding single values, not objects. Because of that, when you try to import the schema you get the error:
couldn't understand path 0
To resolve this, you need to use the Data flow activity, where you can flatten this dataset object and unroll the array according to your requirement.
Check the below steps:
First, add the REST source to your data flow activity.
Take the Flatten transformation and set both Unroll by and Unroll root to dataset.data, then map the column as below.
It will give you an array of data as below:
After this, take a Derived column transformation to transform the array into individual column values.
Output: the sink gets one row per date, with Date, Open, High, Low and Close as separate columns.
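If you want to sanity-check the same transform end to end outside ADF, here is a rough Python sketch that pushes the typed rows from the earlier snippets into the sink with pyodbc (the connection string and the dbo.StockPrices table name are hypothetical, and the table is assumed to already exist with the schema above):

    import pyodbc

    # Hypothetical connection details; replace with your Azure SQL server, database and credentials
    conn = pyodbc.connect(
        "Driver={ODBC Driver 18 for SQL Server};"
        "Server=tcp:<server>.database.windows.net,1433;"
        "Database=<database>;Uid=<user>;Pwd=<password>;Encrypt=yes;"
    )
    cursor = conn.cursor()
    # Open and Close are T-SQL keywords, so all column names are bracketed for safety
    cursor.executemany(
        "INSERT INTO dbo.StockPrices ([Date], [Open], [Close], [High], [Low]) "
        "VALUES (?, ?, ?, ?, ?)",
        [(r["Date"], r["Open"], r["Close"], r["High"], r["Low"]) for r in typed_rows],
    )
    conn.commit()
    conn.close()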