From the solution of the following question: Copy data activity failing when entering the table name dynamically in sink dataset
The ask is how to modify that pipeline to copy multiple tables at a time in a similar scenario, like:
Dev subscription (devstacuks01)
UAT subscription (uatstacuks01)
devtable01 data --> uattable01
devtable02 data --> uattable02
For this, you need a list of source table names and their corresponding target table names, like below.
[
    {
        "source_table": "mytable1",
        "target_table": "target1"
    },
    {
        "source_table": "mytable2",
        "target_table": "target2"
    }
]
In the ADF pipeline, create an array parameter and set the above list as its default value.
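For reference, a minimal sketch of how that parameter could look in the pipeline JSON is below; the pipeline name CopyTablesPipeline and the parameter name table_list are placeholders I am assuming here, so use whatever names you gave yours:

{
    "name": "CopyTablesPipeline",
    "properties": {
        "parameters": {
            "table_list": {
                "type": "array",
                "defaultValue": [
                    { "source_table": "mytable1", "target_table": "target1" },
                    { "source_table": "mytable2", "target_table": "target2" }
                ]
            }
        }
    }
}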
You have mentioned that your storage accounts are in different subscriptions, so you cannot use a single linked service. You need to create two linked services of the Azure Table storage type: one for the source storage account and another for the target storage account. First, create the linked service for the source subscription's storage account.
Similarly, create another linked service for the target subscription's storage account.
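As a rough sketch, a Table storage linked service definition looks something like the following; the name LS_Source_TableStorage is a placeholder, the account name devstacuks01 is taken from your scenario, and in practice you would reference the account key from Azure Key Vault rather than pasting it inline:

{
    "name": "LS_Source_TableStorage",
    "properties": {
        "type": "AzureTableStorage",
        "typeProperties": {
            "connectionString": "DefaultEndpointsProtocol=https;AccountName=devstacuks01;AccountKey=<account-key>;EndpointSuffix=core.windows.net"
        }
    }
}

The linked service for the target account is the same apart from its name and connection string.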
Next, you need to create datasets. Here as well, you need two datasets: one for the source linked service and another for the target linked service.
As you want to copy multiple tables, you need to parameterize the table names in the datasets.
Create a string parameter table_name
and use it in an expression for the Table property of the dataset: @dataset().table_name.
Do the same for the target dataset as well; a sketch follows below.
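For illustration, a parameterized source dataset could be defined roughly as below; DS_Source_Table is a placeholder name, and the target dataset is identical except that it references the target linked service:

{
    "name": "DS_Source_Table",
    "properties": {
        "type": "AzureTable",
        "linkedServiceName": {
            "referenceName": "LS_Source_TableStorage",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "table_name": { "type": "string" }
        },
        "typeProperties": {
            "tableName": {
                "value": "@dataset().table_name",
                "type": "Expression"
            }
        }
    }
}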
Now, in the pipeline, take a ForEach activity and pass the array parameter that you created earlier to the ForEach activity's Items expression.
Inside the ForEach, take a Copy activity with the above datasets as the source and sink. Pass the table names @item().source_table
and @item().target_table
from the ForEach loop to the dataset parameters in the source and sink of the Copy activity respectively.
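Putting it together, the ForEach activity inside the pipeline could look roughly like this; table_list, DS_Source_Table, and DS_Target_Table are the placeholder names assumed in the sketches above:

{
    "name": "ForEachTable",
    "type": "ForEach",
    "typeProperties": {
        "items": {
            "value": "@pipeline().parameters.table_list",
            "type": "Expression"
        },
        "activities": [
            {
                "name": "CopyTable",
                "type": "Copy",
                "inputs": [
                    {
                        "referenceName": "DS_Source_Table",
                        "type": "DatasetReference",
                        "parameters": { "table_name": "@item().source_table" }
                    }
                ],
                "outputs": [
                    {
                        "referenceName": "DS_Target_Table",
                        "type": "DatasetReference",
                        "parameters": { "table_name": "@item().target_table" }
                    }
                ],
                "typeProperties": {
                    "source": { "type": "AzureTableSource" },
                    "sink": { "type": "AzureTableSink" }
                }
            }
        ]
    }
}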
Now, debug the pipeline; in each iteration, the data from one source table will be copied to its corresponding target table.