azure, azure-data-factory, azure-storage, azcopy, azure-file-share

How to copy file share data between storage accounts in different subscriptions (scenario given in description)


In the same Azure tenant, we have two storage accounts in two different subscriptions. The file share storage account in the first subscription has data in several file shares (for example, fileshare1uat).

We need to move the data to the file share storage account in the second subscription, which has corresponding file shares (for example, fileshare1stg).

In simple terms, we have to move the data from fileshare1uat to fileshare1stg, and likewise for the remaining file shares.

The command I tried is:

azcopy copy 'https://stacc1uat.file.core.windows.net/fileshare1uat/*?sv=2022-11-02&ss=f&srt=sco&sp=rwdlc&se=2024-04-18T07:48:11Z&st=2024-04-17T23:48:11Z&spr=https&sig=Afcze2DZGJ5fSR2KZF5uoTLK%2Fn%2BQRP9YGxDgoFK44Jc%3D' 'https://stacc1stg.file.core.windows.net/fileshare1stg/?sv=2022-11-02&ss=f&srt=sco&sp=rwlc&se=2024-04-18T07:50:41Z&st=2024-04-17T23:50:41Z&spr=https&sig=egUK64ONwPxKrU026tSxgubAtzdYiD75MW8pIDZrlZc%3D' --recursive=true

Error: fatal: from-to argument required, PipeBlob (upload) or BlobPipe (download) is acceptable

I also couldn't find any article on implementing the above with ADF, i.e. copying multiple file shares from the source subscription's storage account to specific file shares in the destination subscription's storage account.

Please help me.


Solution

  • You can achieve this requirement in ADF as described below.

    For this, you need an array that maps each source file share name to its target file share name.

    [
        {
            "source_fs": "rakeshfileshare1",
            "target_fs": "rakeshtarget1"
        },
        {
            "source_fs": "rakeshfileshare2",
            "target_fs": "rakeshtarget2"
        }
    ]

    In ADF, go to Manage -> Linked services and create two new linked services of the file share (Azure File Storage) type: one for the source subscription's storage account and one for the target subscription's storage account. Give any sample file share name for now and click Create.


    After creating the two linked services, open each one and click {} to edit the linked service JSON.

    Add the following property to the JSON:

    "parameters": {
                "filesharename": {
                    "type": "string"
                }
            }
    

    Then replace your sample file share name with the expression below:

    @{linkedService().filesharename}
    

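    For reference, after these changes the source linked service JSON should look roughly like the sketch below. The name SourceFileShareLS and the connection-string authentication are only illustrative assumptions; keep whatever authentication your linked service already uses.

    {
        "name": "SourceFileShareLS",
        "properties": {
            "type": "AzureFileStorage",
            "parameters": {
                "filesharename": {
                    "type": "string"
                }
            },
            "typeProperties": {
                "connectionString": "<source storage account connection string>",
                "fileShare": "@{linkedService().filesharename}"
            }
        }
    }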

    Click Apply, and do the same for the second linked service.

    Both linked services are now parameterized.


    Using this parameter, we can change the file share name at runtime and copy from multiple source file shares to multiple target file shares.

    Now, create two Binary datasets (source and target) of the file share type, and assign one of the linked services above to each.

    In the source dataset, create a string-type parameter named filesharename.


    Pass this parameter to the linked service parameter as @dataset().filesharename.


    Leave the remaining fields empty.

    Do the same for the other (target) Binary dataset.
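
    As a rough sketch, the source Binary dataset JSON ends up similar to the following (the dataset and linked service names are placeholders; the target dataset is identical apart from referencing the target linked service):

    {
        "name": "SourceBinaryDS",
        "properties": {
            "type": "Binary",
            "linkedServiceName": {
                "referenceName": "SourceFileShareLS",
                "type": "LinkedServiceReference",
                "parameters": {
                    "filesharename": {
                        "value": "@dataset().filesharename",
                        "type": "Expression"
                    }
                }
            },
            "parameters": {
                "filesharename": {
                    "type": "string"
                }
            },
            "typeProperties": {
                "location": {
                    "type": "AzureFileStorageLocation"
                }
            }
        }
    }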

    Now, create a pipeline with a ForEach activity. In the pipeline parameters section, create an array-type parameter and set the file share mapping list above as its default value.

    Use this parameter in the ForEach Items expression.

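    Assuming the pipeline parameter is named fileshare_list (any name works), the Items expression of the ForEach activity would be:

    @pipeline().parameters.fileshare_list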

    Inside the ForEach activity, add a Copy activity and use the Binary datasets as source and sink. Pass @item().source_fs to the source dataset's filesharename parameter and use a wildcard file path, as sketched below.

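    In JSON terms, the source side of the Copy activity's typeProperties corresponds to a fragment roughly like this (the wildcard values are an assumption meaning "copy everything"; adjust them to your folder layout):

    "source": {
        "type": "BinarySource",
        "storeSettings": {
            "type": "AzureFileStorageReadSettings",
            "recursive": true,
            "wildcardFolderPath": "*",
            "wildcardFileName": "*"
        }
    }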

    In the sink, pass @item().target_fs to the target dataset's filesharename parameter and select Preserve hierarchy as the copy behavior.

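    The sink side of the Copy activity corresponds to a fragment roughly like the sketch below:

    "sink": {
        "type": "BinarySink",
        "storeSettings": {
            "type": "AzureFileStorageWriteSettings",
            "copyBehavior": "PreserveHierarchy"
        }
    }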

    Now debug the pipeline; in each iteration it copies the data of one source file share to its target file share, as defined in the file share list.
