azure | azure-pipelines | azure-data-factory

How to set dynamic content for Compression type in a copy activity


I created a dynamic dataset and linked service, and I use that dataset in a copy activity. I want to make the Compression type option dynamic, i.e. pass it as a parameter from the copy activity. When I click on Compression type there is a drop-down menu with an "Add dynamic content" option, but I cannot figure out how to use it: when I choose the dynamic option from the drop-down, I am unable to add the dynamic content.

I implemented the answer in Azure and ran into some issues. Can you please have a look?

First, I created two parameters.

After creating the parameters I edited the dataset JSON as well.

Then I passed the parameter values as strings.

But the main issue is that I am unable to get these parameters into the dataset.


Solution

  • This behavior might be a bug in ADF datasets. You can achieve your requirement by editing the dataset JSON directly, as shown below.

    First, create the required dataset parameters (comp_type and comp_level in this example).


    Now, open the dataset JSON by clicking the {} icon at the top-right corner of the dataset and edit the JSON to add the dynamic content expressions.

    Here, I have used a Binary dataset and edited its JSON as shown below.

    {
        "name": "Binary_source_compression",
        "properties": {
            "linkedServiceName": {
                "referenceName": "ADLS_nov20th_test",
                "type": "LinkedServiceReference"
            },
            "parameters": {
                "comp_type": {
                    "type": "string"
                },
                "comp_level": {
                    "type": "string"
                }
            },
            "annotations": [],
            "type": "Binary",
            "typeProperties": {
                "location": {
                    "type": "AzureBlobFSLocation",
                    "fileName": "mainzip.zip",
                    "fileSystem": "zipsinputcon"
                },
                "compression": {
                    "type": {
                        "value": "@dataset().comp_type",
                        "type": "Expression"
                    },
                    "level": {
                        "value": "@dataset().comp_level",
                        "type": "Expression"
                    }
                }
            }
        }
    }
    


    Click OK, and the dataset will now show the dynamic expressions for these properties at the dataset level.


    Give the required values to these parameters in the copy activity.

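    For reference, this step corresponds to the following fragment of the pipeline JSON, where the copy activity's input dataset reference carries the parameter values. This is only a sketch: the values ZipDeflate and Optimal are example assumptions (use whichever compression type and level your files need), and the rest of the copy activity (source, sink, outputs) is omitted.

    "inputs": [
        {
            "referenceName": "Binary_source_compression",
            "type": "DatasetReference",
            "parameters": {
                "comp_type": "ZipDeflate",
                "comp_level": "Optimal"
            }
        }
    ]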

    Run the pipeline and it will give the desired results.

    UPDATE:

    For a CSV (DelimitedText) dataset, you can edit the dataset JSON as below.

            "compressionCodec": {
                    "value": "@dataset().comp",
                    "type": "Expression"
                },
                "compressionLevel": {
                    "value": "@dataset().comp_level",
                    "type": "Expression"
                }
    

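    In context, a DelimitedText dataset with these two properties parameterised would look roughly like the sketch below. This is an assumption-based example rather than the exact dataset from the screenshots: the dataset name, file location and the comp / comp_level parameter names are placeholders carried over from the snippet above, so adjust them to your own dataset and linked service.

    {
        "name": "DelimitedText_source_compression",
        "properties": {
            "linkedServiceName": {
                "referenceName": "ADLS_nov20th_test",
                "type": "LinkedServiceReference"
            },
            "parameters": {
                "comp": {
                    "type": "string"
                },
                "comp_level": {
                    "type": "string"
                }
            },
            "annotations": [],
            "type": "DelimitedText",
            "typeProperties": {
                "location": {
                    "type": "AzureBlobFSLocation",
                    "fileName": "sample.csv.gz",
                    "fileSystem": "zipsinputcon"
                },
                "columnDelimiter": ",",
                "firstRowAsHeader": true,
                "compressionCodec": {
                    "value": "@dataset().comp",
                    "type": "Expression"
                },
                "compressionLevel": {
                    "value": "@dataset().comp_level",
                    "type": "Expression"
                }
            },
            "schema": []
        }
    }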