azure-data-factory, azure-sql

How to import a large column (>4000 characters) from a CSV into a database table


We have a large CSV file with a couple of columns. One of the columns holds large values, often spanning more than 4000 characters, which get truncated when importing using a data flow.

  1. ADF data type is String
  2. SQL data type is NVARCHAR(MAX) (see the table sketch after this list)
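
For reference, a minimal sketch of what the destination table could look like; the table and column names (`dbo.ImportTarget`, `LargeText`) are placeholders, not taken from the original question:

```sql
-- Hypothetical destination table; names are placeholders.
-- NVARCHAR(MAX) can hold up to ~2 GB per value, so strings longer
-- than 4000 characters are not truncated at the column level.
CREATE TABLE dbo.ImportTarget (
    Id        INT IDENTITY(1,1) PRIMARY KEY,
    LargeText NVARCHAR(MAX) NULL  -- receives the >4000-character CSV field
);
```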

What approaches would allow the data to be imported successfully without truncation?


Solution

  • You can use a plain Copy activity to load the data from the CSV file into SQL. We reproduced the scenario with a file column longer than 5,000 characters and an NVARCHAR(MAX) destination column, and the copy succeeded without the data being truncated.

    And @LeonYue, NVARCHAR(MAX) can store well beyond 4,000 characters, which is a whole separate discussion, as mentioned by @Dhruv.
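
    If you want to confirm after the copy that nothing was truncated, a quick check along these lines could help; the table and column names (`dbo.ImportTarget`, `LargeText`) are the same placeholders as in the sketch above, not from the original post:

```sql
-- Hypothetical post-load check; names are placeholders.
-- If LongestValue exceeds 4000, values were written intact past
-- the NVARCHAR(4000) boundary, i.e. without truncation.
SELECT
    COUNT(*)            AS RowsOver4000Chars,
    MAX(LEN(LargeText)) AS LongestValue
FROM dbo.ImportTarget
WHERE LEN(LargeText) > 4000;
```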