azure-devops, azure-sql-database

How to insert CSV file from Azure Repo to Azure SQL Database via Release Pipeline?


I have a csv file stored in an Azure Repo and would like to insert its contents into an Azure SQL database via a release pipeline. Is this technically feasible? I keep hitting the error "Operating system error code 997 (Overlapped I/O operation is in progress.)".

Sample CSV data:

"True", "123,234325", "abc"

Here is the script that I used:

BULK INSERT TargetTableName
FROM 'ArtifactAlias\foldername\filename.csv'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIELDQUOTE = '"'
)

Solution

  • Update

    Per your follow-up, and given the requirement to preserve the commas that may appear inside the fields of your csv, I would suggest running the PowerShell script below in the agent job to generate a new csv file with no header row, where the current delimiter (,) is replaced with another one (| in my case) that won't conflict with the data in the fields; the script also converts the True/False field to 1/0 for the import.

    SampleTable

    CREATE TABLE [dbo].[SampleTable] (
        [IsTrueFalse] BIT NOT NULL,
        [Number] NVARCHAR(50) NOT NULL,
        [Remark] NVARCHAR(50) NOT NULL
    );
    

    sample.csv

    IsTrueFalse,Number,Remark
    "True", "123,234325", "abc"
    "False", "456,567890", "efg"
    

    noheaders.csv

    1| 123,234325| abc
    0| 456,567890| efg
    

    PowerShell script used in release pipeline

    # Define file paths
    $inputFile = "$(System.DefaultWorkingDirectory)\_azuresql\sample.csv"
    $outputFile = "$(System.DefaultWorkingDirectory)\_azuresql\noheaders.csv"
    
    # Read the input file content
    $content = Get-Content $inputFile
    
    # Process each line: replace `",` with `|`, remove quotes, and convert "True"/"False" to 1/0
    $content | Select-Object -Skip 1 | ForEach-Object {
        # Replace `",` with `|` and then remove all remaining quotes
        $line = $_ -replace '",', '|' -replace '"', ''
       
        # Convert "True"/"False" to 1/0 for the IsTrueFalse column
        $fields = $line -split '\|'
        $fields[0] = if ($fields[0] -eq 'True') { '1' } elseif ($fields[0] -eq 'False') { '0' } else { $fields[0] }
        $fields -join '|'
    } | Set-Content $outputFile
    
    echo "Checking the contents of $outputFile..."
    Get-Content $outputFile
    
    echo "Checking bcp version..."
    bcp -v
    
    echo "Running bcp command..."
    bcp SampleTable in "$outputFile" -S $(AzureSQLServer) -d $(AzureSQLDB) -U $(AzureSQLServerAdmin) -P $(AzureSQLServerAdminPWD) -q -c -t "|"  # Pipe delimiter
    

    Image

    Image
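
    After the release run, a quick query against the target table (for example from SSMS or the Azure portal query editor) can confirm that the rows landed as expected; this is just a sanity check against the SampleTable created above.

    SELECT [IsTrueFalse], [Number], [Remark]
    FROM [dbo].[SampleTable];
    -- Should return the two rows from sample.csv, with IsTrueFalse imported as 1 and 0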


    From the expression FIRSTROW = 2 in your SQL script, I can see that the first line of your csv file contains headers. You can remove that line and write the remaining contents to a new file during the pipeline agent job. Here is the updated PowerShell script for your reference.

    # Define file paths
    $inputFile = "$(System.DefaultWorkingDirectory)\_azuresql\test.csv"
    $outputFile = "$(System.DefaultWorkingDirectory)\_azuresql\noheaders.csv"
    
    # Read the input CSV file, skip the first line (header), and write the result to the output file
    Get-Content $inputFile | Select-Object -Skip 1 | Set-Content $outputFile
    
    echo "Checking bcp version..."
    bcp -v
    
    echo "Running bcp command..."
    bcp Users in "$outputFile" -S $(AzureSQLServer) -d $(AzureSQLDB) -U $(AzureSQLServerAdmin) -P $(AzureSQLServerAdminPWD) -q -c -t ","
    

    Based on your description, it appears the SQL script is executed on the SQL server side rather than on the pipeline agent machine, so it cannot reach a file path that only exists on the agent, as discussed in this thread.
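
    If you prefer to keep BULK INSERT instead of bcp, the file has to be readable by the database engine itself, which for Azure SQL Database means staging the csv in Azure Blob Storage and referencing it through an external data source. Below is a rough sketch of that route; the credential, data source, storage account and container names are placeholders rather than values from your pipeline.

    -- A database master key must exist before a scoped credential can be created, e.g.
    -- CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPassword>';

    CREATE DATABASE SCOPED CREDENTIAL BlobCredential
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
         SECRET = '<SAS token without the leading ?>';

    CREATE EXTERNAL DATA SOURCE CsvBlobStorage
    WITH (
        TYPE = BLOB_STORAGE,
        LOCATION = 'https://<storageaccount>.blob.core.windows.net/<container>',
        CREDENTIAL = BlobCredential
    );

    BULK INSERT TargetTableName
    FROM 'filename.csv'
    WITH (
        DATA_SOURCE = 'CsvBlobStorage',
        FORMAT = 'CSV',
        FIRSTROW = 2,
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n',
        FIELDQUOTE = '"'
    );

    With that in place the BULK INSERT runs entirely on the database side, so no agent-local path is involved.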

    According to the document Load data from CSV file into a database (bcp) - Azure SQL | Microsoft Learn, we can use the BCP tool to import csv contents into an Azure SQL DB table.

    Following that direction, I tested with a sample release pipeline running on the windows-latest Microsoft-hosted agent and managed to import the contents of the csv file.

    1. I created a table Users for testing in Azure SQL DB; Image

    2. Added a test.csv file to my repo with contents like below; Image

    3. In a release pipeline, added the repo as an artifact and ran the PowerShell script;

      echo "Checking bcp version..."
      bcp -v
      
      echo "Running bcp command..."
      bcp Users in "$(System.DefaultWorkingDirectory)/_azuresql/test.csv" -S $(AzureSQLServer) -d $(AzureSQLDB) -U $(AzureSQLServerAdmin) -P $(AzureSQLServerAdminPWD) -q -c -t ","
      

      Image

    4. After the release succeeded, we could check the inserted rows in the table;

      Image
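
      As an alternative to checking in the portal, a simple count query against the Users table created in step 1 also confirms the import:

      SELECT COUNT(*) AS ImportedRows FROM dbo.Users;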