I have a database delta .sql file that contains updates to tables in a MySQL database. I understand that Data Factory can execute SQL statements, and I could parse the .sql file as a CSV, split on a ';' delimiter, to get the individual SQL statements. However, my delta file contains a large number of statements, and looping over them in ADF would mean calling such activities potentially hundreds of thousands of times. Is it possible to execute the statements in bulk?
Reference to an implementation that iterates over the SQL statements: How to process a Database Delta sql file in Azure Data Factory to update a MySQL database
Or, if there is no efficient way to update by iterating over the .sql file, is the only sensible approach a function app to execute the SQL?
Many Thanks.
You can run all the queries in the .sql file in a single batch. Below is the approach.
First, take a Lookup activity with a delimited text dataset pointing to the .sql file. Give it a column delimiter and a row delimiter that do not occur anywhere in the file itself, so that all the queries end up in a single row and column. In this demo, I gave & as the delimiter. If you use Azure Database for MySQL,
then you can take a Script activity and give the below expression. If you use a MySQL database, you can use a Lookup activity to execute the SQL script instead. Give the below expression in the query text box of the Lookup activity/Script activity:

@concat(activity('Lookup1').output.value[0].Prop_0,'select 1 as col1')
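As a rough sketch, the delimited text dataset feeding the Lookup could be defined like this, with & as both the column and row delimiter. The dataset name, linked service, container, and file name here are illustrative, and & is assumed not to appear anywhere in your scripts:

```json
{
    "name": "DeltaSqlText",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "SourceBlobStorage",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "scripts",
                "fileName": "delta.sql"
            },
            "columnDelimiter": "&",
            "rowDelimiter": "&",
            "firstRowAsHeader": false
        }
    }
}
```

Because neither delimiter ever matches, the Lookup reads the whole file as one row with one column (Prop_0).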
If select 1 as col1 is not concatenated with the output of the Lookup activity, ADF will throw an error, because a Lookup activity expects the query to return data.
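For illustration, with a hypothetical two-statement delta file, the query text the expression hands to the activity would look like this (the UPDATE statements stand in for whatever your delta file contains):

```sql
UPDATE customers SET status = 'active' WHERE id = 10;
UPDATE orders SET shipped = 1 WHERE order_id = 55;
select 1 as col1
```

Everything before the final line comes from Prop_0 (the single column the Lookup read from the file); the trailing select 1 as col1 exists only so the activity has a row to return.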