I've made extensive use of BULK INSERT for onboarding and similar tasks. Unless I'm mistaken, one of the main reasons it runs so quickly is that the work is done server-side rather than by sending a large number of individual INSERT statements.
Now I'm looking at a similar problem for UPDATE. We are collecting changes to different fields in different tables from a file of the basic form TABLE/FIELD/PKEY=new value. I cannot find anything analogous to a BULK UPDATE, and looking at previous threads here, the usual suggestion seems to be to send roughly 2,000 UPDATEs at a time.
I built a test DB with 30 tables. Using BULK INSERT I can populate 1k rows in each in under 1 second. I then wrote some UPDATE code using a 2k-row batch limit (so each table in my DB was a single batch), and that took 300 seconds. This is not unexpected: each UPDATE carries its own WHERE clause, so it is bound to be more expensive.
Am I missing a higher performance solution?
> Am I missing a higher performance solution?
No, there isn't. A common technique is to use a table-valued parameter (TVP), or a temp table populated via bulk copy; both are fast to load. If the staging data is indexed on the key, the resulting joined UPDATE should also be quick.
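As a minimal sketch of the temp-table variant, assuming a hypothetical dbo.Customers table and a client-side bulk copy (e.g. SqlBulkCopy in .NET) to fill the staging table:

```sql
-- Staging table for the incoming changes; the PK gives the join an index.
CREATE TABLE #Changes
(
    CustomerID int           NOT NULL PRIMARY KEY,
    Email      nvarchar(256) NOT NULL
);

-- Load #Changes from the client with a bulk copy
-- (SqlBulkCopy, bcp, or BULK INSERT from a file).

-- One set-based UPDATE replaces thousands of singleton statements.
UPDATE t
SET    t.Email = c.Email
FROM   dbo.Customers AS t
JOIN   #Changes      AS c
  ON   c.CustomerID = t.CustomerID;
```

The same join works with a TVP declared from a user-defined table type; the temp-table version is shown here because it pairs directly with bulk copy.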
You can also send the changes as JSON and join the main table against an OPENJSON call, so the whole update happens in a single statement.
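A sketch of that approach, again with hypothetical table and column names; in practice @json would arrive as a parameter. Note that OPENJSON requires SQL Server 2016+ with database compatibility level 130 or higher:

```sql
-- Example payload; normally supplied by the client as an nvarchar(max) parameter.
DECLARE @json nvarchar(max) = N'[
    {"CustomerID": 1, "Email": "a@example.com"},
    {"CustomerID": 2, "Email": "b@example.com"}
]';

-- Shred the JSON into a rowset and update via a single join.
UPDATE t
SET    t.Email = j.Email
FROM   dbo.Customers AS t
JOIN   OPENJSON(@json)
       WITH (CustomerID int           '$.CustomerID',
             Email      nvarchar(256) '$.Email') AS j
  ON   j.CustomerID = t.CustomerID;
```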