I have a dump of a SQL Server database table that contains only data: one long list of INSERT statements. The file is about 10 GB, and when I try to import it with Invoke-Sqlcmd or SQL Server Management Studio it fails with the message "Not enough memory". I therefore split the file into several smaller files of 250 MB each. All lines are complete, so the splitting left no half INSERT statements at the beginning or end of any file.
When I use PowerShell to import the data, the first file imports without problems.
Invoke-Sqlcmd -ServerInstance myserver\instance -Database mydatabase -InputFile "C:\temp\files\dbo.Data.00.sql"
Whenever I try to import the next file I get the following error message.
Invoke-Sqlcmd : Could not find stored procedure 'I'.
At line:1 char:1
+ Invoke-Sqlcmd -ServerInstance myserver\instance -Database mydatabase -I ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (:) [Invoke-Sqlcmd], SqlPowerShellSqlExecutionException
    + FullyQualifiedErrorId : SqlError,Microsoft.SqlServer.Management.PowerShell.GetScriptCommand
The error mentions a stored procedure that could not be found, but the file contains only INSERT statements. I also tried specifying the database name before the first INSERT statement, but that does not change the result.
USE [mydatabase]
Any ideas what is going wrong here?
I managed to work around the issue by copying the dump file to a Linux host and using the following command to split it into files of 250,000 lines each.
split -l 250000 dbo.Data.sql
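A small demonstration of how `split -l` behaves (the file names and chunk size here are scaled-down stand-ins for the real 250,000-line split; `split`'s default two-letter suffixes `aa`, `ab`, … are assumed):

```shell
# Create a tiny stand-in "dump" of 10 lines, then split it into
# 3-line chunks with the prefix "chunk." instead of the default "x".
seq 1 10 > dump.sql
split -l 3 dump.sql chunk.

ls chunk.*            # chunk.aa chunk.ab chunk.ac chunk.ad
wc -l < chunk.aa      # 3
```

Because `split -l` cuts only at line boundaries, concatenating the chunks back together reproduces the original file exactly, which is why no INSERT statement ends up torn in half.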
There was still a problem with the split files: every file except the first contained NUL characters between each character. This suggests the original dump is UTF-16 encoded and only the first split file keeps the byte-order mark, so the later files are read as if they were single-byte text.
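A quick sketch of why those NUL bytes show up, assuming the dump is UTF-16LE encoded: each ASCII character is stored as its byte followed by `0x00`, so a tool that reads the file as single-byte text sees a NUL after every character (and would stop at `'I'` when parsing `INSERT`, matching the error above):

```shell
# Encode the word INSERT as UTF-16LE and inspect the raw bytes.
printf 'INSERT' | iconv -f UTF-8 -t UTF-16LE > demo.bin
od -An -tx1 demo.bin      # 49 00 4e 00 53 00 45 00 52 00 54 00

# Stripping the NUL bytes recovers the plain ASCII text.
tr -d '\000' < demo.bin   # INSERT
```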
I removed the NUL characters using the solution from Removing "NUL" characters, by executing the following command for every split file except the first one.
tr < xab -d '\000' > xab.dbo.Data.sql
tr < xac -d '\000' > xac.dbo.Data.sql
etc...
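The per-file `tr` commands above can be collapsed into one loop. This is a sketch only: it fabricates two tiny split files so it is self-contained, and it assumes `split`'s default output names `xaa`, `xab`, … with `xaa` being the clean first file:

```shell
# Fabricate stand-ins for the split files (assumed names xaa, xab):
printf 'USE [mydatabase]\n' > xaa                   # first file: no NULs
printf 'INSERT' | iconv -f UTF-8 -t UTF-16LE > xab  # NUL-laden file

# Strip NUL bytes from every split file except the first.
for f in x??; do
  [ "$f" = xaa ] && continue                  # xaa is already clean
  tr -d '\000' < "$f" > "$f.dbo.Data.sql"
done
```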