Question:
I have a script with around 45,000 INSERT ... SELECT statements. When I try to run it, I get an error message stating that I have run out of memory. How can I get this script to run?
Context:
- Added some new data fields to make an app play nice with another app the client uses.
- Got a spreadsheet of data from the client full of data that mapped current data items to values for these new fields.
- Converted spreadsheet to insert statements.
- If I only run some of the statements it works but the entire script does not.
- No. There are no typos.
If there is a different way I should be loading this data, feel free to chastise me and let me know.
- Similar question on SO: (stackoverflow.com/questions/222442/…) Not sure if the answer helps. – jumpdart, Apr 19, 2012 at 14:24
6 Answers
The maximum batch size for SQL Server 2005 is 65,536 * Network Packet Size (NPS), where NPS is usually 4KB. That works out to 256 MB. That would mean that your insert statements would average 5.8 KB each. That doesn't seem right, but maybe there are extraneous spaces or something unusual in there.
My first suggestion would be to put a "GO" statement after every INSERT statement. This will break your single batch of 45,000 INSERT statements into 45,000 separate batches, which should be easier to digest. Be careful: if one of those inserts fails, you may have a hard time finding the culprit, so you might want to protect yourself with a transaction. You can add those GO statements quickly if your editor has a good search-and-replace (one that will let you search for and replace return characters like \r\n) or a macro facility.
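For illustration, a minimal sketch of that pattern (the table and column names are placeholders, not taken from the original script):

-- Each INSERT is followed by GO, so it is sent to the server as its own small batch.
INSERT dbo.TargetTable (SourceId, NewField) SELECT 1, 'A';
GO
INSERT dbo.TargetTable (SourceId, NewField) SELECT 2, 'B';
GO
INSERT dbo.TargetTable (SourceId, NewField) SELECT 3, 'C';
GO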
The second suggestion is to use a Wizard to import the data straight from Excel. The wizard builds a little SSIS package for you, behind the scenes, and then runs that. It won't have this problem.
- A GO after every statement? Well, I guess if you're generating them using another script that's OK. Otherwise, I'd just put one after every 1,000 INSERTs. With regards to making the transaction atomic and minimizing the size of the transaction, why not load all the rows into a temp table or table variable and then load them in one shot from there to the target table? (See the sketch after these comments.) – Nick Chammas, Apr 19, 2012 at 15:14
- A 1000 is just as good as 1, but harder to count. To be honest, he might get away with just one GO statement, at the halfway mark, near statement 21,500. I like the GO fix because it doesn't require complicated editing of the current script, or counting INSERT statements (which might not map directly to line numbers). – Darin Strait, Apr 19, 2012 at 16:00
- Surely, even a bad approximation of 1000 statements is good enough. :) – Nick Chammas, Apr 19, 2012 at 16:13
- Adding the GOs was a quick and easy fix. The 25 MB script runs in a little less than 9 minutes with no problems. I wanted to keep it as a script so it stays within our standard patch deployment process for when it goes out. – spaghetticowboy, Apr 19, 2012 at 17:58
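A minimal sketch of the staging approach suggested in the comments above, with placeholder table and column names (not from the original script):

-- Stage the spreadsheet rows in a temp table first...
CREATE TABLE #Staging (SourceId INT, NewField VARCHAR(50));
GO
INSERT #Staging (SourceId, NewField) SELECT 1, 'A';
INSERT #Staging (SourceId, NewField) SELECT 2, 'B';
GO
-- ...then load the target table in one atomic shot.
BEGIN TRANSACTION;
INSERT dbo.TargetTable (SourceId, NewField)
SELECT SourceId, NewField FROM #Staging;
COMMIT TRANSACTION;
GO
DROP TABLE #Staging;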
BULK INSERT or bcp seem more appropriate options than 45,000 insert statements.
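For illustration, a BULK INSERT from a CSV exported from the spreadsheet might look something like this (the file path, table, and delimiters are assumptions, not details from the question):

BULK INSERT dbo.TargetTable
FROM 'C:\Temp\NewFieldMappings.csv'
WITH (
    FIELDTERMINATOR = ',',  -- column delimiter in the exported file
    ROWTERMINATOR = '\n',   -- row delimiter
    FIRSTROW = 2            -- skip the header row
);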
If you need to stick with the insert statements, I would consider a few options:
A: Use transactions and wrap batches of 100 or 500 or 1000 statements in each one to minimize the impact on the log and the batch, e.g.:
BEGIN TRANSACTION;
INSERT dbo.table(a, ...) SELECT 1, ...
INSERT dbo.table(a, ...) SELECT 2, ...
...
INSERT dbo.table(a, ...) SELECT 500, ...
COMMIT TRANSACTION;
GO
BEGIN TRANSACTION;
INSERT dbo.table(a, ...) SELECT 501, ...
INSERT dbo.table(a, ...) SELECT 502, ...
...
INSERT dbo.table(a, ...) SELECT 1000, ...
COMMIT TRANSACTION;
GO
B: Instead of individual insert statements, use UNION ALL for 100 or 500 statements at a time, e.g.:
INSERT dbo.table(a, ...)
SELECT 1, ...
UNION ALL SELECT 2, ...
...
UNION ALL SELECT 500, ...
GO
INSERT dbo.table(a, ...)
SELECT 501, ...
UNION ALL SELECT 502, ...
...
UNION ALL SELECT 1000, ...
GO
I've left error handling out for brevity, but the point is that I would never try to send a single batch of 45,000 individual statements to SQL Server.
- Too bad the OP can't use table value constructors, a 2008+ feature. He'd still have to batch the inserts into groups of 1000 rows, which is the maximum you can group together with a TVC. (Example after these comments.) – Nick Chammas, Apr 19, 2012 at 15:06
- That was going to be my first suggestion until I saw the version tag. – Aaron Bertrand, Apr 19, 2012 at 15:42
- @NickChammas - The performance of those degrades non-linearly with the number of VALUES clauses, BTW. I submitted a Connect item with a repro of inserting 1000 rows with 10 VARCHAR(800) columns on 2008, with a compile time of 12.5 minutes on my 2008 dev instance, as it does a lot of unnecessary work comparing values rather than just getting on with inserting them (it performs much quicker when parameterised, with no values to look at). Though much improved in 2012, the non-linear pattern still exists and should be fixed in the version after. – Martin Smith, Apr 21, 2012 at 9:38
I am not sure why you are getting the out of memory error, but there is an easier approach.
If you can export the data from the spreadsheet into a delimited format (e.g. CSV), you can use the data import wizard in SSMS to insert the data for you:
SSMS import data task.
- That is helpful, but I don't have access to the client's databases. I have to prepare patches and data loads as scripts. – spaghetticowboy, Dec 27, 2012 at 21:37
Using SqlBulkCopy, create a temp table and insert the new data into it, then merge the data from the temp table into the existing table. See the example using the C# SqlBulkCopy.WriteToServer(DataTable) method. Hope it helps.
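A minimal sketch of the final merge step in T-SQL, assuming a #Staging temp table and placeholder key/column names (note that MERGE requires SQL Server 2008 or later; on 2005 you would use a separate UPDATE plus INSERT instead):

MERGE dbo.TargetTable AS t
USING #Staging AS s
    ON t.SourceId = s.SourceId
WHEN MATCHED THEN
    UPDATE SET t.NewField = s.NewField
WHEN NOT MATCHED BY TARGET THEN
    INSERT (SourceId, NewField) VALUES (s.SourceId, s.NewField);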
Yes, we could do that. I tried a BCP (Bulk Copy Program) approach in order to avoid an out-of-memory issue.
Note: Tried on SQL Server 2014.
With BCP, we first export the source database data to a bcp file (in a local directory folder) and then import that bcp file into the destination database.
Below are the cakewalk steps:
Note:
a) Make sure an empty table is present in the destination database
b) Make sure a Temp folder is present on the C drive
Create a bat file named Export_Data.bat with the command shown below:
bcp.exe [Source_DataBase_Name].[dbo].[TableName] OUT "C:\Temp\TableName.bcp" -S "Computer Name" -U "SQL Server UserName" -P "SQL Server Password" -n -q
pause
Run that bat file, as a result of that a bcp file will get generated in Temp folder
Then create another bat file named Import_Data.bat with the following command:
bcp.exe [Destination_DataBase_Name].[dbo].[TableName] IN "C:\Temp\TableName.bcp" -S "Computer Name" -U "SQL Server UserName" -P "SQL Server Password" -n -q
pause
And here we go!
- I am getting the error "A valid table name is required for in, out, or format options." when I try to export data. – Sen Jacob, Aug 10, 2019 at 19:19
- Could you paste the command you tried, with all the attribute values? Please follow the below example: bcp.exe ExportDB.dbo.AddressCountry OUT "C:\Temp\AddressCountry.bcp" -S "IN-L20054" -U "sa" -P "sa" -n -q (In that, ExportDB -> source DB, AddressCountry -> table present in the source DB, IN-L20054 -> machine name, "sa" is the username/pwd of the DB.) – Kms, Aug 10, 2019 at 19:49
- I don't have it now. I ended up using the import data feature in SSMS, then connected the target DB (v14.0) to the source DB (v15.0) using an MS OLE DB connection, and it was quite fast to import multi-million rows of data. Thanks! – Sen Jacob, Aug 16, 2019 at 9:56
I had a similar issue with SSMS years ago when I had large bulk INSERTs to populate a temp table from data provided in an Excel file.
I revised my statements to end with a semicolon and the problem went away. Such as (an illustrative example with placeholder names):
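-- Each statement is explicitly terminated with a semicolon.
INSERT #TempTable (SourceId, NewField) SELECT 1, 'A';
INSERT #TempTable (SourceId, NewField) SELECT 2, 'B';
INSERT #TempTable (SourceId, NewField) SELECT 3, 'C';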