
I'm building a program which generates a T-SQL query in the following form:

DECLARE @In TABLE (Col CHAR(20))
INSERT INTO @In VALUES value1, value2... value1000
GO
INSERT INTO @In VALUES value1001, value1002...

but the second INSERT statement throws an error:

Msg 1087, Level 15, State 2, Line 1
Must declare the table variable "@In".

What am I doing wrong?

asked Mar 17, 2016 at 15:46
  • you are adding the GO batch separator, so the table variable doesn't exist on the second INSERT Commented Mar 17, 2016 at 15:47
  • How can I insert more than 1000 values without using the GO separator? Commented Mar 17, 2016 at 15:50
  • either use the way the answer tells you, or use INSERT INTO @In SELECT columns FROM SomeTableWith1000Values Commented Mar 17, 2016 at 15:51
  • @WaldemarGałęzinowski Why do you think you need a GO batch separator? Commented Mar 18, 2016 at 15:38

2 Answers


You can use VALUES (...), (...):

INSERT INTO table (colA, colB, ...) VALUES
 (val1A, val1B, ...)
 , ...
 , (valNA, valNB, ...)

With @In:

DECLARE @In TABLE (Col CHAR(20))
INSERT INTO @In VALUES 
 ('value1')
 , ('value2')
 , ...
 , ('value1000')

This inserts all the rows in a single statement, so GO is not needed. Variables declared before GO no longer exist after it, because GO ends the batch and table variables are scoped to the batch that declares them.
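Since the query is generated from app code, the batching can also be done there. A minimal sketch in Python (the function name and the @In table are illustrative; values are assumed to be plain strings that still need quoting):

```python
def build_inserts(values, batch_size=1000):
    """Split values into INSERT statements of at most batch_size rows each.

    SQL Server allows at most 1000 row value expressions per VALUES list,
    but any number of INSERT statements may target the same table variable
    as long as they stay in one batch (no GO between them).
    """
    statements = []
    for i in range(0, len(values), batch_size):
        chunk = values[i:i + batch_size]
        # Escape embedded single quotes and wrap each value as a row constructor.
        rows = ", ".join("('{}')".format(v.replace("'", "''")) for v in chunk)
        statements.append("INSERT INTO @In (Col) VALUES {}".format(rows))
    return statements

# One batch: a single DECLARE followed by as many INSERTs as needed, no GO.
script = "DECLARE @In TABLE (Col CHAR(20));\n" + ";\n".join(
    build_inserts(["value{}".format(n) for n in range(1, 2501)])
) + ";"
```

Because everything stays in one batch, the table variable remains in scope for every INSERT.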

answered Mar 17, 2016 at 15:49

Put simply, the GO batch separator should be removed (as stated in @Julien's answer).

Just to prove that it does work, try the following:

DECLARE @ValuesPerInsert INT = 1000; -- 1000 works, 1001 fails
DECLARE @SQL NVARCHAR(MAX) = '
DECLARE @In TABLE (Col CHAR(20))';
;WITH cte AS
(
 SELECT TOP (3523)
 ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS [Num],
 (ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) % @ValuesPerInsert) AS [Position]
 FROM [master].[sys].[all_columns]
) -- select * from cte
SELECT @SQL += CASE cte.[Position]
 WHEN 1 THEN N';' + NCHAR(0x0D) + NCHAR(0x0A)
 + N'INSERT INTO @In (Col) VALUES (''Val-'
 + CONVERT(NVARCHAR(10), cte.[Num]) + N''')'
 ELSE ', (''Val-' + CONVERT(NVARCHAR(10), cte.[Num]) + N''')'
 END
FROM cte;
SET @SQL += ';
';
SELECT CONVERT(XML, N'<sql>' + @SQL + N'</sql>') AS [@SQL];
EXEC (@SQL);
GO

That produces the following SQL:

DECLARE @In TABLE (Col CHAR(20));
INSERT INTO @In (Col) VALUES ('Val-1'), ('Val-2'), ('Val-3'), ('Val-4'), ..., ('Val-1000');
INSERT INTO @In (Col) VALUES ('Val-1001'), ('Val-1002'), ('Val-1003'), ..., ('Val-2000');
INSERT INTO @In (Col) VALUES ('Val-2001'), ('Val-2002'), ('Val-2003'), ..., ('Val-3000');
INSERT INTO @In (Col) VALUES ('Val-3001'), ('Val-3002'), ('Val-3003'), ..., ('Val-3523');

And you will see in the "Messages" tab:

(1000 row(s) affected)
(1000 row(s) affected)
(1000 row(s) affected)
(523 row(s) affected)

Of course, there is a 1000-row maximum for a single VALUES list. If you try to do 1001 rows per INSERT, you will get the following error:

Msg 10738, Level 15, State 1, Line 6
The number of row value expressions in the INSERT statement exceeds the maximum allowed number of 1000 row values.

That being said, if you want to insert more than 1000 rows per INSERT statement, you can use the INSERT INTO ... SELECT construct and combine each row with a UNION ALL:

DECLARE @SQL NVARCHAR(MAX) = '
DECLARE @In TABLE (Col CHAR(20));
';
;WITH cte AS
(
 SELECT TOP (3523)
 ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS [Num]
 FROM [master].[sys].[all_columns]
) -- select * from cte
SELECT @SQL += CASE cte.[Num]
 WHEN 1 THEN N'INSERT INTO @In (Col)' + NCHAR(0x0D) + NCHAR(0x0A)
 + N' SELECT ''Val-1'''
 ELSE NCHAR(0x0D) + NCHAR(0x0A) + N' UNION ALL'
 + NCHAR(0x0D) + NCHAR(0x0A) + N' SELECT ''Val-'
 + CONVERT(NVARCHAR(10), cte.[Num]) + N''''
 END
FROM cte;
SET @SQL += ';
';
SELECT CONVERT(XML, N'<sql>' + @SQL + N'</sql>') AS [@SQL];
EXEC (@SQL);
GO

That produces the following SQL:

DECLARE @In TABLE (Col CHAR(20));
INSERT INTO @In (Col)
 SELECT 'Val-1'
 UNION ALL
 SELECT 'Val-2'
 UNION ALL
 SELECT 'Val-3'
 UNION ALL
 SELECT 'Val-4'
 UNION ALL
 SELECT 'Val-5'
 ...
 UNION ALL
 SELECT 'Val-3522'
 UNION ALL
 SELECT 'Val-3523';

And you will see in the "Messages" tab:

(3523 row(s) affected)
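The same UNION ALL construction can be generated from app code instead of dynamic SQL. A hedged Python sketch (the function name is illustrative, and the target is assumed to be the @In table variable from above):

```python
def build_union_all_insert(values):
    """Build one INSERT ... SELECT ... UNION ALL statement.

    Unlike a VALUES list, this form is not capped at 1000 rows,
    though very large statements get slow to parse, and lock
    escalation kicks in at roughly 5000 locks.
    """
    selects = "\n UNION ALL\n".join(
        # Escape embedded single quotes in each value.
        " SELECT '{}'".format(v.replace("'", "''")) for v in values
    )
    return "INSERT INTO @In (Col)\n" + selects + ";"
```

For 3523 values this yields a single statement with 3523 SELECTs joined by 3522 UNION ALLs, matching the generated SQL shown above.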

But I wouldn't go much above 4500 rows in one statement: lock escalation, which locks the entire table, kicks in at roughly 5000 locks (unless the table is partitioned and the option to escalate to the partition level instead of the table is enabled).

And that being said, since you are generating this from app code, depending on why you are generating INSERT statements in the first place, you might be able to change your approach and use a Table-Valued Parameter (TVP) instead. Using a TVP would allow you to stream the data from the app directly into a query or stored procedure as a table variable, in which case you would simply do:

INSERT INTO SchemaName.RealTable (Col)
 SELECT tmp.Col
 FROM @TVPvariable tmp;

But if you need a portable deployment script, then that really isn't an option.

answered Mar 18, 2016 at 15:57
