I have a table on an SQL Azure server with no indexes except the clustered one. I insert a small amount of data into this table, just about 1000 rows, and it takes more than 4 minutes, although when I do it with SSIS it takes just a second. This is the query:
INSERT INTO DatabaseName.dbo.tmpOurClients
(KodClient,Name ,ShortName,INN,Boss,contact,Email,Adres,Tel,City_ID,DateLop)
SELECT KodClient,Name ,ShortName,INN,Boss,contact,Email,Adres,Tel,City_ID,DateLop
FROM dbo.OurClients
This is how I created the linked server:
EXEC sp_addlinkedserver
@server='Azure',
@srvproduct='',
@provider='sqlncli',
@datasrc='ServerName.DATABASE.WINDOWS.NET',
@location='',
@provstr='',
@catalog='db_name'
EXEC sp_addlinkedsrvlogin
@rmtsrvname='Azure',
@useself='false',
@rmtuser='login@server_name.database.windows.net',
@rmtpassword='password'
EXEC sp_serveroption 'Azure', 'Collation Compatible', true;
I've already checked the topic Linked server to SQL Azure very slow, but none of the suggestions helped me.
Version of the local SQL Server: Microsoft SQL Server 2008 R2 (SP2) - 10.50.4000.0 (X64) Jun 28 2012 08:36:30 Copyright (c) Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 6.1 (Build 7601: Service Pack 1) (Hypervisor)
So, what can be the problem?
1 Answer
I insert a small amount of data into this table, just about 1000 rows, and it takes more than 4 minutes
Remote data modifications through a linked server use the sp_cursor model. The effect is similar to issuing 1000 separate single-row inserts (one for each row). If a round trip takes 250 ms, 1000 such trips will take 4 minutes and 10 seconds. Using a bulk loading method such as bcp or SSIS will generally be more efficient (unless the number of rows to be inserted is tiny).
Another alternative for ad-hoc needs is to build a string containing a single INSERT statement with multiple rows in the VALUES clause. The statement is then used with EXECUTE ... AT, for example:
-- Note: a name local to the Azure database is used, not four-part syntax
DECLARE @sql varchar(max) = 'INSERT dbo.Test VALUES ';

-- Build a list of 1000 numbers to insert (dbo.Numbers is a helper numbers table)
SELECT @sql += '(' + CONVERT(varchar(11), n) + '),'
FROM dbo.Numbers AS N
WHERE n BETWEEN 1 AND 1000;

-- Remove the trailing comma
SET @sql = LEFT(@sql, LEN(@sql) - 1);

-- For debugging
PRINT @sql;

-- Execute the finished statement at the remote server
EXECUTE (@sql) AT Azure;
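For reference, the PRINT output shows the generated string, which looks like INSERT dbo.Test VALUES (1),(2),(3), ... ,(1000), so the whole load is sent to the remote server as a single statement rather than as 1000 separate round trips.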
The constructed insert statement executes at the Azure database, so a local name is used for the table (note: building a dynamic insert statement using a four-part target name buys you nothing).
Note that EXECUTE ... AT requires RPC and RPC OUT to be enabled in the linked server options.
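If those options are not already enabled, they can be set with sp_serveroption against the 'Azure' linked server created in the question, for example:

-- Enable RPC and RPC OUT for the linked server named 'Azure'
EXEC sp_serveroption 'Azure', 'rpc', 'true';
EXEC sp_serveroption 'Azure', 'rpc out', 'true';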
The VALUES clause for a plain INSERT statement is limited to 1000 rows. There are ways around that (a VALUES clause in a CTE does not have that restriction). You could also choose to build 1000-row batches (a sketch follows below) or build single-row insert statements, but if there is that much data on a regular basis, you would probably be better off using one of the bulk loading methods instead.
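For the 1000-row batching idea, a minimal sketch along these lines could work. It reuses the assumed dbo.Numbers helper table and the hypothetical remote dbo.Test target from the example above; for real data the VALUES list would be built from the source table instead.

DECLARE @sql   varchar(max),
        @batch int = 0,
        @rows  int = 1;

WHILE @rows > 0
BEGIN
    SET @sql = 'INSERT dbo.Test VALUES ';

    -- Next batch of up to 1000 numbers from the helper table
    SELECT @sql += '(' + CONVERT(varchar(11), n) + '),'
    FROM dbo.Numbers AS N
    WHERE n >  @batch * 1000
      AND n <= (@batch + 1) * 1000;

    SET @rows = @@ROWCOUNT;

    IF @rows > 0
    BEGIN
        -- Remove the trailing comma and run this batch at the remote server
        SET @sql = LEFT(@sql, LEN(@sql) - 1);
        EXECUTE (@sql) AT Azure;
    END;

    SET @batch += 1;
END;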
Thank you for this solution. I have a process that inserts over 211,000 records into an Azure SQL database. It was taking 2 hours and 15 minutes. Implementing this solution in 1000-record batches with a WHILE loop brought that down to 90 seconds. – alcor8, Dec 4, 2023 at 18:04