16

I need to move a whole bunch (100+) of large (millions of rows) tables from one SQL2008 database to another.

I originally just used the Import/Export Wizard, but all the destination tables were missing primary and foreign keys, indexes, constraints, triggers, etc. (Identity columns were also converted to plain INTs, but I think I just missed a checkbox in the wizard.)

What's the right way to do this?

If this were just a couple of tables, I would go back to the source, script out the table definition (with all indexes, etc), then run the index creation portions of the script on the destination. But with so many tables, this seems impractical.

If there wasn't quite so much data, I could use the "Create Scripts..." wizard to script out the source, including data, but a 72-million-row script just doesn't seem like a good idea!

Yasir Arsanukayev
asked Feb 3, 2011 at 0:29
2
  • And it's not all the tables in the database? Commented Feb 3, 2011 at 0:35
  • @thursdaysgeek: It's nearly all the tables, but the destination database already has 100+ tables of its own, so just restoring the backup under a different name isn't an option. Think of this as basically "merge these two large databases together". Commented Feb 3, 2011 at 0:44

7 Answers

14

Scripting out the tables, then using SSIS to transfer the data would be the most reliable and effective way of moving the data to the new database.

answered Feb 3, 2011 at 2:08
9

We actually did it using a lot of manual scripting in conjunction with the Import wizard, but this morning I found a better answer, courtesy of Tibor Karaszi's blog article.

Part of our frustration here was that the SQL 2000 "DTS Import/Export Wizard" actually makes this almost trivially easy by selecting "Copy Objects and Data":

DTS Import Wizard

This third option is the one that contains the ability to include indexes/triggers, etc:

Advanced Options

This option was REMOVED from the SQL 2005/2008 Import Wizard. Why? No idea:

2008 Import Wizard

In 2005/2008, you apparently have to manually create an SSIS package in BIDS and use the Transfer SQL Server Objects Task, which contains all the same options that were in the 2000 wizard:

SSIS Transfer SQL Server Objects Task

answered Feb 3, 2011 at 18:44
1
  • Just wanted to post that I used this SSIS method for another similar task, and it worked great! Commented Feb 7, 2011 at 18:15
8

I'd consider scripting the tables out, or using a comparison tool (e.g. Red Gate's) to generate the tables in the target database, without indexes or constraints yet.

Then I'd consider restoring the database with a different name on the same server and doing

 INSERT INTO newdb.dbo.newtable SELECT * FROM olddb.dbo.oldtable;

... for each table, with SET IDENTITY_INSERT ON if required.
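
For example, a minimal sketch for one table with an identity column (the table and column names here are placeholders; IDENTITY_INSERT requires an explicit column list):

 SET IDENTITY_INSERT newdb.dbo.newtable ON;

 -- Explicit column list is mandatory when inserting into an identity column
 INSERT INTO newdb.dbo.newtable (id, col1, col2)
 SELECT id, col1, col2
 FROM olddb.dbo.oldtable;

 SET IDENTITY_INSERT newdb.dbo.newtable OFF;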

Then I'd add indexes and constraints after loading data.

It depends on your comfort level with SSIS (mrdenny's answer) or if you prefer raw SQL.

answered Feb 3, 2011 at 7:36
6

I'd add to Mr Denny's answer: script out the table schemas, then use BCP to move the data. If you're not familiar with SSIS, then using BCP and batches should be easy to do. For millions of rows, nothing beats BCP (bulk insert) :).
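
As a rough sketch, the round trip for one table might look like the pair of commands below (the server names, database names, and paths are placeholders; -n uses native format, -T a trusted connection, and -E preserves identity values on the way back in):

 bcp SourceDB.dbo.BigTable out D:\BCP\BigTable.dat -S SourceServer -T -n
 bcp DestDB.dbo.BigTable in D:\BCP\BigTable.dat -S DestServer -T -E -n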

answered Feb 3, 2011 at 8:30
4

I'm the one who is completely uncomfortable with SSIS.

When the source tables have no identity columns:

  1. create an empty database at the target server
  2. create a linked server to the source server on the target server (a sketch follows this list)
  3. run the script below on the source database to generate select * into ... statements
  4. run the generated script from the target database
  5. script out primary keys, indexes, triggers, functions and procedures from the source database
  6. create these objects by running the generated script
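
For step 2, a minimal linked-server sketch (the server and instance names are placeholders; run it on the target server):

 EXEC sp_addlinkedserver
 @server = N'linked_server', -- name used in the generated four-part queries
 @srvproduct = N'',
 @provider = N'SQLNCLI10', -- SQL Server Native Client 10.0 (SQL 2008)
 @datasrc = N'SourceServer\Instance';

 -- Map the current login to itself on the remote server
 EXEC sp_addlinkedsrvlogin
 @rmtsrvname = N'linked_server',
 @useself = N'TRUE';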

Now, the T-SQL to generate the select * into ... statements:

SET NOCOUNT ON
declare @name sysname
declare @sql varchar(500) -- varchar(255) can silently truncate: two 128-char names plus the statement text exceed 255
declare db_cursor cursor for
 select name from sys.tables order by 1 -- all user tables in the source database (assumes they live in dbo)
open db_cursor
fetch next from db_cursor into @name
while @@FETCH_STATUS = 0
begin
 -- build one SELECT ... INTO statement per table
 set @sql = 'select * into [' + @name + '] from [linked_server].[source_db].[dbo].[' + @name + '];'
 print @sql
 fetch next from db_cursor into @name
end
close db_cursor
deallocate db_cursor

This generates a line like the following for each table to copy:

select * into [Table1] from [linked_server].[source_db].[dbo].[Table1];

In case the tables contain identity columns, I script the tables including the identity property and the primary keys.

I do not use insert into ... select ... over a linked server in this case, as it is not a bulk technique. I'm working on some PowerShell scripts similar to this SO question, but I'm still working on error handling. Really big tables can cause out-of-memory errors, as the whole table is loaded into memory before it is sent via SqlBulkCopy to the database.

Recreation of indexes etc. is similar to the case above. This time I can skip the recreation of the primary keys.

answered Feb 13, 2011 at 15:48
4
  • In case the tables contain identity columns, you can do as in this question. It will save you some manual work. I still prefer bulk-insert batches/SSIS; a linked server may not be a good solution over a wide network. Commented Feb 14, 2011 at 15:43
  • 1
    @Marian Please take a look at dba.stackexchange.com/questions/297/… if you want to promote SSIS. I didn't try SSIS, but the Import/Export Wizard failed too (besides the linked server). Commented Feb 14, 2011 at 16:11
  • I would've helped with pleasure, but I don't have any Oracle box available to me. Anyway, from what I've managed to read, there are no providers that will support Oracle CLOBs. Commented Feb 14, 2011 at 20:14
  • I am with you on this - I do migrate data sometimes, but never use SSIS. Commented Feb 26, 2012 at 22:54
2

You can use comparison tools that compare database schemas and data: first, synchronize a blank database schema with the original database to create all the tables.

Then, synchronize the data from the original database with the new one (all the tables are there, but they are all empty) to insert the records into the tables.

I use ApexSQL Diff and ApexSQL Data Diff for this, but there are other similar tools.

The good thing about this process is that you don't have to actually synchronize the databases using the tool, as this can be quite painful for millions of rows.

You can just create an INSERT INTO SQL script (don't be surprised if it's several gigs) and execute it.

Since such large scripts cannot even be opened in SQL Server Management Studio, I use sqlcmd or osql to execute them.
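
For example, a minimal sqlcmd invocation might look like this (the server name and file names are placeholders; -E uses Windows authentication, -i reads the script, -o writes a log):

 sqlcmd -S ServerName -d DestinationDB -E -i data_sync.sql -o data_sync.log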

answered Apr 26, 2013 at 19:19
1

As @mrdenny mentioned:

  1. Script out the tables first with all indexes, FKs, etc. and create blank tables in the destination database.

Instead of using SSIS, use BCP to insert data

  2. BCP out the data using the script below. Set SSMS to text mode and copy the output the script generates into a bat file.

    -- Save the output below into a bat file by executing this in SSMS in text mode.
    -- Clean-up: create a bat file with this command --> del D:\BCP\*.dat 
    select '"C:\Program Files\Microsoft SQL Server\100\Tools\Binn\bcp.exe" ' /* path to BCP.exe */
     + QUOTENAME(DB_NAME())+ '.' /* Current Database */
     + QUOTENAME(SCHEMA_NAME(SCHEMA_ID))+'.' 
     + QUOTENAME(name) 
     + ' out D:\BCP\' /* Path where BCP out files will be stored */
     + REPLACE(SCHEMA_NAME(schema_id),' ','') + '_' 
     + REPLACE(name,' ','') 
     + '.dat -T -E -SServerName\Instance -n' /* ServerName, -E will take care of Identity, -n is for Native Format */
    from sys.tables
    where is_ms_shipped = 0 and name <> 'sysdiagrams' /* sysdiagrams is classified by MS as a user table and we don't want it */
    /*and schema_name(schema_id) <> 'unwantedschema' */ /* Optional to exclude any schema */
    order by schema_name(schema_id)
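    -- For illustration, each generated line looks something like this
    -- (the server name and paths are placeholders):
    -- "C:\Program Files\Microsoft SQL Server\100\Tools\Binn\bcp.exe" [MyDb].[dbo].[MyTable] out D:\BCP\dbo_MyTable.dat -T -E -SServerName\Instance -n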
    
  3. Run the bat file; it will generate the .dat files in the folder that you specified.

  4. Run the script below on the destination database.

    --- Execute this on the destination server/database from SSMS.
    --- Make sure to change @Destdbname and the bcp out path to match your environment.
    declare @Destdbname sysname
    set @Destdbname = 'destinationDB' /* Destination Database Name where you want to Bulk Insert in */
    select 'BULK INSERT ' 
    /* Remember: tables must already exist in the destination database */ 
    + QUOTENAME(@Destdbname) + '.' 
    + QUOTENAME(SCHEMA_NAME(SCHEMA_ID)) 
    + '.' + QUOTENAME(name) 
    + ' from ''D:\BCP\' /* Change here for bcp out path */ 
    + REPLACE(SCHEMA_NAME(schema_id), ' ', '') + '_' + REPLACE(name, ' ', '') 
    + '.dat'' with ( KEEPIDENTITY, DATAFILETYPE = ''native'', TABLOCK )' 
    + char(10) 
    + 'print ''Bulk insert for ' + REPLACE(SCHEMA_NAME(schema_id), ' ', '') + '_' + REPLACE(name, ' ', '') + ' is done... ''' 
    + char(10) + 'go'
     from sys.tables
     where is_ms_shipped = 0
    and name <> 'sysdiagrams' /* sysdiagrams is classified by MS as a user table and we don't want it */
    and schema_name(schema_id) <> 'unwantedschema' /* Optional to exclude any schema */
     order by schema_name(schema_id) 
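    -- For illustration, each generated statement looks something like this
    -- (the database, schema, and table names are placeholders):
    -- BULK INSERT [destinationDB].[dbo].[MyTable] from 'D:\BCP\dbo_MyTable.dat' with ( KEEPIDENTITY, DATAFILETYPE = 'native', TABLOCK )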
    
  5. Run the output using SSMS to insert the data back into the tables.

This is a very fast BCP method, as it uses native mode.

answered Apr 26, 2013 at 19:45
