
Is it possible to use SqlBulkCopy with SQL Server Compact Edition (i.e. *.sdf files)?

I know it works with SQL Server 2000 and up, but wanted to check CE compatibility.

If it doesn't, does anyone know the fastest way of getting a CSV-type file into SQL Server CE without using DataSets (puke here)?

Jon Seigel
asked Oct 22, 2009 at 11:06

4 Answers


BULKCOPY is not supported in SQL CE. Here is the fastest way if you have a huge number of rows in your table; plain INSERTs are too slow!

using (SqlCeConnection cn = new SqlCeConnection(yourConnectionString))
{
    if (cn.State == ConnectionState.Closed)
        cn.Open();

    using (SqlCeCommand cmd = new SqlCeCommand())
    {
        cmd.Connection = cn;
        cmd.CommandText = "YourTableName";
        cmd.CommandType = CommandType.TableDirect;

        using (SqlCeResultSet rs = cmd.ExecuteResultSet(ResultSetOptions.Updatable | ResultSetOptions.Scrollable))
        {
            SqlCeUpdatableRecord record = rs.CreateRecord();

            using (var sr = new System.IO.StreamReader(yourTextFilePath))
            {
                string line;
                while ((line = sr.ReadLine()) != null)
                {
                    string[] values = line.Split('\t');

                    // One SetValue call per column; a field whose literal
                    // text is "NULL" is stored as a database NULL.
                    for (int index = 0; index < values.Length; index++)
                        record.SetValue(index, values[index] == "NULL" ? null : values[index]);

                    rs.Insert(record);
                }
            }
        }
    }
}

Benchmark: table with 34370 rows

  • with inserts: 38 rows written per second

  • this way: 260 rows written per second
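One detail worth calling out in the loop above: a field whose literal text is "NULL" is stored as a database NULL. That convention can be isolated in a small helper; this is just an illustrative sketch (the name `ParseFields` is made up):

```csharp
using System;
using System.Linq;

static class Csv
{
    // Split a tab-delimited line into per-column values, mapping the
    // literal token "NULL" to a real null (the convention used above).
    public static object[] ParseFields(string line)
    {
        return line.Split('\t')
                   .Select(v => v == "NULL" ? null : (object)v)
                   .ToArray();
    }
}

class Program
{
    static void Main()
    {
        object[] fields = Csv.ParseFields("1\tNULL\thello");
        Console.WriteLine(fields.Length);     // 3
        Console.WriteLine(fields[1] == null); // True
        Console.WriteLine(fields[2]);         // hello
    }
}
```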

answered Oct 23, 2009 at 9:57

6 Comments

This method doesn't support identity and rowversion column types
Wow, I was struggling with Insert and getting 20-30 rows a second; I'm sure this just did all 35,774 rows in less than 1 second (all my data was in an array in memory, so no bottlenecks from the source).
I converted my code using SqlCeCommand and parameters to this and while not a 10x speed up it was twice as fast, thanks.
My previous code was trying to insert using Entity Framework 5 (change tracking and validation turned off, just one SaveChanges call at the end) and SQL CE. After leaving it to run for an hour and a half and coming back to find it still in progress, I killed the app! I rewrote the insert portion using the above, and it now inserts the same 271,703 records in 27 seconds!
Unless I'm not grokking this (definitely possible), this seems to be getting the values from the database (StreamReader reading from "yourConnectionString"), and then inserting them back into the database...???

I have a SqlCeBulkCopy library here: http://sqlcebulkcopy.codeplex.com - it even supports IEnumerable.
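For reference, the library's API is modeled on System.Data.SqlClient.SqlBulkCopy. A minimal usage sketch, assuming that SqlBulkCopy-style surface (the table name is made up, and the namespace may vary between releases, so check the version you download):

```csharp
// Hypothetical usage sketch of the SqlCeBulkCopy library linked above;
// it cannot run without a SQL CE database file to connect to.
using System.Data;
using ErikEJ.SqlCe; // namespace used by the library; verify against your download

class BulkLoader
{
    public static void Load(DataTable table, string connectionString)
    {
        using (var bc = new SqlCeBulkCopy(connectionString))
        {
            bc.DestinationTableName = "YourTableName";
            bc.WriteToServer(table); // IDataReader/IEnumerable overloads also exist
        }
    }
}
```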

answered Apr 20, 2011 at 16:19

7 Comments

Does it support Visual Studio 2008?
Yes - it is a VS 2008 solution.
To use this, do I have to add all five of the .DLLs to my project? Or just some of them? Are they replacements for the "generic" SqlCe .DLLs, or are they additional to those?
They are additional; you do not need to copy the DLL files, they just need to be installed on the PC/device.
Do you mean DLL files? So all five need to be on the path, or added as references to the project?

It is possible to speed this kind of operation up a lot. To make it useful (I mean fast and reasonably safe), you can use a CE DataAdapter.

For example, ignoring keys, the steps listed below can help:

  1. Make sure that the source and target tables have the same field structure;
  2. Clone a virtual DataTable from the source database (your select);
  3. Create a CE command with the table name as CommandText (TableDirect as CommandType);
  4. Create a CE DataAdapter from the CE command;
  5. Create a CE CommandBuilder from the CE DataAdapter;
  6. Pass the insert command from the CE CommandBuilder to the CE DataAdapter;
  7. Copy "n" rows per batch from your source DataTable to the target DataTable (the clone), doing something like this:

    '... previous code
    For Each currentRow In sourceTable.Rows
        'You can RaiseEvent Processing(currentRow, totalRows) here with DoEvents
        If targetTable.Rows.Count < 100 Then
            targetTable.ImportRow(currentRow)
            targetTable.Rows(targetTable.Rows.Count - 1).SetAdded()
        Else
            'Flush this batch: call the CE DataAdapter's Update method,
            'then clone the targetTable again, discarding the flushed rows.
            'Clone again; don't just Clear the Rows collection:
            'with an AutoIncrement column that would break all foreign keys.
            da.Update(targetTable)
            targetTable = sourceTable.Clone()
            targetTable.ImportRow(currentRow)
            targetTable.Rows(targetTable.Rows.Count - 1).SetAdded()
        End If
    Next
    'Flush the final partial batch.
    If targetTable.Rows.Count > 0 Then da.Update(targetTable)
    '... next code

This way you can insert many rows in very little time.

I have some applications using this method; the average rate is about 1,500 rows per second on a table with 5 NTEXT fields (slow) and 800,000 rows.

Of course, it all depends on your table's structure: IMAGE and NTEXT are both slow data types.

P.S.: As I said, this method doesn't pay much attention to keys, so be careful.

marc_s
answered Sep 5, 2010 at 18:53



No, I don't think that SqlBulkCopy is supported (see MSDN). Perhaps send the data in as XML and shred it apart at the server? SQL/XML support is pretty good in 2005/2008.

You might also want to look at table-valued parameters, but I doubt that CE supports those.
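The XML route above, for full SQL Server 2005+ rather than CE, can be sketched roughly as follows. This is an illustrative sketch only; the table and column names (YourTable, A, B) and the two-column row shape are invented for the example:

```csharp
// Illustrative sketch: send all rows as one xml parameter and let the
// server shred it in a single round trip. Cannot run without a database.
using System.Data;
using System.Data.SqlClient;
using System.Linq;
using System.Xml.Linq;

class XmlLoader
{
    public static void Load(string connectionString, string[][] rows)
    {
        // Build one XML document holding every row.
        var xml = new XElement("rows",
            rows.Select(r => new XElement("r",
                new XAttribute("a", r[0]),
                new XAttribute("b", r[1]))));

        // Shred the XML server-side with nodes()/value().
        const string sql =
            "INSERT INTO YourTable (A, B) " +
            "SELECT x.value('@a', 'nvarchar(100)'), x.value('@b', 'nvarchar(100)') " +
            "FROM @data.nodes('/rows/r') AS t(x);";

        using (var cn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, cn))
        {
            cmd.Parameters.Add("@data", SqlDbType.Xml).Value = xml.ToString();
            cn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```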

answered Oct 22, 2009 at 11:09

