Showing posts with label Bulk Copy. Show all posts
Saturday, January 07, 2017
BULK INSERT and csv format, new in vNext 1.1
In SQL Server vNext 1.1 we now have the ability to import a csv file via the BULK INSERT command without having to specify the field or row terminator.
You still need to specify the format; if you run only something like the following
BULK INSERT AlexaSongs FROM 'c:\Songs played with Alexa.csv'
You will be greeted with these errors
Msg 4832, Level 16, State 1, Line 10
Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 10
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 10
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
So let's take a look at how this all works
First create the following table
USE tempdb
GO

CREATE TABLE AlexaSongs(
    PlayDate varchar(100),
    SongName varchar(200),
    Artist varchar(200),
    Album varchar(200))
GO
Now grab the csv file from here: Songs played with Alexa.csv. Either download the whole project and grab the file, or open it in raw mode, copy and paste the contents into a file, and save it as Songs played with Alexa.csv
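If you don't feel like downloading the file, you can generate a small stand-in with the same four-column shape as the AlexaSongs table. This is just a sketch; the rows below are made up, not the real contents of Songs played with Alexa.csv

```python
import csv

# Hypothetical sample rows matching the AlexaSongs table:
# PlayDate, SongName, Artist, Album
rows = [
    ["2017-01-06", "Paradise City", "Guns N' Roses", "Appetite for Destruction"],
    ["2017-01-06", "Hells Bells", "AC/DC", "Back in Black"],
    ["2017-01-07", "Enter Sandman", "Metallica", "Metallica"],
]

# Comma-separated fields, newline row terminators -- the same
# terminators the BULK INSERT examples below rely on
with open("Songs played with Alexa.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)
```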
Now that you have the file and table ready, let's first take a look at how this was done before vNext 1.1
Here is what it looked like
BULK INSERT AlexaSongs FROM 'c:\Songs played with Alexa.csv' WITH ( FIELDTERMINATOR =',', ROWTERMINATOR = '\n' );
As you can see, we specified a comma as the field terminator and a newline as the row terminator
You could also get it to work by just specifying the field terminator in this case
BULK INSERT AlexaSongs FROM 'c:\Songs played with Alexa.csv' WITH ( FIELDTERMINATOR =',' );
So what does the new syntax look like?
Here is the code that accomplished the same as above but by using the new WITH FORMAT = CSV option
BULK INSERT AlexaSongs FROM 'c:\Songs played with Alexa.csv' WITH (FORMAT = 'CSV');
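Whichever syntax you use, a ragged row in the file is a common cause of the "unexpected end of file" error shown earlier, so it can help to sanity-check the file from the client side first. Here is a minimal sketch in Python; the file path and the expected column count of 4 are assumptions based on the AlexaSongs table above

```python
import csv

EXPECTED_COLUMNS = 4  # PlayDate, SongName, Artist, Album

def find_ragged_rows(path, expected=EXPECTED_COLUMNS):
    """Return (line_number, field_count) for every row that does
    not have the expected number of fields."""
    bad = []
    with open(path, newline="") as f:
        for lineno, row in enumerate(csv.reader(f), start=1):
            if len(row) != expected:
                bad.append((lineno, len(row)))
    return bad

# An empty list means every row has exactly 4 fields:
# print(find_ragged_rows(r"c:\Songs played with Alexa.csv"))
```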
I guess you could say it is a little cleaner, but really all this is is syntactic sugar
For Azure, it looks like this; I grabbed this straight from the Books Online page here
First you need to create a data source
CREATE EXTERNAL DATA SOURCE MyAzureInvoices WITH ( TYPE = BLOB_STORAGE, LOCATION = 'https://newinvoices.blob.core.windows.net', CREDENTIAL = UploadInvoices );
And then you use that data source
BULK INSERT Colors2 FROM 'week3/inv-2017年01月19日.csv' WITH (DATA_SOURCE = 'MyAzureInvoices', FORMAT = 'CSV');
For more examples, including accessing data in a CSV file referencing a container in an Azure blob storage location, go here: https://msdn.microsoft.com/en-us/library/mt805207.aspx
That's all for today
Saturday, October 10, 2015
Your first step in dealing with SQLState = S1000 I/O error while writing a BCP data-file
The other day one job started to fail every time it ran. The error was
Server: FancyServerName
Job: SomeJobName
Step: Step 3) SomeStepName
Message:Executed as user: SomeDomain\SomeServiceAccount. Starting copy... SQLState = S1000, NativeError = 0 Error = [Microsoft][ODBC Driver 11 for SQL Server]I/O error while writing BCP data-file 19 rows copied. Network packet size (bytes): 4096 Clock Time (ms.) Total : 1 Average : (19000.00 rows per sec.). Process Exit Code 1. The step failed.
Time: 2015年10月06日 08:16:20
This was very interesting since the file we were writing had only 19 rows and this was very small, less than a kilobyte. So how do you investigate what the error is? It could be all kinds of things:
Permissions on the drive
Problems with the drive
Memory issues
Issue with a format file
If you do a search you will get all kinds of things back. However, the first thing you need to run is the following
EXEC xp_fixeddrives
Here is what I saw after running xp_fixeddrives
drive   MB free
C        235644
D       1479234
E         10123
F         10123
G         10123
L         10123
S        117706
T         10123
Z             0
Take a look at the Z drive. Do you see that? The Z drive is full; there is no space left. It would be nice if the error message indicated that it couldn't write the file because there was no space left. In a production environment this should not happen, because hopefully you have something in place that sends out an alert when a drive is 90% full.
So, if you ever get this error, the first thing you have to do is check if you have space left on the drive.
I also want to point out that the xp_fixeddrives stored procedure is undocumented so there is no guarantee it will be available in a future version of SQL Server.
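Since xp_fixeddrives is undocumented, you can also check free space from outside SQL Server. Here is a minimal cross-platform sketch using only Python's standard library; the 90% threshold matches the alerting idea above, and the Windows drive-letter scan in the comment is an assumption about your environment

```python
import shutil

ALERT_THRESHOLD = 0.90  # alert when a drive is 90% full

def check_drives(paths):
    """Report free MB per path and flag any that cross the threshold."""
    report = []
    for path in paths:
        usage = shutil.disk_usage(path)
        free_mb = usage.free // (1024 * 1024)
        nearly_full = (usage.used / usage.total) >= ALERT_THRESHOLD
        report.append((path, free_mb, nearly_full))
    return report

# On Windows you might scan fixed drive letters, e.g. (import os, string):
# drives = [f"{c}:\\" for c in string.ascii_uppercase if os.path.exists(f"{c}:\\")]
# for path, free_mb, alert in check_drives(drives):
#     print(path, free_mb, "ALERT: nearly full!" if alert else "ok")
```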