I have a requirement to read CSV files from Azure Blob Storage. So far, this throws an access denied error every time I run my query:
CREATE DATABASE SCOPED CREDENTIAL <myScopedCredential>
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=2021-06-08&ss=b&srt=sco&sp=rl&se=2023-03-31T09:38:05Z&st=2022-09-01T02:38:05Z...';
CREATE EXTERNAL DATA SOURCE <myExternalDatasource>
WITH (
TYPE = BLOB_STORAGE
, LOCATION = 'https://<myResource>.blob.core.windows.net/<myContainer>'
, CREDENTIAL = <myScopedCredential>
);
SELECT *
FROM OPENROWSET (
BULK '<folderName>/<fileName>.csv'
, DATA_SOURCE = '<myExternalDatasource>'
, FORMAT ='CSV'
, FORMATFILE='<formatFilesFolderName>/<formatfileName>.fmt'
, FORMATFILE_DATA_SOURCE = '<myExternalDatasource>'
, FIRSTROW = 2
) AS test
Below are some more details on how everything was set up:
- The storage account kind is BlockBlobStorage.
- In the Firewalls and virtual networks settings, access is restricted to Enabled from selected virtual networks and IP addresses. I have already added my public IP address, as well as the IP address of the Azure SQL Server, which I got from here: https://learn.microsoft.com/en-us/azure/azure-sql/database/connectivity-architecture?view=azuresql#gateway-ip-addresses
- The whole process works if I set it to Enabled from all networks. The SQL server and the storage account live within the same resource group.
- I also configured a VNet that is added to both resources.
- I saw this thread, which is very similar to my issue, but the accepted answer does not work on my end: https://stackoverflow.com/questions/58340185/cannot-bulk-load-because-the-file-file-csv-could-not-be-opened-operating-syst
I have checked all the documentation regarding SAS access keys, database scoped credentials, external data sources, and VNet networking, and I don't see any limitation that would cause SAS key access to be denied. Did I miss a configuration step? I find it a little odd that in most cases the recommendation is to set the storage account to Enabled from all networks, which could be a security issue.
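One way to narrow down whether the failure is the firewall or the credential binding is to confirm that the scoped credential and the external data source are registered as expected. This is a diagnostic sketch using the system catalog views (the object names are the placeholders from the question):

```sql
-- Confirm the database scoped credential exists and its identity is
-- 'SHARED ACCESS SIGNATURE' (the SECRET itself is never exposed here)
SELECT name, credential_identity
FROM sys.database_scoped_credentials;

-- Confirm the external data source points at the intended container
-- and is bound to a credential (non-NULL credential_id)
SELECT name, location, type_desc, credential_id
FROM sys.external_data_sources;
```

If both rows look correct and the query still fails only when the firewall is restricted, the problem is almost certainly on the networking side rather than in the T-SQL.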
- What is the trailing "..." in your SAS string? – Dai, Oct 3, 2022 at 9:29
- Also, are you able to connect to the Storage Account using Azure Storage Explorer, and are you able to run the same T-SQL batch from an on-prem SQL Server instance? (i.e. have you confirmed the issue only affects your Azure SQL instance?) – Dai, Oct 3, 2022 at 9:30
- @Dai I cut off the whole SAS string just to show the starting part. I was able to use it to connect in Azure Storage Explorer and verified that the SAS key works. – Dustine Tolete, Oct 3, 2022 at 22:14
- To add more details, this command and approach works if the storage account is set to allow all public access, which I don't think we will be implementing. – Dustine Tolete, Oct 3, 2022 at 22:15
1 Answer
Please try enabling a managed identity on the Azure SQL server, then configure the storage account firewall to allow trusted Microsoft services to connect. Grant the SQL server's managed identity the Storage Blob Data Contributor role on the storage account, and use the managed identity as your credential instead of the SAS token. This article outlines the method, and I believe it will also work for OPENROWSET.
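As a sketch of the managed-identity approach, the question's SAS-based credential and data source would be replaced as below. The object names [MsiCredential] and [MsiDataSource] are placeholders I chose, and this assumes the server's managed identity has already been created and granted Storage Blob Data Contributor on the storage account:

```sql
-- A database master key must exist before a scoped credential can be created:
-- CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strongPassword>';

-- Authenticate as the SQL server's managed identity instead of a SAS token
CREATE DATABASE SCOPED CREDENTIAL [MsiCredential]
WITH IDENTITY = 'Managed Identity';

CREATE EXTERNAL DATA SOURCE [MsiDataSource]
WITH (
      TYPE = BLOB_STORAGE
    , LOCATION = 'https://<myResource>.blob.core.windows.net/<myContainer>'
    , CREDENTIAL = [MsiCredential]
);

SELECT *
FROM OPENROWSET (
      BULK '<folderName>/<fileName>.csv'
    , DATA_SOURCE = 'MsiDataSource'
    , FORMAT = 'CSV'
    , FIRSTROW = 2
) AS test;
```

On the storage side you can then keep the firewall restricted to selected networks, provided the exception "Allow Azure services on the trusted services list to access this storage account" is ticked under the account's networking settings.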