
I have a .bak file of a SQL Server database on my machine. I need to restore it on a remote SQL Server. Because of policy rules, there's no way I can copy the file directly to the machine, and I can't place it in a shared folder. I've tried to generate a .bacpac file, but it gives me an error due to synonyms.

Any ideas?

asked Apr 19, 2024 at 7:20
  • How big is the database on disk, and how many objects exist inside it? Commented Apr 19, 2024 at 12:36
  • Can't tell if you're kidding, but I've worked with databases that were 10,000x as big, and single tables 5,000x as big. 150 MB is pretty tiny lol. How many tables, views, procedures, functions, etc., exist in it, roughly? Commented Apr 19, 2024 at 21:12

2 Answers


So you have sysadmin access to the target server but no OS access?

If so, you can:

  • create a table with a varbinary(max) column
  • populate this column with your backup file from the client
  • extract the contents of the varbinary(max) to a file on the server
  • verify your backup via RESTORE VERIFYONLY
  • restore your database
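
A minimal T-SQL sketch of the first and last steps (the table name, paths, and logical file names below are hypothetical and must be adjusted to the target server's layout):

```sql
-- Buffer table to receive the backup bytes from the client.
CREATE TABLE dbo.BackupBuffer (
    Id       int IDENTITY(1,1) PRIMARY KEY,
    FileName nvarchar(260) NOT NULL,
    Content  varbinary(max) NOT NULL
);

-- After the file has been written back to a disk the server can see:
RESTORE VERIFYONLY FROM DISK = N'D:\Temp\MyDb.bak';

RESTORE DATABASE MyDb
FROM DISK = N'D:\Temp\MyDb.bak'
WITH MOVE 'MyDb'     TO N'D:\Data\MyDb.mdf',
     MOVE 'MyDb_log' TO N'D:\Log\MyDb_log.ldf';
```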

If your backup is big (let's say 150 GB rather than 150 MB), you can split it into pieces with an archiver (for example, 7za: https://www.7-zip.org/) or back up your database to multiple files (up to 64).
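
The multi-file variant is native to SQL Server: a striped backup is written across all listed files, and the restore must name every file in the set. A sketch (database name and paths are placeholders):

```sql
-- Striped backup: the backup set is spread across all listed files.
BACKUP DATABASE MyDb
TO DISK = N'C:\Backup\MyDb_1.bak',
   DISK = N'C:\Backup\MyDb_2.bak',
   DISK = N'C:\Backup\MyDb_3.bak'
WITH COMPRESSION, STATS = 10;

-- The restore must reference every file in the striped set:
RESTORE DATABASE MyDb
FROM DISK = N'D:\Temp\MyDb_1.bak',
     DISK = N'D:\Temp\MyDb_2.bak',
     DISK = N'D:\Temp\MyDb_3.bak';
```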

In the first case you should:

  • upload a command-line archiver (it's small)
  • enable xp_cmdshell
  • upload every piece of the backup
  • extract the full backup from the pieces
  • verify it via RESTORE VERIFYONLY
  • perform the restore
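
The server-side part of those steps might look like this (paths and file names are placeholders; xp_cmdshell requires sysadmin and should be disabled again afterwards if policy demands it):

```sql
-- Enable xp_cmdshell (sysadmin required).
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1; RECONFIGURE;

-- Reassemble the split archive and extract the original .bak
-- (assumes 7za.exe and the .7z.001, .7z.002, ... pieces were
-- written to D:\Temp via the blob-table technique):
EXEC xp_cmdshell 'D:\Temp\7za.exe x D:\Temp\MyDb.7z.001 -oD:\Temp -y';
```

Then verify and restore as usual.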

Note - you don't have to load everything at once; you can do it piece by piece, truncating the buffer table after every successful transfer.

Here is the link I used to populate the table: https://www.sqlservercentral.com/blogs/t-sql-tuesday-006-blobs-filestream-and-powershell

$server = "Z002\sql2k8"
$database = "AdventureWorks2008"
$query = "INSERT Production.ProductPhoto2 VALUES (@ThumbNailPhoto, @ThumbnailPhotoFileName)"
$filePath = "C:\Users\u00\hotrodbike_black_small.gif"
$ThumbnailPhotoFileName = Get-ChildItem $filePath | Select-Object -ExpandProperty Name
 
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = "Server={0};Database={1};Integrated Security=True" -f $server, $database
$command = New-Object System.Data.SqlClient.SqlCommand($query, $connection)
$command.CommandTimeout = 120
$connection.Open()
 
# Read the whole file into a byte array.
$fs = New-Object System.IO.FileStream($filePath, [System.IO.FileMode]'Open', [System.IO.FileAccess]'Read')
$buffer = New-Object byte[] -ArgumentList $fs.Length
$null = $fs.Read($buffer, 0, $buffer.Length)
$fs.Close()
 
# Bind the byte array to the varbinary parameter and insert.
$null = $command.Parameters.Add("@ThumbNailPhoto", [System.Data.SqlDbType]"VarBinary", $buffer.Length)
$command.Parameters["@ThumbNailPhoto"].Value = $buffer
$null = $command.Parameters.Add("@ThumbnailPhotoFileName", [System.Data.SqlDbType]"NChar", 50)
$command.Parameters["@ThumbnailPhotoFileName"].Value = $ThumbnailPhotoFileName
$null = $command.ExecuteNonQuery()
 
$connection.Close()
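
That photo example can be adapted to upload a .bak in chunks instead of one giant INSERT. This is only a sketch: the dbo.BackupBuffer table (one row per chunk), the server name, and the chunk size are assumptions, not part of the linked post.

```powershell
# Sketch: stream a local .bak into a buffer table in fixed-size chunks,
# so no single INSERT holds the whole file in memory.
# Server, database, table, and chunk size are placeholders.
$server    = "RemoteServer\Instance"
$database  = "StagingDb"
$filePath  = "C:\Backups\MyDb.bak"
$chunkSize = 10MB

$conn = New-Object System.Data.SqlClient.SqlConnection("Server=$server;Database=$database;Integrated Security=True")
$conn.Open()

$cmd = $conn.CreateCommand()
$cmd.CommandText = "INSERT dbo.BackupBuffer (FileName, Content) VALUES (@Name, @Data)"
$null = $cmd.Parameters.Add("@Name", [System.Data.SqlDbType]::NVarChar, 260)
$null = $cmd.Parameters.Add("@Data", [System.Data.SqlDbType]::VarBinary, -1)  # -1 = varbinary(max)
$cmd.Parameters["@Name"].Value = Split-Path $filePath -Leaf

$fs = [System.IO.File]::OpenRead($filePath)
$buffer = New-Object byte[] $chunkSize
while (($read = $fs.Read($buffer, 0, $buffer.Length)) -gt 0) {
    # The last read is usually shorter than the buffer; send only what was read.
    $slice = New-Object byte[] $read
    [System.Array]::Copy($buffer, $slice, $read)
    $cmd.Parameters["@Data"].Value = $slice
    $null = $cmd.ExecuteNonQuery()
}
$fs.Close()
$conn.Close()
```

On the server side, the rows can then be written out in identity order and appended to a single file to reconstruct the backup.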

And here is the link I used to extract the data from the table back to a file on the server: https://sqlrambling.net/2020/04/04/saving-and-extracting-blob-data-basic-examples/

## https://social.technet.microsoft.com/wiki/contents/articles/890.export-sql-server-blob-data-with-powershell.aspx
## Export of "larger" Sql Server Blob to file
## with GetBytes-Stream.
# Configuration data
$Server = ".\<Instance>"; # SQL Server Instance.
$Database = "Blob_Test";
$Dest = "C:\BLOBTest\BLOBOut\"; # Path to export to.
$bufferSize = 8192; # Stream buffer size in bytes.
# Select-Statement for name & blob
# with filter.
$Sql = "SELECT [PictureName]
 ,[PictureData]
 FROM dbo.PicturesTest"; 
 
# Open ADO.NET Connection
$con = New-Object Data.SqlClient.SqlConnection;
$con.ConnectionString = "Data Source=$Server;" +
 "Integrated Security=True;" +
 "Initial Catalog=$Database";
$con.Open(); 
 
# New Command and Reader
$cmd = New-Object Data.SqlClient.SqlCommand $Sql, $con;
$rd = $cmd.ExecuteReader(); 
 
# Create a byte array for the stream.
$out = [array]::CreateInstance('Byte', $bufferSize) 
 
# Looping through records
While ($rd.Read())
{
 Write-Output ("Exporting: {0}" -f $rd.GetString(0));
 # New BinaryWriter
 $fs = New-Object System.IO.FileStream (($Dest + $rd.GetString(0)), [System.IO.FileMode]::Create, [System.IO.FileAccess]::Write);
 $bw = New-Object System.IO.BinaryWriter $fs; 
 
 $start = 0;
 # Read first byte stream
 $received = $rd.GetBytes(1, $start, $out, 0, $bufferSize - 1);
 While ($received -gt 0)
 {
 $bw.Write($out, 0, $received);
 $bw.Flush();
 $start += $received;
 # Read next byte stream
 $received = $rd.GetBytes(1, $start, $out, 0, $bufferSize - 1);
 } 
 
 $bw.Close();
 $fs.Close();
} 
 
# Closing & Disposing all objects
$fs.Dispose();
$rd.Close();
$cmd.Dispose();
$con.Close(); 
 
Write-Output ("Finished");
answered Apr 19, 2024 at 16:58
  • This is... amazing. I am in awe. However, before taking this approach, I would look at using Azure Blob Storage; see here: sqlservercentral.com/articles/… If you have sysadmin access to the target SQL Server, that's all you need. Or, if you are in the same network, put up a share on YOUR machine and have SQL Server reach over to yours. Commented Apr 19, 2024 at 17:05

What about generating the tables and data as a script using SSMS?

answered Apr 19, 2024 at 7:23
  • That is a really good idea. But it doesn't include the data, just the tables. Commented Apr 19, 2024 at 13:01
  • There's an option in that GUI to include data (Generate Scripts → Advanced → "Types of data to script" → "Schema and data"). Commented Apr 19, 2024 at 16:45
