Download objects
This page shows you how to download objects from your buckets in Cloud Storage to persistent storage. You can also download objects into memory.
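If you want the object's contents in memory instead of on disk, the following is a minimal sketch using the Python client library; the bucket and object names are illustrative:

from google.cloud import storage

# Illustrative names; replace with your own bucket and object.
client = storage.Client()
blob = client.bucket("my-bucket").blob("pets/dog.png")
contents = blob.download_as_bytes()  # the full object payload, as bytes
print(f"Downloaded {len(contents)} bytes into memory.")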
Required roles
To get the permissions that you need to download objects, ask your
administrator to grant you the Storage Object Viewer
(roles/storage.objectViewer) role on the bucket. If you plan to use the
Google Cloud console, ask your administrator to grant you the Storage Admin
(roles/storage.admin) role on the bucket instead.
These roles contain the permissions required to download objects. The exact permissions that are required are listed under Required permissions:
Required permissions
- storage.buckets.list - This permission is only required for using the Google Cloud console to perform the tasks on this page.
- storage.objects.get
- storage.objects.list - This permission is only required for using the Google Cloud console to perform the tasks on this page.
You might also be able to get these permissions with other predefined roles or custom roles.
For instructions on granting roles on buckets, see Set and manage IAM policies on buckets.
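For example, an administrator could grant the role using the Google Cloud CLI; the bucket name and principal here are illustrative:

gcloud storage buckets add-iam-policy-binding gs://my-bucket \
  --member=user:jane@example.com \
  --role=roles/storage.objectViewer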
Download an object from a bucket
Complete the following instructions to download an object from a bucket:
Console
- In the Google Cloud console, go to the Cloud Storage Buckets page.
- In the list of buckets, click the name of the bucket that contains the object you want to download. The Bucket details page opens, with the Objects tab selected.
- Navigate to the object, which may be located in a folder.
- Click the Download icon associated with the object. Your browser settings control the download location for the object.
To learn how to get detailed error information about failed Cloud Storage operations in the Google Cloud console, see Troubleshooting.
Command line
Use the gcloud storage cp command:
gcloud storage cp gs://BUCKET_NAME/OBJECT_NAME SAVE_TO_LOCATION
Where:
- BUCKET_NAME is the name of the bucket containing the object you are downloading. For example, my-bucket.
- OBJECT_NAME is the name of the object you are downloading. For example, pets/dog.png.
- SAVE_TO_LOCATION is the local path where you are saving your object. For example, Desktop/Images.
If successful, the response looks like the following example:
Completed files 1/1 | 164.3kiB/164.3kiB
If your download is interrupted prior to completion, run the same cp
command to resume the download from where it left off.
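For example, using the sample values from the list above:

gcloud storage cp gs://my-bucket/pets/dog.png Desktop/Images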
Client libraries
For more information, see the Cloud Storage API reference documentation for your language: C++, C#, Go, Java, Node.js, PHP, Python, or Ruby.

To authenticate to Cloud Storage, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
C++
namespace gcs = ::google::cloud::storage;
[](gcs::Client client, std::string const& bucket_name,
   std::string const& object_name) {
  // Stream the object and count its newline-delimited lines as they arrive.
  gcs::ObjectReadStream stream = client.ReadObject(bucket_name, object_name);
  int count = 0;
  std::string line;
  while (std::getline(stream, line, '\n')) {
    ++count;
  }
  if (stream.bad()) throw google::cloud::Status(stream.status());
  std::cout << "The object has " << count << " lines\n";
}

C#
using Google.Cloud.Storage.V1;
using System;
using System.IO;

public class DownloadFileSample
{
    public void DownloadFile(
        string bucketName = "your-unique-bucket-name",
        string objectName = "my-file-name",
        string localPath = "my-local-path/my-file-name")
    {
        var storage = StorageClient.Create();
        // Stream the object directly into the local file.
        using var outputFile = File.OpenWrite(localPath);
        storage.DownloadObject(bucketName, objectName, outputFile);
        Console.WriteLine($"Downloaded {objectName} to {localPath}.");
    }
}
Go
import (
	"context"
	"fmt"
	"io"
	"os"
	"time"

	"cloud.google.com/go/storage"
)

// downloadFile downloads an object to a file.
func downloadFile(w io.Writer, bucket, object string, destFileName string) error {
	// bucket := "bucket-name"
	// object := "object-name"
	// destFileName := "file.txt"
	ctx := context.Background()
	client, err := storage.NewClient(ctx)
	if err != nil {
		return fmt.Errorf("storage.NewClient: %w", err)
	}
	defer client.Close()

	ctx, cancel := context.WithTimeout(ctx, time.Second*50)
	defer cancel()

	f, err := os.Create(destFileName)
	if err != nil {
		return fmt.Errorf("os.Create: %w", err)
	}

	rc, err := client.Bucket(bucket).Object(object).NewReader(ctx)
	if err != nil {
		return fmt.Errorf("Object(%q).NewReader: %w", object, err)
	}
	defer rc.Close()

	if _, err := io.Copy(f, rc); err != nil {
		return fmt.Errorf("io.Copy: %w", err)
	}

	if err = f.Close(); err != nil {
		return fmt.Errorf("f.Close: %w", err)
	}

	fmt.Fprintf(w, "Blob %v downloaded to local file %v\n", object, destFileName)
	return nil
}
Java
The following sample downloads an individual object:

import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import java.nio.file.Paths;

public class DownloadObject {
  public static void downloadObject(
      String projectId, String bucketName, String objectName, String destFilePath)
      throws Exception {
    // The ID of your GCP project
    // String projectId = "your-project-id";

    // The ID of your GCS bucket
    // String bucketName = "your-unique-bucket-name";

    // The ID of your GCS object
    // String objectName = "your-object-name";

    // The path to which the file should be downloaded
    // String destFilePath = "/local/path/to/file.txt";

    StorageOptions storageOptions = StorageOptions.newBuilder().setProjectId(projectId).build();
    try (Storage storage = storageOptions.getService()) {
      storage.downloadTo(BlobId.of(bucketName, objectName), Paths.get(destFilePath));

      System.out.println(
          "Downloaded object "
              + objectName
              + " from bucket name "
              + bucketName
              + " to "
              + destFilePath);
    }
  }
}

The following sample downloads multiple objects using multiple processes:

import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.transfermanager.DownloadResult;
import com.google.cloud.storage.transfermanager.ParallelDownloadConfig;
import com.google.cloud.storage.transfermanager.TransferManager;
import com.google.cloud.storage.transfermanager.TransferManagerConfig;
import java.nio.file.Path;
import java.util.List;

class DownloadMany {
  public static void downloadManyBlobs(
      String bucketName, List<BlobInfo> blobs, Path destinationDirectory) throws Exception {
    try (TransferManager transferManager =
        TransferManagerConfig.newBuilder().build().getService()) {
      ParallelDownloadConfig parallelDownloadConfig =
          ParallelDownloadConfig.newBuilder()
              .setBucketName(bucketName)
              .setDownloadDirectory(destinationDirectory)
              .build();

      List<DownloadResult> results =
          transferManager.downloadBlobs(blobs, parallelDownloadConfig).getDownloadResults();

      for (DownloadResult result : results) {
        System.out.println(
            "Download of "
                + result.getInput().getName()
                + " completed with status "
                + result.getStatus());
      }
    }
  }
}

The following sample downloads all objects in a bucket using multiple processes:

import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import com.google.cloud.storage.transfermanager.DownloadResult;
import com.google.cloud.storage.transfermanager.ParallelDownloadConfig;
import com.google.cloud.storage.transfermanager.TransferManager;
import com.google.cloud.storage.transfermanager.TransferManagerConfig;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Collectors;

class DownloadBucket {
  public static void downloadBucketContents(
      String projectId, String bucketName, Path destinationDirectory) {
    Storage storage = StorageOptions.newBuilder().setProjectId(projectId).build().getService();
    List<BlobInfo> blobs =
        storage
            .list(bucketName)
            .streamAll()
            .map(blob -> blob.asBlobInfo())
            .collect(Collectors.toList());
    TransferManager transferManager = TransferManagerConfig.newBuilder().build().getService();
    ParallelDownloadConfig parallelDownloadConfig =
        ParallelDownloadConfig.newBuilder()
            .setBucketName(bucketName)
            .setDownloadDirectory(destinationDirectory)
            .build();

    List<DownloadResult> results =
        transferManager.downloadBlobs(blobs, parallelDownloadConfig).getDownloadResults();

    for (DownloadResult result : results) {
      System.out.println(
          "Download of "
              + result.getInput().getName()
              + " completed with status "
              + result.getStatus());
    }
  }
}

Node.js
The following sample downloads an individual object:

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// The ID of your GCS bucket
// const bucketName = 'your-unique-bucket-name';

// The ID of your GCS file
// const fileName = 'your-file-name';

// The path to which the file should be downloaded
// const destFileName = '/local/path/to/file.txt';

// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

async function downloadFile() {
  const options = {
    destination: destFileName,
  };

  // Downloads the file
  await storage.bucket(bucketName).file(fileName).download(options);

  console.log(
    `gs://${bucketName}/${fileName} downloaded to ${destFileName}.`
  );
}

downloadFile().catch(console.error);

The following sample downloads multiple objects using multiple processes:

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// The ID of your GCS bucket
// const bucketName = 'your-unique-bucket-name';

// The ID of the first GCS file to download
// const firstFileName = 'your-first-file-name';

// The ID of the second GCS file to download
// const secondFileName = 'your-second-file-name';

// Imports the Google Cloud client library
const {Storage, TransferManager} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

// Creates a transfer manager client
const transferManager = new TransferManager(storage.bucket(bucketName));

async function downloadManyFilesWithTransferManager() {
  // Downloads the files
  await transferManager.downloadManyFiles([firstFileName, secondFileName]);

  for (const fileName of [firstFileName, secondFileName]) {
    console.log(`gs://${bucketName}/${fileName} downloaded to ${fileName}.`);
  }
}

downloadManyFilesWithTransferManager().catch(console.error);

The following sample downloads all objects with a common prefix using multiple processes:

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// The ID of your GCS bucket
// const bucketName = 'your-unique-bucket-name';

// The ID of the GCS folder to download. The folder will be downloaded to
// the local path of the executing code.
// const folderName = 'your-folder-name';

// Imports the Google Cloud client library
const {Storage, TransferManager} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

// Creates a transfer manager client
const transferManager = new TransferManager(storage.bucket(bucketName));

async function downloadFolderWithTransferManager() {
  // Downloads the folder
  await transferManager.downloadManyFiles(folderName);

  console.log(
    `gs://${bucketName}/${folderName} downloaded to ${folderName}.`
  );
}

downloadFolderWithTransferManager().catch(console.error);

PHP
use Google\Cloud\Storage\StorageClient;
/**
* Download an object from Cloud Storage and save it as a local file.
*
* @param string $bucketName The name of your Cloud Storage bucket.
* (e.g. 'my-bucket')
* @param string $objectName The name of your Cloud Storage object.
* (e.g. 'my-object')
* @param string $destination The local destination to save the object.
* (e.g. '/path/to/your/file')
*/
function download_object(string $bucketName, string $objectName, string $destination): void
{
$storage = new StorageClient();
$bucket = $storage->bucket($bucketName);
$object = $bucket->object($objectName);
$object->downloadToFile($destination);
printf(
'Downloaded gs://%s/%s to %s' . PHP_EOL,
$bucketName,
$objectName,
basename($destination)
);
}

Python
The following sample downloads an individual object:

from google.cloud import storage

def download_blob(bucket_name, source_blob_name, destination_file_name):
    """Downloads a blob from the bucket."""
    # The ID of your GCS bucket
    # bucket_name = "your-bucket-name"

    # The ID of your GCS object
    # source_blob_name = "storage-object-name"

    # The path to which the file should be downloaded
    # destination_file_name = "local/path/to/file"

    storage_client = storage.Client()

    bucket = storage_client.bucket(bucket_name)

    # Construct a client side representation of a blob.
    # Note `Bucket.blob` differs from `Bucket.get_blob` as it doesn't retrieve
    # any content from Google Cloud Storage. As we don't need additional data,
    # using `Bucket.blob` is preferred here.
    blob = bucket.blob(source_blob_name)
    blob.download_to_filename(destination_file_name)

    print(
        "Downloaded storage object {} from bucket {} to local file {}.".format(
            source_blob_name, bucket_name, destination_file_name
        )
    )

The following sample downloads multiple objects using multiple processes:

def download_many_blobs_with_transfer_manager(
    bucket_name, blob_names, destination_directory="", workers=8
):
    """Download blobs in a list by name, concurrently in a process pool.

    The filename of each blob once downloaded is derived from the blob name and
    the `destination_directory` parameter. For complete control of the filename
    of each blob, use transfer_manager.download_many() instead.

    Directories will be created automatically as needed to accommodate blob
    names that include slashes.
    """

    # The ID of your GCS bucket
    # bucket_name = "your-bucket-name"

    # The list of blob names to download. The name of each blob will also
    # be the name of each destination file (use transfer_manager.download_many()
    # instead to control each destination file name). If there is a "/" in the
    # blob name, then corresponding directories will be created on download.
    # blob_names = ["myblob", "myblob2"]

    # The directory on your computer to which to download all of the files. This
    # string is prepended (with os.path.join()) to the name of each blob to form
    # the full path. Relative paths and absolute paths are both accepted. An
    # empty string means "the current working directory". Note that this
    # parameter accepts directory traversal ("../" etc.) and is not
    # intended for unsanitized end user input.
    # destination_directory = ""

    # The maximum number of processes to use for the operation. The performance
    # impact of this value depends on the use case, but smaller files usually
    # benefit from a higher number of processes. Each additional process occupies
    # some CPU and memory resources until finished. Threads can be used instead
    # of processes by passing `worker_type=transfer_manager.THREAD`.
    # workers=8

    from google.cloud.storage import Client, transfer_manager

    storage_client = Client()
    bucket = storage_client.bucket(bucket_name)

    results = transfer_manager.download_many_to_path(
        bucket, blob_names, destination_directory=destination_directory, max_workers=workers
    )

    for name, result in zip(blob_names, results):
        # The results list is either `None` or an exception for each blob in
        # the input list, in order.

        if isinstance(result, Exception):
            print("Failed to download {} due to exception: {}".format(name, result))
        else:
            print("Downloaded {} to {}.".format(name, destination_directory + name))

The following sample downloads all objects in a bucket using multiple processes:

def download_bucket_with_transfer_manager(
    bucket_name, destination_directory="", workers=8, max_results=1000
):
    """Download all of the blobs in a bucket, concurrently in a process pool.

    The filename of each blob once downloaded is derived from the blob name and
    the `destination_directory` parameter. For complete control of the filename
    of each blob, use transfer_manager.download_many() instead.

    Directories will be created automatically as needed, for instance to
    accommodate blob names that include slashes.
    """

    # The ID of your GCS bucket
    # bucket_name = "your-bucket-name"

    # The directory on your computer to which to download all of the files. This
    # string is prepended (with os.path.join()) to the name of each blob to form
    # the full path. Relative paths and absolute paths are both accepted. An
    # empty string means "the current working directory". Note that this
    # parameter accepts directory traversal ("../" etc.) and is not
    # intended for unsanitized end user input.
    # destination_directory = ""

    # The maximum number of processes to use for the operation. The performance
    # impact of this value depends on the use case, but smaller files usually
    # benefit from a higher number of processes. Each additional process occupies
    # some CPU and memory resources until finished. Threads can be used instead
    # of processes by passing `worker_type=transfer_manager.THREAD`.
    # workers=8

    # The maximum number of results to fetch from bucket.list_blobs(). This
    # sample code fetches all of the blobs up to max_results and queues them all
    # for download at once. Though they will still be executed in batches up to
    # the processes limit, queueing them all at once can be taxing on system
    # memory if buckets are very large. Adjust max_results as needed for your
    # system environment, or set it to None if you are sure the bucket is not
    # too large to hold in memory easily.
    # max_results = 1000

    from google.cloud.storage import Client, transfer_manager

    storage_client = Client()
    bucket = storage_client.bucket(bucket_name)

    blob_names = [blob.name for blob in bucket.list_blobs(max_results=max_results)]

    results = transfer_manager.download_many_to_path(
        bucket, blob_names, destination_directory=destination_directory, max_workers=workers
    )

    for name, result in zip(blob_names, results):
        # The results list is either `None` or an exception for each blob in
        # the input list, in order.

        if isinstance(result, Exception):
            print("Failed to download {} due to exception: {}".format(name, result))
        else:
            print("Downloaded {} to {}.".format(name, destination_directory + name))

Ruby
def download_file bucket_name:, file_name:, local_file_path:
  # The ID of your GCS bucket
  # bucket_name = "your-unique-bucket-name"

  # The ID of your GCS object
  # file_name = "your-file-name"

  # The path to which the file should be downloaded
  # local_file_path = "/local/path/to/file.txt"

  require "google/cloud/storage"

  storage = Google::Cloud::Storage.new
  bucket = storage.bucket bucket_name, skip_lookup: true

  file = bucket.file file_name
  file.download local_file_path

  puts "Downloaded #{file.name} to #{local_file_path}"
end
REST APIs
JSON API
Have the gcloud CLI installed and initialized, which lets you generate an access token for the Authorization header.

Use cURL to call the JSON API with a GET Object request:

curl -X GET \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -o "SAVE_TO_LOCATION" \
  "https://storage.googleapis.com/storage/v1/b/BUCKET_NAME/o/OBJECT_NAME?alt=media"
Where:
- SAVE_TO_LOCATION is the path to the location where you want to save your object. For example, Desktop/dog.png.
- BUCKET_NAME is the name of the bucket containing the object you are downloading. For example, my-bucket.
- OBJECT_NAME is the URL-encoded name of the object you are downloading. For example, pets/dog.png, URL-encoded as pets%2Fdog.png (see the encoding sketch below).
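The URL-encoding requirement above matters for object names that contain slashes. As a minimal sketch, Python's standard library produces the encoded form; the object name here is the example from the list:

from urllib.parse import quote

# safe="" forces "/" to be encoded as %2F instead of being left
# as a path separator.
object_name = "pets/dog.png"
print(quote(object_name, safe=""))  # prints: pets%2Fdog.png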
XML API
Have the gcloud CLI installed and initialized, which lets you generate an access token for the Authorization header.

Use cURL to call the XML API with a GET Object request:

curl -X GET \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -o "SAVE_TO_LOCATION" \
  "https://storage.googleapis.com/BUCKET_NAME/OBJECT_NAME"
Where:
- SAVE_TO_LOCATION is the path to the location where you want to save your object. For example, Desktop/dog.png.
- BUCKET_NAME is the name of the bucket containing the object you are downloading. For example, my-bucket.
- OBJECT_NAME is the URL-encoded name of the object you are downloading. For example, pets/dog.png, URL-encoded as pets%2Fdog.png.
To download all objects in a bucket or subdirectory more efficiently, use the
gcloud storage cp command with the --recursive flag, or a client library:

gcloud storage cp --recursive gs://BUCKET_NAME/FOLDER_NAME .
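For example, to download every object under the pets/ prefix of my-bucket into the current directory (names are illustrative):

gcloud storage cp --recursive gs://my-bucket/pets .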
Download a portion of an object
If your download gets interrupted, you can resume where you left off by requesting only the portion of the object that's left. Complete the following instructions to download a portion of an object.
Console
The Google Cloud console does not support downloading portions of an object. Use the gcloud CLI instead.
Command line
The Google Cloud CLI automatically attempts to resume interrupted downloads,
except when performing streaming downloads. If your download is
interrupted, a partially downloaded temporary file becomes visible in
the destination hierarchy. Run the same cp command to resume the
download from where it left off.

When the download is complete, the temporary file is deleted and
replaced with the downloaded contents. Temporary files are stored in a
configurable location, which by default is in the user's home directory
under .config/gcloud/surface_data/storage/tracker_files. You can view
the location where temporary files are stored by running
gcloud config get storage/tracker_files_directory, and change it by
running gcloud config set storage/tracker_files_directory.
Client libraries
For more information, see the Cloud Storage API reference documentation for your language: C++, C#, Go, Java, Node.js, PHP, Python, or Ruby.

To authenticate to Cloud Storage, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
C++
namespace gcs = ::google::cloud::storage;
[](gcs::Client client, std::string const& bucket_name,
   std::string const& object_name, std::int64_t start, std::int64_t end) {
  // Read only the requested byte range of the object.
  gcs::ObjectReadStream stream =
      client.ReadObject(bucket_name, object_name, gcs::ReadRange(start, end));
  int count = 0;
  std::string line;
  while (std::getline(stream, line, '\n')) {
    std::cout << line << "\n";
    ++count;
  }
  if (stream.bad()) throw google::cloud::Status(stream.status());
  std::cout << "The requested range has " << count << " lines\n";
}

C#
using Google.Apis.Storage.v1;
using Google.Cloud.Storage.V1;
using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public class DownloadByteRangeAsyncSample
{
    public async Task DownloadByteRangeAsync(
        string bucketName = "your-unique-bucket-name",
        string objectName = "my-file-name",
        long firstByte = 0,
        long lastByte = 20,
        string localPath = "my-local-path/my-file-name")
    {
        var storageClient = StorageClient.Create();

        // Create an HTTP request for the media, for a limited byte range.
        StorageService storage = storageClient.Service;
        var uri = new Uri($"{storage.BaseUri}b/{bucketName}/o/{objectName}?alt=media");

        var request = new HttpRequestMessage { RequestUri = uri };
        request.Headers.Range = new RangeHeaderValue(firstByte, lastByte);
        using var outputFile = File.OpenWrite(localPath);

        // Use the HttpClient in the storage object because it supplies
        // all the authentication headers we need.
        var response = await storage.HttpClient.SendAsync(request);
        await response.Content.CopyToAsync(outputFile, null);
        Console.WriteLine($"Downloaded {objectName} to {localPath}.");
    }
}

Go
import (
	"context"
	"fmt"
	"io"
	"os"
	"time"

	"cloud.google.com/go/storage"
)

// downloadByteRange downloads a specific byte range of an object to a file.
func downloadByteRange(w io.Writer, bucket, object string, startByte int64, endByte int64, destFileName string) error {
	// bucket := "bucket-name"
	// object := "object-name"
	// startByte := 0
	// endByte := 20
	// destFileName := "file.txt"
	ctx := context.Background()
	client, err := storage.NewClient(ctx)
	if err != nil {
		return fmt.Errorf("storage.NewClient: %w", err)
	}
	defer client.Close()

	ctx, cancel := context.WithTimeout(ctx, time.Second*50)
	defer cancel()

	f, err := os.Create(destFileName)
	if err != nil {
		return fmt.Errorf("os.Create: %w", err)
	}

	length := endByte - startByte

	rc, err := client.Bucket(bucket).Object(object).NewRangeReader(ctx, startByte, length)
	if err != nil {
		return fmt.Errorf("Object(%q).NewRangeReader: %w", object, err)
	}
	defer rc.Close()

	if _, err := io.Copy(f, rc); err != nil {
		return fmt.Errorf("io.Copy: %w", err)
	}

	if err = f.Close(); err != nil {
		return fmt.Errorf("f.Close: %w", err)
	}

	fmt.Fprintf(w, "Bytes %v to %v of blob %v downloaded to local file %v\n", startByte, startByte+length, object, destFileName)
	return nil
}
Java
import com.google.cloud.ReadChannel;
import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import com.google.common.io.ByteStreams;
import java.io.IOException;
import java.nio.channels.FileChannel;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class DownloadByteRange {

  public static void downloadByteRange(
      String projectId,
      String bucketName,
      String blobName,
      long startByte,
      long endBytes,
      String destFileName)
      throws IOException {
    // The ID of your GCP project
    // String projectId = "your-project-id";

    // The ID of your GCS bucket
    // String bucketName = "your-unique-bucket-name";

    // The name of the blob/file that you wish to download
    // String blobName = "your-blob-name";

    // The starting byte at which to begin the download
    // long startByte = 0;

    // The ending byte at which to end the download
    // long endByte = 20;

    // The path to which the file should be downloaded
    // String destFileName = "/local/path/to/file.txt";

    Storage storage = StorageOptions.newBuilder().setProjectId(projectId).build().getService();

    BlobId blobId = BlobId.of(bucketName, blobName);
    try (ReadChannel from = storage.reader(blobId);
        FileChannel to = FileChannel.open(Paths.get(destFileName), StandardOpenOption.WRITE)) {
      from.seek(startByte);
      from.limit(endBytes);

      ByteStreams.copy(from, to);
      System.out.printf(
          "%s downloaded to %s from byte %d to byte %d%n",
          blobId.toGsUtilUri(), destFileName, startByte, endBytes);
    }
  }
}

Node.js
/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// The ID of your GCS bucket
// const bucketName = 'your-unique-bucket-name';

// The ID of your GCS file
// const fileName = 'your-file-name';

// The starting byte at which to begin the download
// const startByte = 0;

// The ending byte at which to end the download
// const endByte = 20;

// The path to which the file should be downloaded
// const destFileName = '/local/path/to/file.txt';

// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

async function downloadByteRange() {
  const options = {
    destination: destFileName,
    start: startByte,
    end: endByte,
  };

  // Downloads the file from the starting byte to the ending byte specified in options
  await storage.bucket(bucketName).file(fileName).download(options);

  console.log(
    `gs://${bucketName}/${fileName} downloaded to ${destFileName} from byte ${startByte} to byte ${endByte}.`
  );
}

downloadByteRange();

PHP
use Google\Cloud\Storage\StorageClient;
/**
* Download a byte range from Cloud Storage and save it as a local file.
*
* @param string $bucketName The name of your Cloud Storage bucket.
* (e.g. 'my-bucket')
* @param string $objectName The name of your Cloud Storage object.
* (e.g. 'my-object')
* @param int $startByte The starting byte at which to begin the download.
* (e.g. 1)
* @param int $endByte The ending byte at which to end the download. (e.g. 5)
* @param string $destination The local destination to save the object.
* (e.g. '/path/to/your/file')
*/
function download_byte_range(
string $bucketName,
string $objectName,
int $startByte,
int $endByte,
string $destination
): void {
$storage = new StorageClient();
$bucket = $storage->bucket($bucketName);
$object = $bucket->object($objectName);
$object->downloadToFile($destination, [
'restOptions' => [
'headers' => [
'Range' => "bytes=$startByte-$endByte",
],
],
]);
printf(
'Downloaded gs://%s/%s to %s' . PHP_EOL,
$bucketName,
$objectName,
basename($destination)
);
}

Python
from google.cloud import storage

def download_byte_range(
    bucket_name, source_blob_name, start_byte, end_byte, destination_file_name
):
    """Downloads a byte range of a blob from the bucket."""
    # The ID of your GCS bucket
    # bucket_name = "your-bucket-name"

    # The ID of your GCS object
    # source_blob_name = "storage-object-name"

    # The starting byte at which to begin the download
    # start_byte = 0

    # The ending byte at which to end the download
    # end_byte = 20

    # The path to which the file should be downloaded
    # destination_file_name = "local/path/to/file"

    storage_client = storage.Client()

    bucket = storage_client.bucket(bucket_name)

    # Construct a client side representation of a blob.
    # Note `Bucket.blob` differs from `Bucket.get_blob` as it doesn't retrieve
    # any content from Google Cloud Storage. As we don't need additional data,
    # using `Bucket.blob` is preferred here.
    blob = bucket.blob(source_blob_name)
    blob.download_to_filename(destination_file_name, start=start_byte, end=end_byte)

    print(
        "Downloaded bytes {} to {} of object {} from bucket {} to local file {}.".format(
            start_byte, end_byte, source_blob_name, bucket_name, destination_file_name
        )
    )
Ruby
# The ID of your GCS bucket
# bucket_name = "your-unique-bucket-name"
# file_name = "Name of a file in the Storage bucket"

# The starting byte at which to begin the download
# start_byte = 0

# The ending byte at which to end the download
# end_byte = 20

# The path to which the file should be downloaded
# local_file_path = "/local/path/to/file.txt"

require "google/cloud/storage"

storage = Google::Cloud::Storage.new
bucket = storage.bucket bucket_name
file = bucket.file file_name

file.download local_file_path, range: start_byte..end_byte

puts "Downloaded bytes #{start_byte} to #{end_byte} of object #{file_name} from bucket #{bucket_name}" \
  + " to local file #{local_file_path}."
REST APIs
JSON API
Use the Range header in your request to download a portion of
an object.
Have the gcloud CLI installed and initialized, which lets you generate an access token for the Authorization header.

Use cURL to call the JSON API with a GET Object request:

curl -X GET \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Range: bytes=FIRST_BYTE-LAST_BYTE" \
  -o "SAVE_TO_LOCATION" \
  "https://storage.googleapis.com/storage/v1/b/BUCKET_NAME/o/OBJECT_NAME?alt=media"
Where:
- FIRST_BYTE is the first byte in the range of bytes you want to download. For example, 1000.
- LAST_BYTE is the last byte in the range of bytes you want to download. For example, 1999.
- SAVE_TO_LOCATION is the path to the location where you want to save your object. For example, Desktop/dog.png.
- BUCKET_NAME is the name of the bucket containing the object you are downloading. For example, my-bucket.
- OBJECT_NAME is the URL-encoded name of the object you are downloading. For example, pets/dog.png, URL-encoded as pets%2Fdog.png.
XML API
Use the Range header in your request to download a portion of
an object.
Have the gcloud CLI installed and initialized, which lets you generate an access token for the Authorization header.

Use cURL to call the XML API with a GET Object request:

curl -X GET \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Range: bytes=FIRST_BYTE-LAST_BYTE" \
  -o "SAVE_TO_LOCATION" \
  "https://storage.googleapis.com/BUCKET_NAME/OBJECT_NAME"
Where:
- FIRST_BYTE is the first byte in the range of bytes you want to download. For example, 1000.
- LAST_BYTE is the last byte in the range of bytes you want to download. For example, 1999.
- SAVE_TO_LOCATION is the path to the location where you want to save your object. For example, $HOME/Desktop/dog.png.
- BUCKET_NAME is the name of the bucket containing the object you are downloading. For example, my-bucket.
- OBJECT_NAME is the URL-encoded name of the object you are downloading. For example, pets/dog.png, URL-encoded as pets%2Fdog.png.
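To tie the Range header back to resuming: the first byte you still need is simply the size of the partial local file. The following is a minimal sketch using the Python client, assuming a partial file already exists locally and its contents match the start of the object; the names are illustrative:

import os

from google.cloud import storage

client = storage.Client()
blob = client.bucket("my-bucket").blob("pets/dog.png")

local_path = "dog.png"
start = os.path.getsize(local_path)  # first byte still missing

with open(local_path, "ab") as f:
    # download_to_file accepts start/end offsets, equivalent to sending
    # a Range header; this appends only the remaining bytes.
    blob.download_to_file(f, start=start)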
What's next
- Read the conceptual overview for uploading and downloading, including advanced download strategies.
- Transfer data from cloud providers or other online sources, such as URL lists.
- Transfer objects to your Compute Engine instance.
- Learn how you can bill Cloud Storage access charges to requesters.
- Learn how Cloud Storage can serve gzipped files in an uncompressed state.