Migrating to the Storage Transfer Service Cloud Client Library
To ensure high quality and consistency with our other Cloud libraries, the documentation for Storage Transfer Service now uses the Cloud Client Libraries instead of the Google API Client Libraries. See Client libraries explained for more information about the two options.
The Google API Client Library continues to receive updates, but is no longer referenced in the documentation.
This guide covers the main differences as they apply to using Storage Transfer Service, and provides instructions on updating your clients when migrating to the Cloud Client Library.
Java
Updating dependencies
To switch to the new library, replace your dependency on
google-api-services-storagetransfer with google-cloud-storage-transfer.
If you are using Maven without BOM, add this to your dependencies:

<dependency>
  <groupId>com.google.cloud</groupId>
  <artifactId>google-cloud-storage-transfer</artifactId>
  <version>0.2.3</version>
</dependency>
If you are using Gradle without BOM, add this to your dependencies:
implementation 'com.google.cloud:google-cloud-storage-transfer:0.2.3'
If you are using Maven with BOM, add the following to your pom.xml:

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.cloud</groupId>
      <artifactId>libraries-bom</artifactId>
      <version>24.1.0</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>

<dependencies>
  <dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-storage-transfer</artifactId>
  </dependency>
</dependencies>
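If you are using Gradle with BOM, the equivalent setup declares the BOM as a platform dependency. A minimal sketch, assuming the same BOM version as the Maven example above:

implementation platform('com.google.cloud:libraries-bom:24.1.0')
implementation 'com.google.cloud:google-cloud-storage-transfer'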
For the most part, code can be converted from the API Client Library to the Cloud Client Library in a straightforward way. The following are some key differences between the two Java clients:
Client instantiation
The Cloud Client Library reduces a lot of the boilerplate associated with client instantiation by handling it behind the scenes.
API Client Library
GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
if (credential.createScopedRequired()) {
  credential = credential.createScoped(StoragetransferScopes.all());
}
Storagetransfer storageTransfer = new Storagetransfer.Builder(Utils.getDefaultTransport(),
    Utils.getDefaultJsonFactory(), new HttpCredentialsAdapter(credential))
    .build();
Cloud Client Library
StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();
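Generated Cloud clients hold background resources such as gRPC channels, so close the client when you are done with it. A minimal sketch, relying on the client implementing AutoCloseable:

// Closing the client releases its underlying channels and threads.
try (StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create()) {
  // ... issue requests through storageTransfer ...
}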
Builders for model classes
Model classes in the Cloud Client Library use builders instead of constructors.
API Client Library
TransferJob transferJob =
    new TransferJob()
        .setStatus("ENABLED");
Cloud Client Library
TransferJob transferJob =
    TransferJob.newBuilder()
        .setStatus(Status.ENABLED)
        .build();
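Built messages are immutable, so modifying an existing instance means round-tripping through a builder. A short sketch using the standard protocol buffer toBuilder() method:

// Derive a modified copy of an existing job; the original is unchanged.
TransferJob disabledJob =
    transferJob.toBuilder()
        .setStatus(Status.DISABLED)
        .build();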
List operations return iterables
List operations in the Cloud Client Library return simple iterables instead of the paginated results in the API Client Library.
API Client Library
public class StoragetransferExample {
  public static void main(String args[]) throws IOException, GeneralSecurityException {
    Storagetransfer storagetransferService = createStoragetransferService();
    Storagetransfer.TransferJobs.List request = storagetransferService.transferJobs().list();
    ListTransferJobsResponse response;
    do {
      response = request.execute();
      if (response.getTransferJobs() == null) {
        continue;
      }
      for (TransferJob transferJob : response.getTransferJobs()) {
        System.out.println(transferJob);
      }
      request.setPageToken(response.getNextPageToken());
    } while (response.getNextPageToken() != null);
  }

  public static Storagetransfer createStoragetransferService()
      throws IOException, GeneralSecurityException {
    HttpTransport httpTransport = GoogleNetHttpTransport.newTrustedTransport();
    JsonFactory jsonFactory = JacksonFactory.getDefaultInstance();
    GoogleCredential credential = GoogleCredential.getApplicationDefault();

    return new Storagetransfer.Builder(httpTransport, jsonFactory, credential)
        .build();
  }
}
Cloud Client Library
public class StoragetransferExample {
  public static void main(String args[]) throws Exception {
    StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();
    ListTransferJobsRequest request = ListTransferJobsRequest.newBuilder().build();
    for (TransferJob job : storageTransfer.listTransferJobs(request).iterateAll()) {
      System.out.println(job);
    }
  }
}
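If you still need page-level access, for example to log page boundaries, the paged response also exposes the pages themselves. A sketch, assuming the page classes generated alongside the client:

// Iterate page by page instead of element by element.
for (StorageTransferServiceClient.ListTransferJobsPage page :
    storageTransfer.listTransferJobs(request).iteratePages()) {
  for (TransferJob job : page.getValues()) {
    System.out.println(job);
  }
}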
Sample comparisons
Below, the old API Client Library samples are shown alongside their equivalents using the Cloud Client Library. If you have used these samples before, this comparison shows how to move your code to the new Cloud Client Library.
Transfer from Amazon S3
API Client Library
import com.google.api.client.googleapis.util.Utils;
import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.StoragetransferScopes;
import com.google.api.services.storagetransfer.v1.model.AwsAccessKey;
import com.google.api.services.storagetransfer.v1.model.AwsS3Data;
import com.google.api.services.storagetransfer.v1.model.Date;
import com.google.api.services.storagetransfer.v1.model.GcsData;
import com.google.api.services.storagetransfer.v1.model.Schedule;
import com.google.api.services.storagetransfer.v1.model.TimeOfDay;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.api.services.storagetransfer.v1.model.TransferSpec;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;
import java.util.Calendar;

public class TransferFromAwsApiary {

  // Creates a one-off transfer job from Amazon S3 to Google Cloud Storage.
  public static void transferFromAws(
      String projectId,
      String jobDescription,
      String awsSourceBucket,
      String gcsSinkBucket,
      long startDateTime)
      throws IOException {

    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // A short description of this job
    // String jobDescription = "Sample transfer job from S3 to GCS.";

    // The name of the source AWS bucket to transfer data from
    // String awsSourceBucket = "yourAwsSourceBucket";

    // The name of the GCS bucket to transfer data to
    // String gcsSinkBucket = "your-gcs-bucket";

    // What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
    // If this is in the past relative to when the job is created, it will run the next day.
    // long startDateTime =
    //     new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();

    // The ID used to access your AWS account. Should be accessed via environment variable.
    String awsAccessKeyId = System.getenv("AWS_ACCESS_KEY_ID");

    // The Secret Key used to access your AWS account. Should be accessed via environment variable.
    String awsSecretAccessKey = System.getenv("AWS_SECRET_ACCESS_KEY");

    // Set up source and sink
    TransferSpec transferSpec =
        new TransferSpec()
            .setAwsS3DataSource(
                new AwsS3Data()
                    .setBucketName(awsSourceBucket)
                    .setAwsAccessKey(
                        new AwsAccessKey()
                            .setAccessKeyId(awsAccessKeyId)
                            .setSecretAccessKey(awsSecretAccessKey)))
            .setGcsDataSink(new GcsData().setBucketName(gcsSinkBucket));

    // Parse epoch timestamp into the model classes
    Calendar startCalendar = Calendar.getInstance();
    startCalendar.setTimeInMillis(startDateTime);
    // Note that this is a Date from the model class package, not a java.util.Date
    Date startDate =
        new Date()
            .setYear(startCalendar.get(Calendar.YEAR))
            .setMonth(startCalendar.get(Calendar.MONTH) + 1)
            .setDay(startCalendar.get(Calendar.DAY_OF_MONTH));
    TimeOfDay startTime =
        new TimeOfDay()
            .setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
            .setMinutes(startCalendar.get(Calendar.MINUTE))
            .setSeconds(startCalendar.get(Calendar.SECOND));
    Schedule schedule =
        new Schedule()
            .setScheduleStartDate(startDate)
            .setScheduleEndDate(startDate)
            .setStartTimeOfDay(startTime);

    // Set up the transfer job
    TransferJob transferJob =
        new TransferJob()
            .setDescription(jobDescription)
            .setProjectId(projectId)
            .setTransferSpec(transferSpec)
            .setSchedule(schedule)
            .setStatus("ENABLED");

    // Create a Transfer Service client
    GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
    if (credential.createScopedRequired()) {
      credential = credential.createScoped(StoragetransferScopes.all());
    }
    Storagetransfer storageTransfer =
        new Storagetransfer.Builder(
                Utils.getDefaultTransport(),
                Utils.getDefaultJsonFactory(),
                new HttpCredentialsAdapter(credential))
            .build();

    // Create the transfer job
    TransferJob response = storageTransfer.transferJobs().create(transferJob).execute();

    System.out.println("Created transfer job from AWS to GCS:");
    System.out.println(response.toPrettyString());
  }
}
Cloud Client Library
import com.google.storagetransfer.v1.proto.StorageTransferServiceClient;
import com.google.storagetransfer.v1.proto.TransferProto.CreateTransferJobRequest;
import com.google.storagetransfer.v1.proto.TransferTypes.AwsAccessKey;
import com.google.storagetransfer.v1.proto.TransferTypes.AwsS3Data;
import com.google.storagetransfer.v1.proto.TransferTypes.GcsData;
import com.google.storagetransfer.v1.proto.TransferTypes.Schedule;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob.Status;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferSpec;
import com.google.type.Date;
import com.google.type.TimeOfDay;
import java.io.IOException;
import java.util.Calendar;

public class TransferFromAws {

  // Creates a one-off transfer job from Amazon S3 to Google Cloud Storage.
  public static void transferFromAws(
      String projectId,
      String jobDescription,
      String awsSourceBucket,
      String gcsSinkBucket,
      long startDateTime)
      throws IOException {

    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // A short description of this job
    // String jobDescription = "Sample transfer job from S3 to GCS.";

    // The name of the source AWS bucket to transfer data from
    // String awsSourceBucket = "yourAwsSourceBucket";

    // The name of the GCS bucket to transfer data to
    // String gcsSinkBucket = "your-gcs-bucket";

    // What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
    // If this is in the past relative to when the job is created, it will run the next day.
    // long startDateTime =
    //     new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();

    // The ID used to access your AWS account. Should be accessed via environment variable.
    String awsAccessKeyId = System.getenv("AWS_ACCESS_KEY_ID");

    // The Secret Key used to access your AWS account. Should be accessed via environment variable.
    String awsSecretAccessKey = System.getenv("AWS_SECRET_ACCESS_KEY");

    // Set up source and sink
    TransferSpec transferSpec =
        TransferSpec.newBuilder()
            .setAwsS3DataSource(
                AwsS3Data.newBuilder()
                    .setBucketName(awsSourceBucket)
                    .setAwsAccessKey(
                        AwsAccessKey.newBuilder()
                            .setAccessKeyId(awsAccessKeyId)
                            .setSecretAccessKey(awsSecretAccessKey)))
            .setGcsDataSink(GcsData.newBuilder().setBucketName(gcsSinkBucket))
            .build();

    // Parse epoch timestamp into the model classes
    Calendar startCalendar = Calendar.getInstance();
    startCalendar.setTimeInMillis(startDateTime);
    // Note that this is a Date from the model class package, not a java.util.Date
    Date startDate =
        Date.newBuilder()
            .setYear(startCalendar.get(Calendar.YEAR))
            .setMonth(startCalendar.get(Calendar.MONTH) + 1)
            .setDay(startCalendar.get(Calendar.DAY_OF_MONTH))
            .build();
    TimeOfDay startTime =
        TimeOfDay.newBuilder()
            .setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
            .setMinutes(startCalendar.get(Calendar.MINUTE))
            .setSeconds(startCalendar.get(Calendar.SECOND))
            .build();
    Schedule schedule =
        Schedule.newBuilder()
            .setScheduleStartDate(startDate)
            .setScheduleEndDate(startDate)
            .setStartTimeOfDay(startTime)
            .build();

    // Set up the transfer job
    TransferJob transferJob =
        TransferJob.newBuilder()
            .setDescription(jobDescription)
            .setProjectId(projectId)
            .setTransferSpec(transferSpec)
            .setSchedule(schedule)
            .setStatus(Status.ENABLED)
            .build();

    // Create a Transfer Service client
    StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();

    // Create the transfer job
    TransferJob response =
        storageTransfer.createTransferJob(
            CreateTransferJobRequest.newBuilder().setTransferJob(transferJob).build());

    System.out.println("Created transfer job from AWS to GCS:");
    System.out.println(response.toString());
  }
}
Transfer to nearline
API Client Library
import com.google.api.client.googleapis.util.Utils;
import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.StoragetransferScopes;
import com.google.api.services.storagetransfer.v1.model.Date;
import com.google.api.services.storagetransfer.v1.model.GcsData;
import com.google.api.services.storagetransfer.v1.model.ObjectConditions;
import com.google.api.services.storagetransfer.v1.model.Schedule;
import com.google.api.services.storagetransfer.v1.model.TimeOfDay;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.api.services.storagetransfer.v1.model.TransferOptions;
import com.google.api.services.storagetransfer.v1.model.TransferSpec;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;
import java.util.Calendar;

public class TransferToNearlineApiary {
  /**
   * Creates a one-off transfer job that transfers objects in a standard GCS bucket that are more
   * than 30 days old to a Nearline GCS bucket.
   */
  public static void transferToNearlineApiary(
      String projectId,
      String jobDescription,
      String gcsSourceBucket,
      String gcsNearlineSinkBucket,
      long startDateTime)
      throws IOException {

    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // A short description of this job
    // String jobDescription = "Sample transfer job of old objects to a Nearline GCS bucket.";

    // The name of the source GCS bucket to transfer data from
    // String gcsSourceBucket = "your-gcs-source-bucket";

    // The name of the Nearline GCS bucket to transfer old objects to
    // String gcsSinkBucket = "your-nearline-gcs-bucket";

    // What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
    // If this is in the past relative to when the job is created, it will run the next day.
    // long startDateTime =
    //     new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();

    // Parse epoch timestamp into the model classes
    Calendar startCalendar = Calendar.getInstance();
    startCalendar.setTimeInMillis(startDateTime);
    // Note that this is a Date from the model class package, not a java.util.Date
    Date date =
        new Date()
            .setYear(startCalendar.get(Calendar.YEAR))
            .setMonth(startCalendar.get(Calendar.MONTH) + 1)
            .setDay(startCalendar.get(Calendar.DAY_OF_MONTH));
    TimeOfDay time =
        new TimeOfDay()
            .setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
            .setMinutes(startCalendar.get(Calendar.MINUTE))
            .setSeconds(startCalendar.get(Calendar.SECOND));

    TransferJob transferJob =
        new TransferJob()
            .setDescription(jobDescription)
            .setProjectId(projectId)
            .setTransferSpec(
                new TransferSpec()
                    .setGcsDataSource(new GcsData().setBucketName(gcsSourceBucket))
                    .setGcsDataSink(new GcsData().setBucketName(gcsNearlineSinkBucket))
                    .setObjectConditions(
                        new ObjectConditions()
                            .setMinTimeElapsedSinceLastModification("2592000s" /* 30 days */))
                    .setTransferOptions(
                        new TransferOptions().setDeleteObjectsFromSourceAfterTransfer(true)))
            .setSchedule(new Schedule().setScheduleStartDate(date).setStartTimeOfDay(time))
            .setStatus("ENABLED");

    // Create a Transfer Service client
    GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
    if (credential.createScopedRequired()) {
      credential = credential.createScoped(StoragetransferScopes.all());
    }
    Storagetransfer storageTransfer =
        new Storagetransfer.Builder(
                Utils.getDefaultTransport(),
                Utils.getDefaultJsonFactory(),
                new HttpCredentialsAdapter(credential))
            .build();

    // Create the transfer job
    TransferJob response = storageTransfer.transferJobs().create(transferJob).execute();

    System.out.println("Created transfer job from standard bucket to Nearline bucket:");
    System.out.println(response.toPrettyString());
  }
}
Cloud Client Library
import com.google.protobuf.Duration;
import com.google.storagetransfer.v1.proto.StorageTransferServiceClient;
import com.google.storagetransfer.v1.proto.TransferProto.CreateTransferJobRequest;
import com.google.storagetransfer.v1.proto.TransferTypes.GcsData;
import com.google.storagetransfer.v1.proto.TransferTypes.ObjectConditions;
import com.google.storagetransfer.v1.proto.TransferTypes.Schedule;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob.Status;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferOptions;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferSpec;
import com.google.type.Date;
import com.google.type.TimeOfDay;
import java.io.IOException;
import java.util.Calendar;

public class TransferToNearline {
  /**
   * Creates a one-off transfer job that transfers objects in a standard GCS bucket that are more
   * than 30 days old to a Nearline GCS bucket.
   */
  public static void transferToNearline(
      String projectId,
      String jobDescription,
      String gcsSourceBucket,
      String gcsNearlineSinkBucket,
      long startDateTime)
      throws IOException {

    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // A short description of this job
    // String jobDescription = "Sample transfer job of old objects to a Nearline GCS bucket.";

    // The name of the source GCS bucket to transfer data from
    // String gcsSourceBucket = "your-gcs-source-bucket";

    // The name of the Nearline GCS bucket to transfer old objects to
    // String gcsSinkBucket = "your-nearline-gcs-bucket";

    // What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
    // If this is in the past relative to when the job is created, it will run the next day.
    // long startDateTime =
    //     new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();

    // Parse epoch timestamp into the model classes
    Calendar startCalendar = Calendar.getInstance();
    startCalendar.setTimeInMillis(startDateTime);
    // Note that this is a Date from the model class package, not a java.util.Date
    Date date =
        Date.newBuilder()
            .setYear(startCalendar.get(Calendar.YEAR))
            .setMonth(startCalendar.get(Calendar.MONTH) + 1)
            .setDay(startCalendar.get(Calendar.DAY_OF_MONTH))
            .build();
    TimeOfDay time =
        TimeOfDay.newBuilder()
            .setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
            .setMinutes(startCalendar.get(Calendar.MINUTE))
            .setSeconds(startCalendar.get(Calendar.SECOND))
            .build();

    TransferJob transferJob =
        TransferJob.newBuilder()
            .setDescription(jobDescription)
            .setProjectId(projectId)
            .setTransferSpec(
                TransferSpec.newBuilder()
                    .setGcsDataSource(GcsData.newBuilder().setBucketName(gcsSourceBucket))
                    .setGcsDataSink(GcsData.newBuilder().setBucketName(gcsNearlineSinkBucket))
                    .setObjectConditions(
                        ObjectConditions.newBuilder()
                            .setMinTimeElapsedSinceLastModification(
                                Duration.newBuilder().setSeconds(2592000 /* 30 days */)))
                    .setTransferOptions(
                        TransferOptions.newBuilder().setDeleteObjectsFromSourceAfterTransfer(true)))
            .setSchedule(Schedule.newBuilder().setScheduleStartDate(date).setStartTimeOfDay(time))
            .setStatus(Status.ENABLED)
            .build();

    // Create a Transfer Service client
    StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();

    // Create the transfer job
    TransferJob response =
        storageTransfer.createTransferJob(
            CreateTransferJobRequest.newBuilder().setTransferJob(transferJob).build());

    System.out.println("Created transfer job from standard bucket to Nearline bucket:");
    System.out.println(response.toString());
  }
}
Check latest transfer operation
API Client Library
import com.google.api.client.googleapis.util.Utils;
import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.StoragetransferScopes;
import com.google.api.services.storagetransfer.v1.model.Operation;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;

public class CheckLatestTransferOperationApiary {

  // Gets the requested transfer job and checks its latest operation
  public static void checkLatestTransferOperationApiary(String projectId, String jobName)
      throws IOException {

    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // The name of the job to check
    // String jobName = "myJob/1234567890";

    // Create Storage Transfer client
    GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
    if (credential.createScopedRequired()) {
      credential = credential.createScoped(StoragetransferScopes.all());
    }
    Storagetransfer storageTransfer =
        new Storagetransfer.Builder(
                Utils.getDefaultTransport(),
                Utils.getDefaultJsonFactory(),
                new HttpCredentialsAdapter(credential))
            .build();

    // Get transfer job and check latest operation
    TransferJob transferJob = storageTransfer.transferJobs().get(jobName, projectId).execute();
    String latestOperationName = transferJob.getLatestOperationName();

    if (latestOperationName != null) {
      Operation latestOperation =
          storageTransfer.transferOperations().get(latestOperationName).execute();
      System.out.println("The latest operation for transfer job " + jobName + " is:");
      System.out.println(latestOperation.toPrettyString());
    } else {
      System.out.println(
          "Transfer job "
              + jobName
              + " does not have an operation scheduled yet,"
              + " try again once the job starts running.");
    }
  }
}
Cloud Client Library
import com.google.longrunning.Operation;
import com.google.storagetransfer.v1.proto.StorageTransferServiceClient;
import com.google.storagetransfer.v1.proto.TransferProto.GetTransferJobRequest;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferOperation;
import java.io.IOException;

public class CheckLatestTransferOperation {

  // Gets the requested transfer job and checks its latest operation
  public static void checkLatestTransferOperation(String projectId, String jobName)
      throws IOException {

    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // The name of the job to check
    // String jobName = "myJob/1234567890";

    StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();

    // Get transfer job and check latest operation
    TransferJob transferJob =
        storageTransfer.getTransferJob(
            GetTransferJobRequest.newBuilder().setJobName(jobName).setProjectId(projectId).build());
    String latestOperationName = transferJob.getLatestOperationName();

    if (!latestOperationName.isEmpty()) {
      Operation operation = storageTransfer.getOperationsClient().getOperation(latestOperationName);
      TransferOperation latestOperation =
          TransferOperation.parseFrom(operation.getMetadata().getValue());

      System.out.println("The latest operation for transfer job " + jobName + " is:");
      System.out.println(latestOperation.toString());
    } else {
      System.out.println(
          "Transfer job "
              + jobName
              + " hasn't run yet,"
              + " try again once the job starts running.");
    }
  }
}
Python
Updating dependencies
To use the new library, add a dependency on google-cloud-storage-transfer.
This will be used instead of the discovery client from
google-api-python-client.
pip install --upgrade google-cloud-storage-transfer
Client instantiation
Use the storage_transfer module instead of googleapiclient.discovery.
API Client Library
"""A sample for creating a Storage Transfer Service client."""
importgoogleapiclient.discovery
defcreate_transfer_client():
return googleapiclient.discovery.build("storagetransfer", "v1")
Cloud Client Library
"""A sample for creating a Storage Transfer Service client."""
fromgoogle.cloudimport storage_transfer
defcreate_transfer_client():
return storage_transfer.StorageTransferServiceClient ()
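By default the client uses Application Default Credentials. If you need to supply credentials explicitly, the constructor accepts a credentials argument; a minimal sketch, assuming a service account key file at key_path:

from google.cloud import storage_transfer
from google.oauth2 import service_account


def create_transfer_client_from_key(key_path: str):
    # Load explicit service account credentials instead of relying on ADC.
    credentials = service_account.Credentials.from_service_account_file(key_path)
    return storage_transfer.StorageTransferServiceClient(credentials=credentials)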
Sample comparisons
To illustrate the differences between the two libraries, here are the old API Client Library samples side-by-side with their equivalents in the Cloud Client Library.
Transfer from Amazon S3
API Client Library
import json

import googleapiclient.discovery


def main(
    description,
    project_id,
    start_date,
    start_time,
    source_bucket,
    access_key_id,
    secret_access_key,
    sink_bucket,
):
    """Create a one-time transfer from Amazon S3 to Google Cloud Storage."""
    storagetransfer = googleapiclient.discovery.build("storagetransfer", "v1")

    # Edit this template with desired parameters.
    transfer_job = {
        "description": description,
        "status": "ENABLED",
        "projectId": project_id,
        "schedule": {
            "scheduleStartDate": {
                "day": start_date.day,
                "month": start_date.month,
                "year": start_date.year,
            },
            "scheduleEndDate": {
                "day": start_date.day,
                "month": start_date.month,
                "year": start_date.year,
            },
            "startTimeOfDay": {
                "hours": start_time.hour,
                "minutes": start_time.minute,
                "seconds": start_time.second,
            },
        },
        "transferSpec": {
            "awsS3DataSource": {
                "bucketName": source_bucket,
                "awsAccessKey": {
                    "accessKeyId": access_key_id,
                    "secretAccessKey": secret_access_key,
                },
            },
            "gcsDataSink": {"bucketName": sink_bucket},
        },
    }

    result = storagetransfer.transferJobs().create(body=transfer_job).execute()
    print("Returned transferJob: {}".format(json.dumps(result, indent=4)))
Cloud Client Library
from datetime import datetime

from google.cloud import storage_transfer


def create_one_time_aws_transfer(
    project_id: str,
    description: str,
    source_bucket: str,
    aws_access_key_id: str,
    aws_secret_access_key: str,
    sink_bucket: str,
):
    """Creates a one-time transfer job from Amazon S3 to Google Cloud
    Storage."""

    client = storage_transfer.StorageTransferServiceClient()

    # The ID of the Google Cloud Platform Project that owns the job
    # project_id = 'my-project-id'

    # A useful description for your transfer job
    # description = 'My transfer job'

    # AWS S3 source bucket name
    # source_bucket = 'my-s3-source-bucket'

    # AWS Access Key ID
    # aws_access_key_id = 'AKIA...'

    # AWS Secret Access Key
    # aws_secret_access_key = 'HEAoMK2.../...ku8'

    # Google Cloud Storage destination bucket name
    # sink_bucket = 'my-gcs-destination-bucket'

    now = datetime.utcnow()
    # Setting the start date and the end date as
    # the same time creates a one-time transfer
    one_time_schedule = {"day": now.day, "month": now.month, "year": now.year}

    transfer_job_request = storage_transfer.CreateTransferJobRequest(
        {
            "transfer_job": {
                "project_id": project_id,
                "description": description,
                "status": storage_transfer.TransferJob.Status.ENABLED,
                "schedule": {
                    "schedule_start_date": one_time_schedule,
                    "schedule_end_date": one_time_schedule,
                },
                "transfer_spec": {
                    "aws_s3_data_source": {
                        "bucket_name": source_bucket,
                        "aws_access_key": {
                            "access_key_id": aws_access_key_id,
                            "secret_access_key": aws_secret_access_key,
                        },
                    },
                    "gcs_data_sink": {
                        "bucket_name": sink_bucket,
                    },
                },
            }
        }
    )

    result = client.create_transfer_job(transfer_job_request)
    print(f"Created transferJob: {result.name}")
Transfer to nearline
API Client Library
import json

import googleapiclient.discovery


def main(description, project_id, start_date, start_time, source_bucket, sink_bucket):
    """Create a daily transfer from Standard to Nearline Storage class."""
    storagetransfer = googleapiclient.discovery.build("storagetransfer", "v1")

    # Edit this template with desired parameters.
    transfer_job = {
        "description": description,
        "status": "ENABLED",
        "projectId": project_id,
        "schedule": {
            "scheduleStartDate": {
                "day": start_date.day,
                "month": start_date.month,
                "year": start_date.year,
            },
            "startTimeOfDay": {
                "hours": start_time.hour,
                "minutes": start_time.minute,
                "seconds": start_time.second,
            },
        },
        "transferSpec": {
            "gcsDataSource": {"bucketName": source_bucket},
            "gcsDataSink": {"bucketName": sink_bucket},
            "objectConditions": {
                "minTimeElapsedSinceLastModification": "2592000s"  # 30 days
            },
            "transferOptions": {"deleteObjectsFromSourceAfterTransfer": "true"},
        },
    }

    result = storagetransfer.transferJobs().create(body=transfer_job).execute()
    print("Returned transferJob: {}".format(json.dumps(result, indent=4)))
Cloud Client Library
Note the import of google.protobuf.duration_pb2.Duration.
from datetime import datetime

from google.cloud import storage_transfer
from google.protobuf.duration_pb2 import Duration


def create_daily_nearline_30_day_migration(
    project_id: str,
    description: str,
    source_bucket: str,
    sink_bucket: str,
    start_date: datetime,
):
    """Create a daily migration from a GCS bucket to a Nearline GCS bucket
    for objects untouched for 30 days."""

    client = storage_transfer.StorageTransferServiceClient()

    # The ID of the Google Cloud Platform Project that owns the job
    # project_id = 'my-project-id'

    # A useful description for your transfer job
    # description = 'My transfer job'

    # Google Cloud Storage source bucket name
    # source_bucket = 'my-gcs-source-bucket'

    # Google Cloud Storage destination bucket name
    # sink_bucket = 'my-gcs-destination-bucket'

    transfer_job_request = storage_transfer.CreateTransferJobRequest(
        {
            "transfer_job": {
                "project_id": project_id,
                "description": description,
                "status": storage_transfer.TransferJob.Status.ENABLED,
                "schedule": {
                    "schedule_start_date": {
                        "day": start_date.day,
                        "month": start_date.month,
                        "year": start_date.year,
                    }
                },
                "transfer_spec": {
                    "gcs_data_source": {
                        "bucket_name": source_bucket,
                    },
                    "gcs_data_sink": {
                        "bucket_name": sink_bucket,
                    },
                    "object_conditions": {
                        "min_time_elapsed_since_last_modification": Duration(
                            seconds=2592000  # 30 days
                        )
                    },
                    "transfer_options": {
                        "delete_objects_from_source_after_transfer": True
                    },
                },
            }
        }
    )

    result = client.create_transfer_job(transfer_job_request)
    print(f"Created transferJob: {result.name}")
Check latest transfer operation
API Client Library
"""Command-line sample that checks the latest operation of a transfer.
This sample is used on this page:
https://cloud.google.com/storage/transfer/create-transfer
For more information, see README.md.
"""
importargparse
importjson
importgoogleapiclient.discovery
defcheck_latest_transfer_operation(project_id, job_name):
"""Check the latest transfer operation associated with a transfer job."""
storagetransfer = googleapiclient.discovery.build("storagetransfer", "v1")
transferJob = (
storagetransfer.transferJobs()
.get(projectId=project_id, jobName=job_name)
.execute()
)
latestOperationName = transferJob.get("latestOperationName")
if latestOperationName:
result = (
storagetransfer.transferOperations().get(name=latestOperationName).execute()
)
print(
"The latest operation for job"
+ job_name
+ " is: {}".format(json.dumps(result, indent=4, sort_keys=True))
)
else:
print(
"Transfer job "
+ job_name
+ " does not have an operation scheduled yet, "
+ "try again once the job starts running."
)
if __name__ == "__main__":
parser = argparse.ArgumentParser(
description=__doc__, formatter_class=argparse.RawDescriptionHelpFormatter
)
parser.add_argument("project_id", help="Your Google Cloud project ID.")
parser.add_argument("job_name", help="Your job name.")
args = parser.parse_args()
check_latest_transfer_operation(args.project_id, args.job_name)Cloud Client Library
Note the use of storage_transfer.TransferOperation.deserialize.
from google.cloud import storage_transfer


def check_latest_transfer_operation(project_id: str, job_name: str):
    """Checks the latest transfer operation for a given transfer job."""

    client = storage_transfer.StorageTransferServiceClient()

    # The ID of the Google Cloud Platform Project that owns the job
    # project_id = 'my-project-id'

    # Storage Transfer Service job name
    # job_name = 'transferJobs/1234567890'

    transfer_job = client.get_transfer_job(
        {
            "project_id": project_id,
            "job_name": job_name,
        }
    )

    if transfer_job.latest_operation_name:
        response = client.transport.operations_client.get_operation(
            transfer_job.latest_operation_name
        )
        operation = storage_transfer.TransferOperation.deserialize(
            response.metadata.value
        )

        print(f"Latest transfer operation for `{job_name}`: {operation}")
    else:
        print(f"Transfer job {job_name} has not run yet.")