BigQuery Data Transfer Service API Client Libraries
This page shows how to get started with the Cloud Client Libraries for the BigQuery Data Transfer API. Client libraries make it easier to access Google Cloud APIs from a supported language. Although you can use Google Cloud APIs directly by making raw requests to the server, client libraries provide simplifications that significantly reduce the amount of code you need to write.
Read more about the Cloud Client Libraries and the older Google API Client Libraries in Client libraries explained.
Install the client library
C#
Install-Package Google.Cloud.BigQuery.DataTransfer.V1 -Pre
For more information, see Setting Up a C# Development Environment.
Go
go get cloud.google.com/go/bigquery/datatransfer/apiv1
For more information, see Setting Up a Go Development Environment.
Java
If you are using Maven, add the following to your pom.xml file. For more information about BOMs, see The Google Cloud Platform Libraries BOM.
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.cloud</groupId>
      <artifactId>libraries-bom</artifactId>
      <version>26.71.0</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>

<dependencies>
  <dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-bigquerydatatransfer</artifactId>
  </dependency>
</dependencies>

If you are using Gradle, add the following to your dependencies:

implementation 'com.google.cloud:google-cloud-bigquerydatatransfer:2.78.0'

If you are using sbt, add the following to your dependencies:

libraryDependencies += "com.google.cloud" % "google-cloud-bigquerydatatransfer" % "2.78.0"

If you're using Visual Studio Code or IntelliJ, you can add client libraries to your project using the following IDE plugins:
The plugins provide additional functionality, such as key management for service accounts. Refer to each plugin's documentation for details.
For more information, see Setting Up a Java Development Environment.
Node.js
npm install @google-cloud/bigquery-data-transfer
For more information, see Setting Up a Node.js Development Environment.
PHP
composer require google/cloud-bigquerydatatransfer
For more information, see Using PHP on Google Cloud.
Python
pip install --upgrade google-cloud-bigquery-datatransfer
For more information, see Setting Up a Python Development Environment.
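To verify the installation, you can import the package in a Python shell and print its version. This is a minimal sketch; the __version__ attribute is an assumption based on the convention that google-cloud Python packages expose a version string.

# Minimal install check (assumes the package exposes __version__, as
# google-cloud Python libraries conventionally do).
from google.cloud import bigquery_datatransfer

print(bigquery_datatransfer.__version__)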
Ruby
gem install google-cloud-bigquery-data_transfer
For more information, see Setting Up a Ruby Development Environment.
Set up authentication
To authenticate calls to Google Cloud APIs, client libraries support Application Default Credentials (ADC); the libraries look for credentials in a set of defined locations and use those credentials to authenticate requests to the API. With ADC, you can make credentials available to your application in a variety of environments, such as local development or production, without needing to modify your application code.

For production environments, the way you set up ADC depends on the service and context. For more information, see Set up Application Default Credentials.
For a local development environment, you can set up ADC with the credentials that are associated with your Google Account:
- Install the Google Cloud CLI. After installation, initialize the Google Cloud CLI by running the following command:

  gcloud init

  If you're using an external identity provider (IdP), you must first sign in to the gcloud CLI with your federated identity.
- If you're using a local shell, then create local authentication credentials for your user account:

  gcloud auth application-default login

  You don't need to do this if you're using Cloud Shell.

  If an authentication error is returned, and you are using an external identity provider (IdP), confirm that you have signed in to the gcloud CLI with your federated identity.

  A sign-in screen appears. After you sign in, your credentials are stored in the local credential file used by ADC.
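Once ADC is configured, the client libraries find and use those credentials on their own; you don't pass anything credential-related in code. The following is a minimal sketch in Python (assuming the google-cloud-bigquery-datatransfer package from the installation step above) showing that constructing the client with no arguments is enough:

from google.cloud import bigquery_datatransfer

# With ADC configured (for example, via `gcloud auth application-default login`),
# the client constructor locates the credentials itself; no key file or token
# is passed here.
client = bigquery_datatransfer.DataTransferServiceClient()
print("Client created:", type(client).__name__)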
Use the client library
The following example shows how to use the client library.
C#
using Google.Api.Gax.ResourceNames;
using Google.Cloud.BigQuery.DataTransfer.V1;
using System;

namespace GoogleCloudSamples
{
    public class QuickStart
    {
        public static void Main(string[] args)
        {
            // Instantiates a client
            DataTransferServiceClient client = DataTransferServiceClient.Create();

            // Your Google Cloud Platform project ID
            string projectId = "YOUR-PROJECT-ID";

            ProjectName project = ProjectName.FromProject(projectId);
            var sources = client.ListDataSources(project);
            Console.WriteLine("Supported Data Sources:");
            foreach (DataSource source in sources)
            {
                Console.WriteLine(
                    $"{source.DataSourceId}: " +
                    $"{source.DisplayName} ({source.Description})");
            }
        }
    }
}

Go
// Sample bigquery-quickstart lists the data sources supported by the
// BigQuery Data Transfer Service.
package main

import (
    "fmt"
    "log"

    "golang.org/x/net/context"
    "google.golang.org/api/iterator"

    // Imports the BigQuery Data Transfer client package.
    datatransfer "cloud.google.com/go/bigquery/datatransfer/apiv1"
    datatransferpb "google.golang.org/genproto/googleapis/cloud/bigquery/datatransfer/v1"
)

func main() {
    ctx := context.Background()

    // Sets your Google Cloud Platform project ID.
    projectID := "YOUR_PROJECT_ID"

    // Creates a client.
    client, err := datatransfer.NewClient(ctx)
    if err != nil {
        log.Fatalf("Failed to create client: %v", err)
    }

    req := &datatransferpb.ListDataSourcesRequest{
        Parent: fmt.Sprintf("projects/%s", projectID),
    }
    it := client.ListDataSources(ctx, req)
    fmt.Println("Supported Data Sources:")
    for {
        ds, err := it.Next()
        if err == iterator.Done {
            break
        }
        if err != nil {
            log.Fatalf("Failed to list sources: %v", err)
        }
        fmt.Println(ds.DisplayName)
        fmt.Println("\tID: ", ds.DataSourceId)
        fmt.Println("\tFull path: ", ds.Name)
        fmt.Println("\tDescription: ", ds.Description)
    }
}
Java
// Imports the Google Cloud client library
importcom.google.cloud.bigquery.datatransfer.v1.DataSource ;
importcom.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient ;
importcom.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient.ListDataSourcesPagedResponse ;
importcom.google.cloud.bigquery.datatransfer.v1.ListDataSourcesRequest ;
publicclass QuickstartSample{
/** List available data sources for the BigQuery Data Transfer service. */
publicstaticvoidmain(String...args)throwsException{
// Sets your Google Cloud Platform project ID.
// String projectId = "YOUR_PROJECT_ID";
StringprojectId=args[0];
// Instantiate a client. If you don't specify credentials when constructing a client, the
// client library will look for credentials in the environment, such as the
// GOOGLE_APPLICATION_CREDENTIALS environment variable.
try(DataTransferServiceClient client=DataTransferServiceClient .create()){
// Request the list of available data sources.
Stringparent=String.format("projects/%s",projectId);
ListDataSourcesRequest request=
ListDataSourcesRequest .newBuilder().setParent(parent).build();
ListDataSourcesPagedResponse response=client.listDataSources(request);
// Print the results.
System.out.println("Supported Data Sources:");
for(DataSource dataSource:response.iterateAll()){
System.out.println(dataSource.getDisplayName());
System.out.printf("\tID: %s%n",dataSource.getDataSourceId());
System.out.printf("\tFull path: %s%n",dataSource.getName());
System.out.printf("\tDescription: %s%n",dataSource.getDescription());
}
}
}
}Node.js
const bigqueryDataTransfer = require('@google-cloud/bigquery-data-transfer');
const client = new bigqueryDataTransfer.v1.DataTransferServiceClient();

async function quickstart() {
  const projectId = await client.getProjectId();

  // Iterate over all elements.
  const formattedParent = client.projectPath(projectId, 'us-central1');
  let nextRequest = {parent: formattedParent};
  const options = {autoPaginate: false};
  console.log('Data sources:');
  do {
    // Fetch the next page.
    const responses = await client.listDataSources(nextRequest, options);
    // The actual resources in a response.
    const resources = responses[0];
    // The next request if the response shows that there are more responses.
    nextRequest = responses[1];
    // The actual response object, if necessary.
    // const rawResponse = responses[2];
    resources.forEach(resource => {
      console.log(`  ${resource.name}`);
    });
  } while (nextRequest);

  console.log('\n\n');
  console.log('Sources via stream:');
  client
    .listDataSourcesStream({parent: formattedParent})
    .on('data', element => {
      console.log(`  ${element.name}`);
    });
}
quickstart();

PHP
# Includes the autoloader for libraries installed with composer
require __DIR__ . '/vendor/autoload.php';
# Imports the Google Cloud client library
use Google\Cloud\BigQuery\DataTransfer\V1\DataTransferServiceClient;
# Instantiates a client
$bqdtsClient = new DataTransferServiceClient();
# Your Google Cloud Platform project ID
$projectId = 'YOUR_PROJECT_ID';
$parent = sprintf('projects/%s/locations/us', $projectId);
try {
echo 'Supported Data Sources:', PHP_EOL;
$pagedResponse = $bqdtsClient->listDataSources($parent);
foreach ($pagedResponse->iterateAllElements() as $dataSource) {
echo 'Data source: ', $dataSource->getDisplayName(), PHP_EOL;
echo 'ID: ', $dataSource->getDataSourceId(), PHP_EOL;
echo 'Full path: ', $dataSource->getName(), PHP_EOL;
echo 'Description: ', $dataSource->getDescription(), PHP_EOL;
}
} finally {
$bqdtsClient->close();
}

Python
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()

# TODO: Update to your project ID.
project_id = "my-project"

# Get the full path to your project.
parent = client.common_project_path(project_id)

print("Supported Data Sources:")

# Iterate over all possible data sources.
for data_source in client.list_data_sources(parent=parent):
    print("{}:".format(data_source.display_name))
    print("\tID: {}".format(data_source.data_source_id))
    print("\tFull path: {}".format(data_source.name))
    print("\tDescription: {}".format(data_source.description))

Ruby
# Imports the Google Cloud client library
require "google/cloud/bigquery/data_transfer"

# Your Google Cloud Platform project ID
# project_id = "YOUR_PROJECT_ID"

# Instantiate a client
data_transfer = Google::Cloud::Bigquery::DataTransfer.data_transfer_service

# Get the full path to your project.
project_path = data_transfer.project_path project: project_id

puts "Supported Data Sources:"

# Iterate over all possible data sources.
data_transfer.list_data_sources(parent: project_path).each do |data_source|
  puts "Data source: #{data_source.display_name}"
  puts "ID: #{data_source.data_source_id}"
  puts "Full path: #{data_source.name}"
  puts "Description: #{data_source.description}"
end

Additional resources
C#
The following list contains links to more resources related to the client library for C#:
Go
The following list contains links to more resources related to the client library for Go:
Java
The following list contains links to more resources related to the client library for Java:
Node.js
The following list contains links to more resources related to the client library for Node.js:
PHP
The following list contains links to more resources related to the client library for PHP:
Python
The following list contains links to more resources related to the client library for Python:
Ruby
The following list contains links to more resources related to the client library for Ruby: