BigQuery API Client Libraries
This page shows how to get started with the Cloud Client Libraries for the BigQuery API. Client libraries make it easier to access Google Cloud APIs from a supported language. Although you can use Google Cloud APIs directly by making raw requests to the server, client libraries provide simplifications that significantly reduce the amount of code you need to write.
Read more about the Cloud Client Libraries and the older Google API Client Libraries in Client libraries explained.
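To illustrate what the libraries save you, the sketch below hand-builds the HTTP pieces for a call to the documented `jobs.query` REST method (`https://bigquery.googleapis.com/bigquery/v2/projects/{projectId}/queries`), using only the Python standard library. The helper name `build_query_request` and the placeholder project ID and token are hypothetical; a client library assembles, sends, retries, and pages all of this for you.

```python
import json

def build_query_request(project_id, sql, access_token):
    """Build the raw HTTP pieces for a BigQuery jobs.query REST call.

    A client library does this (plus sending, retrying, and paging)
    for you; here it is done by hand for comparison.
    """
    url = (
        "https://bigquery.googleapis.com/bigquery/v2/"
        f"projects/{project_id}/queries"
    )
    headers = {
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"query": sql, "useLegacySql": False})
    return url, headers, body

url, headers, body = build_query_request(
    "your-project-id", "SELECT 1", "ACCESS_TOKEN")
print(url)
```

With a client library, the equivalent is a one-line client construction plus a query call, as the samples later on this page show.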
Install the client library
C#
Install-Package Google.Cloud.BigQuery.V2 -Pre
For more information, see Setting Up a C# Development Environment.
Go
go get cloud.google.com/go/bigquery
For more information, see Setting Up a Go Development Environment.
Java
If you are using Maven, add the following to your pom.xml file. For more information about BOMs, see The Google Cloud Platform Libraries BOM.
<!-- Using libraries-bom to manage versions.
See https://github.com/GoogleCloudPlatform/cloud-opensource-java/wiki/The-Google-Cloud-Platform-Libraries-BOM -->
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.google.cloud</groupId>
<artifactId>libraries-bom</artifactId>
<version>26.62.0</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>com.google.cloud</groupId>
<artifactId>google-cloud-bigquery</artifactId>
</dependency>
</dependencies>
If you are using Gradle, add the following to your dependencies:
implementation platform('com.google.cloud:libraries-bom:26.45.0')
implementation 'com.google.cloud:google-cloud-bigquery'

If you are using sbt, add the following to your dependencies:

libraryDependencies += "com.google.cloud" % "google-cloud-bigquery" % "2.42.2"

If you're using Visual Studio Code, IntelliJ, or Eclipse, you can add client libraries to your project using the following IDE plugins:
The plugins provide additional functionality, such as key management for service accounts. Refer to each plugin's documentation for details.
For more information, see Setting Up a Java Development Environment.
Node.js
npm install @google-cloud/bigquery
For more information, see Setting Up a Node.js Development Environment.
PHP
composer require google/cloud-bigquery
For more information, see Using PHP on Google Cloud.
Python
pip install --upgrade google-cloud-bigquery
For more information, see Setting Up a Python Development Environment.
Ruby
gem install google-cloud-bigquery
For more information, see Setting Up a Ruby Development Environment.
Set up authentication
To authenticate calls to Google Cloud APIs, client libraries support Application Default Credentials (ADC); the libraries look for credentials in a set of defined locations and use those credentials to authenticate requests to the API. With ADC, you can make credentials available to your application in a variety of environments, such as local development or production, without needing to modify your application code.

For production environments, the way you set up ADC depends on the service and context. For more information, see Set up Application Default Credentials.
For a local development environment, you can set up ADC with the credentials that are associated with your Google Account:
- Install the Google Cloud CLI. After installation, initialize the Google Cloud CLI by running the following command:

  gcloud init
If you're using an external identity provider (IdP), you must first sign in to the gcloud CLI with your federated identity.
- If you're using a local shell, then create local authentication credentials for your user account:

  gcloud auth application-default login
You don't need to do this if you're using Cloud Shell.
If an authentication error is returned, and you are using an external identity provider (IdP), confirm that you have signed in to the gcloud CLI with your federated identity.
A sign-in screen appears. After you sign in, your credentials are stored in the local credential file used by ADC.
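The local lookup ADC performs can be sketched roughly as follows. This is a simplified illustration, not the library's actual implementation: the real resolver (`google.auth.default()` in Python) also falls back to the metadata server when running on Google Cloud, and the well-known file path shown is the Linux/macOS location (`%APPDATA%\gcloud` on Windows). The helper name `adc_candidate_files` is hypothetical.

```python
import os
from pathlib import Path

def adc_candidate_files():
    """Sketch of where ADC looks for credentials on a local machine.

    Order: the GOOGLE_APPLICATION_CREDENTIALS environment variable first,
    then the well-known file written by `gcloud auth application-default
    login`. (On Google Cloud, the metadata server is the final fallback.)
    """
    candidates = []
    env_path = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    if env_path:
        candidates.append(Path(env_path))
    candidates.append(
        Path.home() / ".config" / "gcloud" / "application_default_credentials.json"
    )
    return candidates
```

Because the lookup happens inside the client library, the samples below construct clients without passing any credentials explicitly.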
Use the client library
The following example shows how to initialize a client and query a BigQuery public dataset.
C#
using Google.Cloud.BigQuery.V2;
using System;

public class BigQueryQuery
{
    public void Query(
        string projectId = "your-project-id"
    )
    {
        BigQueryClient client = BigQueryClient.Create(projectId);
        string query = @"
            SELECT name FROM `bigquery-public-data.usa_names.usa_1910_2013`
            WHERE state = 'TX'
            LIMIT 100";
        BigQueryJob job = client.CreateQueryJob(
            sql: query,
            parameters: null,
            options: new QueryOptions { UseQueryCache = false });

        // Wait for the job to complete.
        job = job.PollUntilCompleted().ThrowOnAnyError();

        // Display the results.
        foreach (BigQueryRow row in client.GetQueryResults(job.Reference))
        {
            Console.WriteLine($"{row["name"]}");
        }
    }
}

Go
import (
	"context"
	"fmt"
	"io"

	"cloud.google.com/go/bigquery"
	"google.golang.org/api/iterator"
)

// queryBasic demonstrates issuing a query and reading results.
func queryBasic(w io.Writer, projectID string) error {
	// projectID := "my-project-id"
	ctx := context.Background()
	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("bigquery.NewClient: %v", err)
	}
	defer client.Close()

	q := client.Query(
		"SELECT name FROM `bigquery-public-data.usa_names.usa_1910_2013` " +
			"WHERE state = \"TX\" " +
			"LIMIT 100")
	// Location must match that of the dataset(s) referenced in the query.
	q.Location = "US"
	// Run the query and print results when the query job is completed.
	job, err := q.Run(ctx)
	if err != nil {
		return err
	}
	status, err := job.Wait(ctx)
	if err != nil {
		return err
	}
	if err := status.Err(); err != nil {
		return err
	}
	it, err := job.Read(ctx)
	if err != nil {
		return err
	}
	for {
		var row []bigquery.Value
		err := it.Next(&row)
		if err == iterator.Done {
			break
		}
		if err != nil {
			return err
		}
		fmt.Fprintln(w, row)
	}
	return nil
}
Java
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.FieldValueList;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobId;
import com.google.cloud.bigquery.JobInfo;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.TableResult;

public class SimpleApp {
  public static void main(String... args) throws Exception {
    // TODO(developer): Replace these variables before running the app.
    String projectId = "MY_PROJECT_ID";
    simpleApp(projectId);
  }

  public static void simpleApp(String projectId) {
    try {
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
      QueryJobConfiguration queryConfig =
          QueryJobConfiguration.newBuilder(
                  "SELECT CONCAT('https://stackoverflow.com/questions/', "
                      + "CAST(id as STRING)) as url, view_count "
                      + "FROM `bigquery-public-data.stackoverflow.posts_questions` "
                      + "WHERE tags like '%google-bigquery%' "
                      + "ORDER BY view_count DESC "
                      + "LIMIT 10")
              // Use standard SQL syntax for queries.
              // See: https://cloud.google.com/bigquery/sql-reference/
              .setUseLegacySql(false)
              .build();

      JobId jobId = JobId.newBuilder().setProject(projectId).build();
      Job queryJob = bigquery.create(JobInfo.newBuilder(queryConfig).setJobId(jobId).build());

      // Wait for the query to complete.
      queryJob = queryJob.waitFor();

      // Check for errors.
      if (queryJob == null) {
        throw new RuntimeException("Job no longer exists");
      } else if (queryJob.getStatus().getExecutionErrors() != null
          && queryJob.getStatus().getExecutionErrors().size() > 0) {
        // TODO(developer): Handle errors here. An error here does not necessarily mean that the
        // job has completed or was unsuccessful.
        // For more details: https://cloud.google.com/bigquery/troubleshooting-errors
        throw new RuntimeException("An unhandled error has occurred");
      }

      // Get the results.
      TableResult result = queryJob.getQueryResults();

      // Print all pages of the results.
      for (FieldValueList row : result.iterateAll()) {
        // String type
        String url = row.get("url").getStringValue();
        String viewCount = row.get("view_count").getStringValue();
        System.out.printf("%s : %s views\n", url, viewCount);
      }
    } catch (BigQueryException | InterruptedException e) {
      System.out.println("Simple App failed due to error: \n" + e.toString());
    }
  }
}

Node.js
// Import the Google Cloud client library using default credentials
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function query() {
  // Queries the U.S. given names dataset for the state of Texas.
  const query = `SELECT name
    FROM \`bigquery-public-data.usa_names.usa_1910_2013\`
    WHERE state = 'TX'
    LIMIT 100`;

  // For all options, see https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/query
  const options = {
    query: query,
    // Location must match that of the dataset(s) referenced in the query.
    location: 'US',
  };

  // Run the query as a job
  const [job] = await bigquery.createQueryJob(options);
  console.log(`Job ${job.id} started.`);

  // Wait for the query to finish
  const [rows] = await job.getQueryResults();

  // Print the results
  console.log('Rows:');
  rows.forEach(row => console.log(row));
}

PHP
use Google\Cloud\BigQuery\BigQueryClient;
use Google\Cloud\Core\ExponentialBackoff;
/** Uncomment and populate these variables in your code */
// $projectId = 'The Google project ID';
// $query = 'SELECT id, view_count FROM `bigquery-public-data.stackoverflow.posts_questions`';
$bigQuery = new BigQueryClient([
'projectId' => $projectId,
]);
$jobConfig = $bigQuery->query($query);
$job = $bigQuery->startQuery($jobConfig);
$backoff = new ExponentialBackoff(10);
$backoff->execute(function () use ($job) {
print('Waiting for job to complete' . PHP_EOL);
$job->reload();
if (!$job->isComplete()) {
throw new Exception('Job has not yet completed', 500);
}
});
$queryResults = $job->queryResults();
$i = 0;
foreach ($queryResults as $row) {
printf('--- Row %s ---' . PHP_EOL, ++$i);
foreach ($row as $column => $value) {
printf('%s: %s' . PHP_EOL, $column, json_encode($value));
}
}
printf('Found %s row(s)' . PHP_EOL, $i);

Python
from google.cloud import bigquery

# Construct a BigQuery client object.
client = bigquery.Client()

query = """
    SELECT name, SUM(number) as total_people
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    WHERE state = 'TX'
    GROUP BY name, state
    ORDER BY total_people DESC
    LIMIT 20
"""
rows = client.query_and_wait(query)  # Make an API request.

print("The query data:")
for row in rows:
    # Row values can be accessed by field name or index.
    print("name={}, count={}".format(row[0], row["total_people"]))

Ruby
require "google/cloud/bigquery"

def query
  bigquery = Google::Cloud::Bigquery.new
  sql = "SELECT name FROM `bigquery-public-data.usa_names.usa_1910_2013` " \
        "WHERE state = 'TX' " \
        "LIMIT 100"

  # Location must match that of the dataset(s) referenced in the query.
  results = bigquery.query sql do |config|
    config.location = "US"
  end

  results.each do |row|
    puts row.inspect
  end
end

Additional resources
C#
The following list contains links to more resources related to the client library for C#:
Go
The following list contains links to more resources related to the client library for Go:
Java
The following list contains links to more resources related to the client library for Java:
Node.js
The following list contains links to more resources related to the client library for Node.js:
PHP
The following list contains links to more resources related to the client library for PHP:
Python
The following list contains links to more resources related to the client library for Python:
Ruby
The following list contains links to more resources related to the client library for Ruby:
Third-party BigQuery API client libraries
In addition to the Google-supported client libraries listed above, a set of third-party libraries is available.
| Language | Library |
|---|---|
| Python | pandas-gbq (usage guide), ibis (tutorial) |
| R | bigrquery, BigQueryR |
| Scala | spark-bigquery-connector |
What's next?
- View available BigQuery code samples.
- Query a public dataset with the BigQuery API client libraries.
- Visualize BigQuery API public data using a Jupyter notebook.
Try it for yourself
If you're new to Google Cloud, create an account to evaluate how BigQuery performs in real-world scenarios. New customers also get 300ドル in free credits to run, test, and deploy workloads.
Try BigQuery free