Getting started with Spanner in Node.js
Objectives
This tutorial walks you through the following steps using the Spanner client library for Node.js:
- Create a Spanner instance and database.
- Write, read, and execute SQL queries on data in the database.
- Update the database schema.
- Update data using a read-write transaction.
- Add a secondary index to the database.
- Use the index to read and execute SQL queries on data.
- Retrieve data using a read-only transaction.
Costs
This tutorial uses Spanner, which is a billable component of Google Cloud. For information on the cost of using Spanner, see Pricing.
Before you begin
Complete the steps described in Set up, which cover creating and setting a default Google Cloud project, enabling billing, enabling the Cloud Spanner API, and setting up OAuth 2.0 to get authentication credentials to use the Cloud Spanner API.
In particular, make sure that you run gcloud auth application-default login to set up your local development environment with authentication credentials.
Prepare your local Node.js environment
Follow the steps in Set Up a Node.js Development Environment.
Clone the sample app repository to your local machine:
git clone https://github.com/googleapis/nodejs-spanner
Alternatively, you can download the sample as a zip file and extract it.
Change to the directory that contains the Spanner sample code:
cd samples/
Install dependencies using npm:
npm install
Create an instance
When you first use Spanner, you must create an instance, which is an allocation of resources that are used by Spanner databases. When you create an instance, you choose an instance configuration, which determines where your data is stored, and also the number of nodes to use, which determines the amount of serving and storage resources in your instance.
See Create an instance
to learn how to create a Spanner instance using any of the
following methods. You can name your instance test-instance to use it with
other topics in this document that reference an instance named test-instance.
- The Google Cloud CLI
- The Google Cloud console
- A client library (C++, C#, Go, Java, Node.js, PHP, Python, or Ruby)
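For example, you can create the instance with the gcloud CLI. The configuration shown here is an assumption for illustration; pick any configuration returned by `gcloud spanner instance-configs list`:

```shell
# Create a 1-node instance named test-instance.
# regional-us-central1 is an example configuration; choose any configuration
# listed by `gcloud spanner instance-configs list`.
gcloud spanner instances create test-instance \
    --config=regional-us-central1 \
    --description="Test Instance" \
    --nodes=1
```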
Look through sample files
The samples repository contains a sample that shows how to use Spanner with Node.js.
Take a look through the samples/schema.js file, which shows how to
create a database and modify a database schema. The data uses the example schema
shown in the Schema and data model
page.
Create a database
GoogleSQL
node schema.js createDatabase test-instance example-db MY_PROJECT_ID
PostgreSQL
node schema.js createPgDatabase test-instance example-db MY_PROJECT_ID
You should see:
Created database example-db on instance test-instance.
GoogleSQL
/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

const databaseAdminClient = spanner.getDatabaseAdminClient();

const createSingersTableStatement = `
  CREATE TABLE Singers (
    SingerId   INT64 NOT NULL,
    FirstName  STRING(1024),
    LastName   STRING(1024),
    SingerInfo BYTES(MAX),
    FullName   STRING(2048) AS (ARRAY_TO_STRING([FirstName, LastName], " ")) STORED,
  ) PRIMARY KEY (SingerId)`;
const createAlbumsTableStatement = `
  CREATE TABLE Albums (
    SingerId   INT64 NOT NULL,
    AlbumId    INT64 NOT NULL,
    AlbumTitle STRING(MAX)
  ) PRIMARY KEY (SingerId, AlbumId),
    INTERLEAVE IN PARENT Singers ON DELETE CASCADE`;

// Creates a new database
try {
  const [operation] = await databaseAdminClient.createDatabase({
    createStatement: 'CREATE DATABASE `' + databaseId + '`',
    extraStatements: [createSingersTableStatement, createAlbumsTableStatement],
    parent: databaseAdminClient.instancePath(projectId, instanceId),
  });

  console.log(`Waiting for creation of ${databaseId} to complete...`);
  await operation.promise();

  console.log(`Created database ${databaseId} on instance ${instanceId}.`);
} catch (err) {
  console.error('ERROR:', err);
}
PostgreSQL
/**
 * TODO(developer): Uncomment these variables before running the sample.
 */
// const instanceId = 'my-instance';
// const databaseId = 'my-database';
// const projectId = 'my-project-id';

// Imports the Google Cloud client library
const {Spanner, protos} = require('@google-cloud/spanner');

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

const databaseAdminClient = spanner.getDatabaseAdminClient();

async function createPgDatabase() {
  // Creates a PostgreSQL database. PostgreSQL create requests may not contain any additional
  // DDL statements. We need to execute these separately after the database has been created.
  const [operationCreate] = await databaseAdminClient.createDatabase({
    createStatement: 'CREATE DATABASE "' + databaseId + '"',
    parent: databaseAdminClient.instancePath(projectId, instanceId),
    databaseDialect:
      protos.google.spanner.admin.database.v1.DatabaseDialect.POSTGRESQL,
  });

  console.log(`Waiting for operation on ${databaseId} to complete...`);
  await operationCreate.promise();

  const [metadata] = await databaseAdminClient.getDatabase({
    name: databaseAdminClient.databasePath(projectId, instanceId, databaseId),
  });
  console.log(
    `Created database ${databaseId} on instance ${instanceId} with dialect ${metadata.databaseDialect}.`,
  );

  // Create a couple of tables using a separate request. We must use PostgreSQL style DDL as the
  // database has been created with the PostgreSQL dialect.
  const statements = [
    `CREATE TABLE Singers
      (SingerId   bigint NOT NULL,
       FirstName  varchar(1024),
       LastName   varchar(1024),
       SingerInfo bytea,
       FullName   character varying(2048) GENERATED ALWAYS AS (FirstName || ' ' || LastName) STORED,
       PRIMARY KEY (SingerId)
      );
      CREATE TABLE Albums
      (AlbumId    bigint NOT NULL,
       SingerId   bigint NOT NULL REFERENCES Singers (SingerId),
       AlbumTitle text,
       PRIMARY KEY (AlbumId)
      );`,
  ];
  const [operationUpdateDDL] = await databaseAdminClient.updateDatabaseDdl({
    database: databaseAdminClient.databasePath(
      projectId,
      instanceId,
      databaseId,
    ),
    statements: statements,
  });
  await operationUpdateDDL.promise();
  console.log('Updated schema');
}
createPgDatabase();

The next step is to write data to your database.
Create a database client
Before you can do reads or writes, you must create a Database:
// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

// Creates a client
const spanner = new Spanner({projectId});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

// The query to execute
const query = {
  sql: 'SELECT 1',
};

// Execute a simple SQL statement
const [rows] = await database.run(query);
console.log(`Query: ${rows.length} found.`);
rows.forEach(row => console.log(row));

You can think of a Database as a database connection: all of your interactions
with Spanner must go through a Database. Typically you create a Database
when your application starts up, then you re-use that Database to read, write,
and execute transactions. Each client uses resources in Spanner.
If you create multiple clients in the same app, you should call
Database.close()
to clean up the client's resources, including network connections, as soon as it
is no longer needed.
Read more in the Database
reference.
Write data with DML
You can insert data using Data Manipulation Language (DML) in a read-write transaction.
You use the runUpdate() method to execute a DML statement.
// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

database.runTransaction(async (err, transaction) => {
  if (err) {
    console.error(err);
    return;
  }
  try {
    const [rowCount] = await transaction.runUpdate({
      sql: `INSERT Singers (SingerId, FirstName, LastName) VALUES
        (12, 'Melissa', 'Garcia'),
        (13, 'Russell', 'Morales'),
        (14, 'Jacqueline', 'Long'),
        (15, 'Dylan', 'Shaw')`,
    });
    console.log(`${rowCount} records inserted.`);

    await transaction.commit();
  } catch (err) {
    console.error('ERROR:', err);
  } finally {
    // Close the database when finished.
    database.close();
  }
});
Run the sample using the writeUsingDml argument.
node dml.js writeUsingDml test-instance example-db MY_PROJECT_ID
You should see:
4 records inserted.
Write data with mutations
You can also insert data using mutations.
You write data using a
Table object. The
Table.insert()
method adds new rows to the table. All inserts in a single batch are applied
atomically.
This code shows how to write the data using mutations:
// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

// Instantiate Spanner table objects
const singersTable = database.table('Singers');
const albumsTable = database.table('Albums');

// Inserts rows into the Singers table
// Note: Cloud Spanner interprets Node.js numbers as FLOAT64s, so
// they must be converted to strings before being inserted as INT64s
try {
  await singersTable.insert([
    {SingerId: '1', FirstName: 'Marc', LastName: 'Richards'},
    {SingerId: '2', FirstName: 'Catalina', LastName: 'Smith'},
    {SingerId: '3', FirstName: 'Alice', LastName: 'Trentor'},
    {SingerId: '4', FirstName: 'Lea', LastName: 'Martin'},
    {SingerId: '5', FirstName: 'David', LastName: 'Lomond'},
  ]);

  await albumsTable.insert([
    {SingerId: '1', AlbumId: '1', AlbumTitle: 'Total Junk'},
    {SingerId: '1', AlbumId: '2', AlbumTitle: 'Go, Go, Go'},
    {SingerId: '2', AlbumId: '1', AlbumTitle: 'Green'},
    {SingerId: '2', AlbumId: '2', AlbumTitle: 'Forever Hold your Peace'},
    {SingerId: '2', AlbumId: '3', AlbumTitle: 'Terrified'},
  ]);

  console.log('Inserted data.');
} catch (err) {
  console.error('ERROR:', err);
} finally {
  await database.close();
}

Run the sample using the insert argument.
node crud.js insert test-instance example-db MY_PROJECT_ID
You should see:
Inserted data.
Query data using SQL
Spanner supports a SQL interface for reading data, which you can access on the command line using the Google Cloud CLI or programmatically using the Spanner client library for Node.js.
On the command line
Execute the following SQL statement to read the values of all columns from the
Albums table:
gcloud spanner databases execute-sql example-db --instance=test-instance \
    --sql='SELECT SingerId, AlbumId, AlbumTitle FROM Albums'
The result shows:
SingerId  AlbumId  AlbumTitle
1         1        Total Junk
1         2        Go, Go, Go
2         1        Green
2         2        Forever Hold Your Peace
2         3        Terrified
Use the Spanner client library for Node.js
In addition to executing a SQL statement on the command line, you can issue the same SQL statement programmatically using the Spanner client library for Node.js.
Use Database.run()
to run the SQL query.
// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

const query = {
  sql: 'SELECT SingerId, AlbumId, AlbumTitle FROM Albums',
};

// Queries rows from the Albums table
try {
  const [rows] = await database.run(query);

  rows.forEach(row => {
    const json = row.toJSON();
    console.log(
      `SingerId: ${json.SingerId}, AlbumId: ${json.AlbumId}, AlbumTitle: ${json.AlbumTitle}`,
    );
  });
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the database when finished.
  await database.close();
}

Here's how to issue the query and access the data:
node crud.js query test-instance example-db MY_PROJECT_ID
You should see the following result:
SingerId: 1, AlbumId: 1, AlbumTitle: Total Junk
SingerId: 1, AlbumId: 2, AlbumTitle: Go, Go, Go
SingerId: 2, AlbumId: 1, AlbumTitle: Green
SingerId: 2, AlbumId: 2, AlbumTitle: Forever Hold your Peace
SingerId: 2, AlbumId: 3, AlbumTitle: Terrified
Query using a SQL parameter
If your application has a frequently executed query, you can improve its performance by parameterizing it. The resulting parametric query can be cached and reused, which reduces compilation costs. For more information, see Use query parameters to speed up frequently executed queries.
Here is an example of using a parameter in the WHERE clause to
query records containing a specific value for LastName.
// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

const query = {
  sql: `SELECT SingerId, FirstName, LastName
    FROM Singers WHERE LastName = @lastName`,
  params: {
    lastName: 'Garcia',
  },
};

// Queries rows from the Singers table
try {
  const [rows] = await database.run(query);

  rows.forEach(row => {
    const json = row.toJSON();
    console.log(
      `SingerId: ${json.SingerId}, FirstName: ${json.FirstName}, LastName: ${json.LastName}`,
    );
  });
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the database when finished.
  database.close();
}

Here's how to issue the query and access the data:
node dml.js queryWithParameter test-instance example-db MY_PROJECT_ID
You should see the following result:
SingerId: 12, FirstName: Melissa, LastName: Garcia
Read data using the read API
In addition to its SQL interface, Spanner also supports a read interface.
Use Table.read()
to read rows from the database. Use a KeySet object to define a collection of
keys and key ranges to read.
Here's how to read the data:
// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

// Reads rows from the Albums table
const albumsTable = database.table('Albums');

const query = {
  columns: ['SingerId', 'AlbumId', 'AlbumTitle'],
  keySet: {
    all: true,
  },
};

try {
  const [rows] = await albumsTable.read(query);

  rows.forEach(row => {
    const json = row.toJSON();
    console.log(
      `SingerId: ${json.SingerId}, AlbumId: ${json.AlbumId}, AlbumTitle: ${json.AlbumTitle}`,
    );
  });
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the database when finished.
  await database.close();
}

Run the sample using the read argument.
node crud.js read test-instance example-db MY_PROJECT_ID
You should see output similar to:
SingerId: 1, AlbumId: 1, AlbumTitle: Total Junk
SingerId: 1, AlbumId: 2, AlbumTitle: Go, Go, Go
SingerId: 2, AlbumId: 1, AlbumTitle: Green
SingerId: 2, AlbumId: 2, AlbumTitle: Forever Hold your Peace
SingerId: 2, AlbumId: 3, AlbumTitle: Terrified
Update the database schema
Assume you need to add a new column called MarketingBudget to the Albums
table. Adding a new column to an existing table requires an update to your
database schema. Spanner supports schema updates to a database while the
database continues to serve traffic. Schema updates don't require taking the
database offline and they don't lock entire tables or columns; you can continue
writing data to the database during the schema update. Read more about supported
schema updates and schema change performance in
Make schema updates.
Add a column
You can add a column on the command line using the Google Cloud CLI or programmatically using the Spanner client library for Node.js.
On the command line
Use the following ALTER TABLE command to
add the new column to the table:
GoogleSQL
gcloud spanner databases ddl update example-db --instance=test-instance \
    --ddl='ALTER TABLE Albums ADD COLUMN MarketingBudget INT64'
PostgreSQL
gcloud spanner databases ddl update example-db --instance=test-instance \
    --ddl='ALTER TABLE Albums ADD COLUMN MarketingBudget BIGINT'
You should see:
Schema updating...done.
Use the Spanner client library for Node.js
Use Database.updateSchema
to modify the schema:
/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

const databaseAdminClient = spanner.getDatabaseAdminClient();

// Adds the MarketingBudget column to the Albums table
try {
  const [operation] = await databaseAdminClient.updateDatabaseDdl({
    database: databaseAdminClient.databasePath(
      projectId,
      instanceId,
      databaseId,
    ),
    statements: ['ALTER TABLE Albums ADD COLUMN MarketingBudget INT64'],
  });

  console.log('Waiting for operation to complete...');
  await operation.promise();

  console.log('Added the MarketingBudget column.');
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the Spanner client when finished. The databaseAdminClient does not
  // require explicit closure; closing the Spanner client closes it automatically.
  spanner.close();
}
Run the sample using the addColumn argument.
node schema.js addColumn test-instance example-db MY_PROJECT_ID
You should see:
Added the MarketingBudget column.
Write data to the new column
The following code writes data to the new column. It sets MarketingBudget to
100000 for the row keyed by Albums(1, 1) and to 500000 for the row keyed
by Albums(2, 2).
// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

// Update a row in the Albums table
// Note: Cloud Spanner interprets Node.js numbers as FLOAT64s, so they
// must be converted to strings before being inserted as INT64s
const albumsTable = database.table('Albums');

try {
  await albumsTable.update([
    {SingerId: '1', AlbumId: '1', MarketingBudget: '100000'},
    {SingerId: '2', AlbumId: '2', MarketingBudget: '500000'},
  ]);
  console.log('Updated data.');
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the database when finished.
  database.close();
}

Run the sample using the update argument.
node crud.js update test-instance example-db MY_PROJECT_ID
You should see:
Updated data.
You can also execute a SQL query or a read call to fetch the values that you just wrote.
Here's the code to execute the query:
// This sample uses the `MarketingBudget` column. You can add the column
// by running the `addColumn` sample or by running this DDL statement against
// your database:
//   ALTER TABLE Albums ADD COLUMN MarketingBudget INT64

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

const query = {
  sql: 'SELECT SingerId, AlbumId, MarketingBudget FROM Albums',
};

// Queries rows from the Albums table
try {
  const [rows] = await database.run(query);

  rows.forEach(row => {
    const json = row.toJSON();
    console.log(
      `SingerId: ${json.SingerId}, AlbumId: ${json.AlbumId}, MarketingBudget: ${
        json.MarketingBudget ? json.MarketingBudget : null
      }`,
    );
  });
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the database when finished.
  database.close();
}

To execute this query, run the sample using the queryNewColumn argument.
node schema.js queryNewColumn test-instance example-db MY_PROJECT_ID
You should see:
SingerId: 1, AlbumId: 1, MarketingBudget: 100000
SingerId: 1, AlbumId: 2, MarketingBudget: null
SingerId: 2, AlbumId: 1, MarketingBudget: null
SingerId: 2, AlbumId: 2, MarketingBudget: 500000
SingerId: 2, AlbumId: 3, MarketingBudget: null
Update data
You can update data using DML in a read-write transaction.
You use the runUpdate() method to execute a DML statement.
// This sample transfers 200,000 from the MarketingBudget field
// of the second Album to the first Album, as long as the second
// Album has enough money in its budget. Make sure to run the
// addColumn and updateData samples first (in that order).

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

const transferAmount = 200000;

database.runTransaction((err, transaction) => {
  if (err) {
    console.error(err);
    return;
  }
  let firstBudget, secondBudget;
  const queryOne = `SELECT MarketingBudget FROM Albums
    WHERE SingerId = 2 AND AlbumId = 2`;
  const queryTwo = `SELECT MarketingBudget FROM Albums
    WHERE SingerId = 1 AND AlbumId = 1`;

  Promise.all([
    // Reads the second album's budget
    transaction.run(queryOne).then(results => {
      // Gets second album's budget
      const rows = results[0].map(row => row.toJSON());
      secondBudget = rows[0].MarketingBudget;
      console.log(`The second album's marketing budget: ${secondBudget}`);

      // Makes sure the second album's budget is large enough
      if (secondBudget < transferAmount) {
        throw new Error(
          `The second album's budget (${secondBudget}) is less than the transfer amount (${transferAmount}).`,
        );
      }
    }),

    // Reads the first album's budget
    transaction.run(queryTwo).then(results => {
      // Gets first album's budget
      const rows = results[0].map(row => row.toJSON());
      firstBudget = rows[0].MarketingBudget;
      console.log(`The first album's marketing budget: ${firstBudget}`);
    }),
  ])
    .then(() => {
      // Transfers the budgets between the albums
      console.log(firstBudget, secondBudget);
      firstBudget += transferAmount;
      secondBudget -= transferAmount;
      console.log(firstBudget, secondBudget);

      // Updates the database
      // Note: Cloud Spanner interprets Node.js numbers as FLOAT64s, so they
      // must be converted (back) to strings before being inserted as INT64s.
      return transaction
        .runUpdate({
          sql: `UPDATE Albums SET MarketingBudget = @Budget
            WHERE SingerId = 1 and AlbumId = 1`,
          params: {
            Budget: firstBudget,
          },
        })
        .then(() =>
          transaction.runUpdate({
            sql: `UPDATE Albums SET MarketingBudget = @Budget
              WHERE SingerId = 2 and AlbumId = 2`,
            params: {
              Budget: secondBudget,
            },
          }),
        );
    })
    .then(() => {
      // Commits the transaction and sends the changes to the database
      return transaction.commit();
    })
    .then(() => {
      console.log(
        `Successfully executed read-write transaction using DML to transfer ${transferAmount} from Album 2 to Album 1.`,
      );
    })
    .then(() => {
      // Closes the database when finished
      database.close();
    });
});

Run the sample using the writeWithTransactionUsingDml argument.
node dml.js writeWithTransactionUsingDml test-instance example-db MY_PROJECT_ID
You should see:
Successfully executed read-write transaction using DML to transfer 200000 from Album 2 to Album 1.
Use a secondary index
Suppose you wanted to fetch all rows of Albums that have AlbumTitle values
in a certain range. You could read all values from the AlbumTitle column using
a SQL statement or a read call, and then discard the rows that don't meet the
criteria, but doing this full table scan is expensive, especially for tables
with a lot of rows. Instead you can speed up the retrieval of rows when
searching by non-primary key columns by creating a
secondary index on the table.
Adding a secondary index to an existing table requires a schema update. Like other schema updates, Spanner supports adding an index while the database continues to serve traffic. Spanner automatically backfills the index with your existing data. Backfills might take a few minutes to complete, but you don't need to take the database offline or avoid writing to the indexed table during this process. For more details, see Add a secondary index.
After you add a secondary index, Spanner automatically uses it for SQL queries that are likely to run faster with the index. If you use the read interface, you must specify the index that you want to use.
Add a secondary index
You can add an index on the command line using the gcloud CLI or programmatically using the Spanner client library for Node.js.
On the command line
Use the following CREATE INDEX command
to add an index to the database:
gcloud spanner databases ddl update example-db --instance=test-instance \
    --ddl='CREATE INDEX AlbumsByAlbumTitle ON Albums(AlbumTitle)'
You should see:
Schema updating...done.
Using the Spanner client library for Node.js
Use Database.updateSchema()
to add an index:
// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

const databaseAdminClient = spanner.getDatabaseAdminClient();

const request = ['CREATE INDEX AlbumsByAlbumTitle ON Albums(AlbumTitle)'];

// Creates a new index in the database
try {
  const [operation] = await databaseAdminClient.updateDatabaseDdl({
    database: databaseAdminClient.databasePath(
      projectId,
      instanceId,
      databaseId,
    ),
    statements: request,
  });

  console.log('Waiting for operation to complete...');
  await operation.promise();

  console.log('Added the AlbumsByAlbumTitle index.');
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the Spanner client when finished. The databaseAdminClient does not
  // require explicit closure; closing the Spanner client closes it automatically.
  spanner.close();
}

Run the sample using the createIndex argument.
node indexing.js createIndex test-instance example-db MY_PROJECT_ID
Adding an index can take a few minutes. After the index is added, you should see:
Added the AlbumsByAlbumTitle index.
Read using the index
For SQL queries, Spanner automatically uses an appropriate index. In the read interface, you must specify the index in your request.
To use the index in the read interface, use the
Table.read() method.
// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

const albumsTable = database.table('Albums');

const query = {
  columns: ['AlbumId', 'AlbumTitle'],
  keySet: {
    all: true,
  },
  index: 'AlbumsByAlbumTitle',
};

// Reads the Albums table using an index
try {
  const [rows] = await albumsTable.read(query);

  rows.forEach(row => {
    const json = row.toJSON();
    console.log(`AlbumId: ${json.AlbumId}, AlbumTitle: ${json.AlbumTitle}`);
  });
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the database when finished.
  database.close();
}

Run the sample using the readIndex argument.
node indexing.js readIndex test-instance example-db MY_PROJECT_ID
You should see:
AlbumId: 2, AlbumTitle: Forever Hold your Peace
AlbumId: 2, AlbumTitle: Go, Go, Go
AlbumId: 1, AlbumTitle: Green
AlbumId: 3, AlbumTitle: Terrified
AlbumId: 1, AlbumTitle: Total Junk
Add an index for index-only reads
You might have noticed that the previous read example doesn't include reading
the MarketingBudget column. This is because Spanner's read interface
doesn't support the ability to join an index with a data table to look up values
that are not stored in the index.
Create an alternate definition of AlbumsByAlbumTitle that stores a copy of
MarketingBudget in the index.
On the command line
GoogleSQL
gcloud spanner databases ddl update example-db --instance=test-instance \
    --ddl='CREATE INDEX AlbumsByAlbumTitle2 ON Albums(AlbumTitle) STORING (MarketingBudget)'
PostgreSQL
gcloud spanner databases ddl update example-db --instance=test-instance \
    --ddl='CREATE INDEX AlbumsByAlbumTitle2 ON Albums(AlbumTitle) INCLUDE (MarketingBudget)'
Adding an index can take a few minutes. After the index is added, you should see:
Schemaupdating...done.
Using the Spanner client library for Node.js
Use Database.updateSchema()
to add an index with a STORING clause:
// "Storing" indexes store copies of the columns they index.
// This speeds up queries, but takes more space compared to normal indexes.
// See the link below for more information:
// https://cloud.google.com/spanner/docs/secondary-indexes#storing_clause

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

const databaseAdminClient = spanner.getDatabaseAdminClient();

const request = [
  'CREATE INDEX AlbumsByAlbumTitle2 ON Albums(AlbumTitle) STORING (MarketingBudget)',
];

// Creates a new index in the database
try {
  const [operation] = await databaseAdminClient.updateDatabaseDdl({
    database: databaseAdminClient.databasePath(
      projectId,
      instanceId,
      databaseId,
    ),
    statements: request,
  });

  console.log('Waiting for operation to complete...');
  await operation.promise();

  console.log('Added the AlbumsByAlbumTitle2 index.');
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the Spanner client when finished. The databaseAdminClient does not
  // require explicit closure; closing the Spanner client closes it automatically.
  spanner.close();
}

Run the sample using the createStoringIndex argument.
node indexing.js createStoringIndex test-instance example-db MY_PROJECT_ID
You should see:
Added the AlbumsByAlbumTitle2 index.
Now you can execute a read that fetches all AlbumId, AlbumTitle, and
MarketingBudget columns from the AlbumsByAlbumTitle2 index:
// "Storing" indexes store copies of the columns they index.
// This speeds up queries, but takes more space compared to normal indexes.
// See the link below for more information:
// https://cloud.google.com/spanner/docs/secondary-indexes#storing_clause

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

const albumsTable = database.table('Albums');

const query = {
  columns: ['AlbumId', 'AlbumTitle', 'MarketingBudget'],
  keySet: {
    all: true,
  },
  index: 'AlbumsByAlbumTitle2',
};

// Reads the Albums table using a storing index
try {
  const [rows] = await albumsTable.read(query);

  rows.forEach(row => {
    const json = row.toJSON();
    let rowString = `AlbumId: ${json.AlbumId}`;
    rowString += `, AlbumTitle: ${json.AlbumTitle}`;
    if (json.MarketingBudget) {
      rowString += `, MarketingBudget: ${json.MarketingBudget}`;
    }
    console.log(rowString);
  });
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the database when finished.
  database.close();
}

Run the sample using the readStoringIndex argument.
node indexing.js readStoringIndex test-instance example-db MY_PROJECT_ID
You should see output similar to:
AlbumId: 2, AlbumTitle: Forever Hold your Peace, MarketingBudget: 300000
AlbumId: 2, AlbumTitle: Go, Go, Go, MarketingBudget: null
AlbumId: 1, AlbumTitle: Green, MarketingBudget: null
AlbumId: 3, AlbumTitle: Terrified, MarketingBudget: null
AlbumId: 1, AlbumTitle: Total Junk, MarketingBudget: 300000
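SQL queries can also take advantage of the storing index. The sketch below builds a request object equivalent to the table read above, using Spanner's `FORCE_INDEX` table hint to direct the query to `AlbumsByAlbumTitle2`; actually running it requires the `database` handle from the sample:

```javascript
// SQL equivalent of the table read above. The FORCE_INDEX hint tells
// Spanner to answer the query from the AlbumsByAlbumTitle2 storing index,
// which already holds MarketingBudget, so no base-table lookup is needed.
const query = {
  sql: `SELECT AlbumId, AlbumTitle, MarketingBudget
        FROM Albums@{FORCE_INDEX=AlbumsByAlbumTitle2}
        ORDER BY AlbumTitle`,
};

// With the database handle from the sample above you would run:
// const [rows] = await database.run(query);
console.log(query.sql.includes('FORCE_INDEX=AlbumsByAlbumTitle2'));
// → true
```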
Retrieve data using read-only transactions
Suppose you want to execute more than one read at the same timestamp. Read-only
transactions observe a consistent
prefix of the transaction commit history, so your application always gets
consistent data.
Use Database.getSnapshot()
to execute read-only transactions.
The following shows how to run a query and perform a read in the same read-only transaction:
// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

// Gets a transaction object that captures the database state
// at a specific point in time
database.getSnapshot(async (err, transaction) => {
  if (err) {
    console.error(err);
    return;
  }
  const queryOne = 'SELECT SingerId, AlbumId, AlbumTitle FROM Albums';

  try {
    // Read #1, using SQL
    const [qOneRows] = await transaction.run(queryOne);

    qOneRows.forEach(row => {
      const json = row.toJSON();
      console.log(
        `SingerId: ${json.SingerId}, AlbumId: ${json.AlbumId}, AlbumTitle: ${json.AlbumTitle}`,
      );
    });

    const queryTwo = {
      columns: ['SingerId', 'AlbumId', 'AlbumTitle'],
    };

    // Read #2, using the `read` method. Even if changes occur
    // in-between the reads, the transaction ensures that both
    // return the same data.
    const [qTwoRows] = await transaction.read('Albums', queryTwo);

    qTwoRows.forEach(row => {
      const json = row.toJSON();
      console.log(
        `SingerId: ${json.SingerId}, AlbumId: ${json.AlbumId}, AlbumTitle: ${json.AlbumTitle}`,
      );
    });

    console.log('Successfully executed read-only transaction.');
  } catch (err) {
    console.error('ERROR:', err);
  } finally {
    transaction.end();
    // Close the database when finished.
    await database.close();
  }
});
Run the sample using the readOnly argument.
node transaction.js readOnly test-instance example-db MY_PROJECT_ID
You should see output similar to:
SingerId: 2, AlbumId: 2, AlbumTitle: Forever Hold your Peace
SingerId: 1, AlbumId: 2, AlbumTitle: Go, Go, Go
SingerId: 2, AlbumId: 1, AlbumTitle: Green
SingerId: 2, AlbumId: 3, AlbumTitle: Terrified
SingerId: 1, AlbumId: 1, AlbumTitle: Total Junk
SingerId: 1, AlbumId: 2, AlbumTitle: Go, Go, Go
SingerId: 1, AlbumId: 1, AlbumTitle: Total Junk
SingerId: 2, AlbumId: 1, AlbumTitle: Green
SingerId: 2, AlbumId: 2, AlbumTitle: Forever Hold your Peace
SingerId: 2, AlbumId: 3, AlbumTitle: Terrified
Successfully executed read-only transaction.
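By default, getSnapshot() performs a strong read, which observes everything committed before the read started. The client also accepts timestamp bounds for stale reads; the option objects below are a hedged sketch (field names follow the client's TimestampBounds options, and the value shapes shown are assumptions):

```javascript
// Timestamp-bound options for read-only transactions. Field names come
// from the Node.js client's TimestampBounds; value shapes here are
// illustrative assumptions, not verified against every client version.
const strongRead = {strong: true};       // see all commits before the read starts
const staleRead = {exactStaleness: 15};  // read the database as it was 15 seconds ago

// You would pass one of these as the first argument:
// database.getSnapshot(staleRead, async (err, transaction) => { ... });
console.log(Object.keys(staleRead));
// → [ 'exactStaleness' ]
```

Stale reads can be served with lower latency because Spanner does not need to wait for the most recent writes, at the cost of returning slightly older data.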
Clean up
To avoid incurring additional charges to your Cloud Billing account for the resources used in this tutorial, drop the database and delete the instance that you created.
Delete the database
If you delete an instance, all databases within it are automatically deleted. This step shows how to delete a database without deleting an instance (you would still incur charges for the instance).
On the command line
gcloud spanner databases delete example-db --instance=test-instance
Using the Google Cloud console
Go to the Spanner Instances page in the Google Cloud console.
Click the instance.
Click the database that you want to delete.
In the Database details page, click Delete.
Confirm that you want to delete the database and click Delete.
Delete the instance
Deleting an instance automatically drops all databases created in that instance.
On the command line
gcloud spanner instances delete test-instance
Using the Google Cloud console
Go to the Spanner Instances page in the Google Cloud console.
Click your instance.
Click Delete.
Confirm that you want to delete the instance and click Delete.
What's next
Learn how to access Spanner with a virtual machine instance.
Learn about authorization and authentication credentials in Authenticate to Cloud services using client libraries.
Learn more about Spanner Schema design best practices.