Create a trigger using Terraform
This document describes how to use Terraform and the
google_eventarc_trigger
resource to create Eventarc triggers for the following Google Cloud
destinations:
For more information about using Terraform, see the Terraform on Google Cloud documentation.
The code samples in this guide route direct events from Cloud Storage but can be adapted for any event provider. For example, to learn how to route direct events from Pub/Sub to Cloud Run, see the Terraform quickstart.
Before you begin
- Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get 300ドル in free credits to run, test, and deploy workloads.
- In the Google Cloud console, on the project selector page, select or create a Google Cloud project.
 Roles required to select or create a project
 - Select a project: Selecting a project doesn't require a specific IAM role—you can select any project that you've been granted a role on.
 - Create a project: To create a project, you need the Project Creator role (roles/resourcemanager.projectCreator), which contains the resourcemanager.projects.create permission. Learn how to grant roles.
- Verify that billing is enabled for your Google Cloud project.
- Enable the Cloud Resource Manager and Identity and Access Management (IAM) APIs.
 Roles required to enable APIs
 To enable APIs, you need the Service Usage Admin role (roles/serviceusage.serviceUsageAdmin), which contains the serviceusage.services.enable permission. Learn how to grant roles.
- In the Google Cloud console, activate Cloud Shell.
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
Terraform is integrated into the Cloud Shell environment and you can use Cloud Shell to deploy your Terraform resources without having to install Terraform.
Prepare to deploy Terraform
Before deploying any Terraform resources, you must create a Terraform configuration file. A Terraform configuration file lets you define your preferred end-state for your infrastructure using the Terraform syntax.
Prepare Cloud Shell
In Cloud Shell, set the default Google Cloud project where you want to apply your Terraform configurations. You only need to run this command once per project, and you can run it in any directory:
export GOOGLE_CLOUD_PROJECT=PROJECT_ID
Replace PROJECT_ID with the ID of your Google Cloud project.
Note that environment variables are overridden if you set explicit values in the Terraform configuration file.
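For instance, a minimal sketch of a provider block that sets the project explicitly, and therefore takes precedence over the environment variable (the project value shown is a placeholder):

```hcl
# Explicit values in the configuration override environment variables
provider "google" {
 project = "my-project-id" # placeholder; replace with your project ID
}
```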
Prepare the directory
Each Terraform configuration file must have its own directory (also called a root module). In Cloud Shell, create a directory and create a new file within that directory:
mkdir DIRECTORY && cd DIRECTORY && touch main.tf
The filename must have the .tf extension—for
example, in this document, the file is referred to as main.tf.
Define your Terraform configuration
Copy the applicable Terraform code samples into your newly created
main.tf file. Optionally, you can copy the code from GitHub. This
is recommended when the Terraform snippet is part of an end-to-end solution.
Typically, you apply the entire configuration at once. However, you can also target a specific resource. For example:
terraform apply -target="google_eventarc_trigger.default"
Note that the Terraform code samples use interpolation for substitutions, such as referencing variables and resource attributes, and calling functions.
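As an illustrative sketch (the variable names var.region and var.env are hypothetical, not part of this guide's configuration), interpolation can reference input variables, resource attributes, and built-in functions:

```hcl
# Reference an input variable (hypothetical)
location = var.region
# Reference an attribute of another resource declared in this guide
value = google_storage_bucket.default.name
# Call a built-in function to construct a name (hypothetical)
name = format("trigger-%s", var.env)
```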
Enable APIs
Terraform samples typically assume that the required APIs are enabled in your Google Cloud project. Use the following code to enable the APIs:
Cloud Run
# Enable Cloud Run API
resource "google_project_service" "run" {
 service = "run.googleapis.com"
 disable_on_destroy = false
}
# Enable Eventarc API
resource "google_project_service" "eventarc" {
 service = "eventarc.googleapis.com"
 disable_on_destroy = false
}
# Enable Pub/Sub API
resource "google_project_service" "pubsub" {
 service = "pubsub.googleapis.com"
 disable_on_destroy = false
}
GKE
# Enable GKE API
resource "google_project_service" "container" {
 service = "container.googleapis.com"
 disable_on_destroy = false
}
# Enable Eventarc API
resource "google_project_service" "eventarc" {
 service = "eventarc.googleapis.com"
 disable_on_destroy = false
}
# Enable Pub/Sub API
resource "google_project_service" "pubsub" {
 service = "pubsub.googleapis.com"
 disable_on_destroy = false
}
Workflows
# Enable Workflows API
resource "google_project_service" "workflows" {
 service = "workflows.googleapis.com"
 disable_on_destroy = false
}
# Enable Eventarc API
resource "google_project_service" "eventarc" {
 service = "eventarc.googleapis.com"
 disable_on_destroy = false
}
# Enable Pub/Sub API
resource "google_project_service" "pubsub" {
 service = "pubsub.googleapis.com"
 disable_on_destroy = false
}
Create a service account and configure its access
Every Eventarc trigger is associated with an IAM service account at the time the trigger is created. Use the following code to create a dedicated service account and grant the user-managed service account specific Identity and Access Management roles to manage events:
Cloud Run
# Used to retrieve project information later
data "google_project" "project" {}
# Create a dedicated service account
resource "google_service_account" "eventarc" {
 account_id = "eventarc-trigger-sa"
 display_name = "Eventarc Trigger Service Account"
}
# Grant permission to receive Eventarc events
resource "google_project_iam_member" "eventreceiver" {
 project = data.google_project.project.id
 role = "roles/eventarc.eventReceiver"
 member = "serviceAccount:${google_service_account.eventarc.email}"
}
# Grant permission to invoke Cloud Run services
resource "google_project_iam_member" "runinvoker" {
 project = data.google_project.project.id
 role = "roles/run.invoker"
 member = "serviceAccount:${google_service_account.eventarc.email}"
}
The Pub/Sub service agent is automatically created when the Pub/Sub API is enabled. If the Pub/Sub service agent was created on or before April 8, 2021, and the service account does not have the Cloud Pub/Sub Service Agent role (roles/pubsub.serviceAgent), grant the Service Account Token Creator role (roles/iam.serviceAccountTokenCreator) to the service agent. For more information, see Create and grant roles to service agents.
resource "google_project_iam_member" "tokencreator" {
 project = data.google_project.project.id
 role = "roles/iam.serviceAccountTokenCreator"
 member = "serviceAccount:service-${data.google_project.project.number}@gcp-sa-pubsub.iam.gserviceaccount.com"
}
GKE
Before creating the service account, enable Eventarc to manage GKE clusters:
# Used to retrieve project_number later
data "google_project" "project" {}

# Enable Eventarc to manage GKE clusters
# This is usually done with: gcloud eventarc gke-destinations init
#
# Eventarc creates a separate Event Forwarder pod for each trigger targeting a
# GKE service, and requires explicit permissions to make changes to the
# cluster. This is done by granting permissions to a special service account
# (the Eventarc P4SA) to manage resources in the cluster. This needs to be done
# once per Google Cloud project.

# This identity is created with: gcloud beta services identity create --service eventarc.googleapis.com

# This local variable is used for convenience
locals {
 eventarc_sa = "serviceAccount:service-${data.google_project.project.number}@gcp-sa-eventarc.iam.gserviceaccount.com"
}

resource "google_project_iam_member" "computeViewer" {
 project = data.google_project.project.id
 role = "roles/compute.viewer"
 member = local.eventarc_sa
}

resource "google_project_iam_member" "containerDeveloper" {
 project = data.google_project.project.id
 role = "roles/container.developer"
 member = local.eventarc_sa
}

resource "google_project_iam_member" "serviceAccountAdmin" {
 project = data.google_project.project.id
 role = "roles/iam.serviceAccountAdmin"
 member = local.eventarc_sa
}
Create the service account:
# Create a service account to be used by GKE trigger
resource "google_service_account" "eventarc_gke_trigger_sa" {
 account_id = "eventarc-gke-trigger-sa"
 display_name = "Eventarc GKE Trigger Service Account"
}
# Grant permission to receive Eventarc events
resource "google_project_iam_member" "eventreceiver" {
 project = data.google_project.project.id
 role = "roles/eventarc.eventReceiver"
 member = "serviceAccount:${google_service_account.eventarc_gke_trigger_sa.email}"
}
# Grant permission to subscribe to Pub/Sub topics
resource "google_project_iam_member" "pubsubscriber" {
 project = data.google_project.project.id
 role = "roles/pubsub.subscriber"
 member = "serviceAccount:${google_service_account.eventarc_gke_trigger_sa.email}"
}
Workflows
# Used to retrieve project information later
data "google_project" "project" {}
# Create a service account for Eventarc trigger and Workflows
resource "google_service_account" "eventarc" {
 account_id = "eventarc-workflows-sa"
 display_name = "Eventarc Workflows Service Account"
}
# Grant permission to invoke Workflows
resource "google_project_iam_member" "workflowsinvoker" {
 project = data.google_project.project.id
 role = "roles/workflows.invoker"
 member = "serviceAccount:${google_service_account.eventarc.email}"
}
# Grant permission to receive events
resource "google_project_iam_member" "eventreceiver" {
 project = data.google_project.project.id
 role = "roles/eventarc.eventReceiver"
 member = "serviceAccount:${google_service_account.eventarc.email}"
}
# Grant permission to write logs
resource "google_project_iam_member" "logwriter" {
 project = data.google_project.project.id
 role = "roles/logging.logWriter"
 member = "serviceAccount:${google_service_account.eventarc.email}"
}
The Pub/Sub service agent is automatically created when the Pub/Sub API is enabled. If the Pub/Sub service agent was created on or before April 8, 2021, and the service account does not have the Cloud Pub/Sub Service Agent role (roles/pubsub.serviceAgent), grant the Service Account Token Creator role (roles/iam.serviceAccountTokenCreator) to the service agent. For more information, see Create and grant roles to service agents.
resource "google_project_iam_member" "tokencreator" {
 project = data.google_project.project.id
 role = "roles/iam.serviceAccountTokenCreator"
 member = "serviceAccount:service-${data.google_project.project.number}@gcp-sa-pubsub.iam.gserviceaccount.com"
}
Create a Cloud Storage bucket as an event provider
Use the following code to create a Cloud Storage bucket, and grant the
Pub/Sub
Publisher role (roles/pubsub.publisher) to the
Cloud Storage service agent.
Cloud Run
# Cloud Storage bucket names must be globally unique
resource"random_id""bucket_name_suffix"{
byte_length=4
}
# Create a Cloud Storage bucket
resource"google_storage_bucket""default"{
name="trigger-cloudrun-${data.google_project.project.name}-${random_id.bucket_name_suffix.hex}"
location=google_cloud_run_v2_service.default.location
force_destroy=true
uniform_bucket_level_access=true
}
# Grant the Cloud Storage service account permission to publish pub/sub topics
data"google_storage_project_service_account""gcs_account"{}
resource"google_project_iam_member""pubsubpublisher"{
project=data.google_project.project.id
role="roles/pubsub.publisher"
member="serviceAccount:${data.google_storage_project_service_account.gcs_account.email_address}"
}GKE
# Cloud Storage bucket names must be globally unique
resource "random_id" "bucket_name_suffix" {
 byte_length = 4
}
# Create a Cloud Storage bucket
resource "google_storage_bucket" "default" {
 name = "trigger-gke-${data.google_project.project.name}-${random_id.bucket_name_suffix.hex}"
 location = "us-central1"
 force_destroy = true
 uniform_bucket_level_access = true
}
# Grant the Cloud Storage service account permission to publish to Pub/Sub topics
data "google_storage_project_service_account" "gcs_account" {}
resource "google_project_iam_member" "pubsubpublisher" {
 project = data.google_project.project.id
 role = "roles/pubsub.publisher"
 member = "serviceAccount:${data.google_storage_project_service_account.gcs_account.email_address}"
}
Workflows
# Cloud Storage bucket names must be globally unique
resource "random_id" "bucket_name_suffix" {
 byte_length = 4
}
# Create a Cloud Storage bucket
resource "google_storage_bucket" "default" {
 name = "trigger-workflows-${data.google_project.project.name}-${random_id.bucket_name_suffix.hex}"
 location = google_workflows_workflow.default.region
 force_destroy = true
 uniform_bucket_level_access = true
}
# Grant the Cloud Storage service account permission to publish to Pub/Sub topics
data "google_storage_project_service_account" "gcs_account" {}
resource "google_project_iam_member" "pubsubpublisher" {
 project = data.google_project.project.id
 role = "roles/pubsub.publisher"
 member = "serviceAccount:${data.google_storage_project_service_account.gcs_account.email_address}"
}
Create an event receiver to be the event target
Create an event receiver using one of the following Terraform resources:
Cloud Run
Create a Cloud Run service as an event destination for the Eventarc trigger:
# Deploy Cloud Run service
resource "google_cloud_run_v2_service" "default" {
 name = "hello-events"
 location = "us-central1"
 deletion_protection = false # set to "true" in production
 template {
 containers {
 # This container will log received events
 image = "us-docker.pkg.dev/cloudrun/container/hello"
 }
 service_account = google_service_account.eventarc.email
 }
 depends_on = [google_project_service.run]
}
GKE
To keep this guide simple, create the Google Kubernetes Engine service that acts as the event destination outside of Terraform, in between applying Terraform configurations.
If you haven't created a trigger in this Google Cloud project before, run the following command to create the Eventarc service agent:
gcloud beta services identity create --service eventarc.googleapis.com
Create a GKE cluster:
# Create an auto-pilot GKE cluster
resource "google_container_cluster" "gke_cluster" {
 name = "eventarc-cluster"
 location = "us-central1"
 enable_autopilot = true
 depends_on = [
 google_project_service.container
 ]
}
Deploy a Kubernetes service on GKE that will receive HTTP requests and log events by using a prebuilt Cloud Run image, us-docker.pkg.dev/cloudrun/container/hello:
Get authentication credentials to interact with the cluster:
gcloud container clusters get-credentials eventarc-cluster \
 --region=us-central1
Create a deployment named hello-gke:
kubectl create deployment hello-gke \
 --image=us-docker.pkg.dev/cloudrun/container/hello
Expose the deployment as a Kubernetes service:
kubectl expose deployment hello-gke \
 --type ClusterIP --port 80 --target-port 8080
Make sure the pod is running:
kubectl get pods
The output should be similar to the following:
NAME READY STATUS RESTARTS AGE
hello-gke-5b6574b4db-rzzcr 1/1 Running 0 2m45s
If the STATUS is Pending or ContainerCreating, the pod is deploying. Wait a minute for the deployment to complete, and check the status again.
Make sure the service is running:
kubectl get svc
The output should be similar to the following:
NAME TYPE CLUSTER-IP EXTERNAL-IP PORT(S) AGE
hello-gke ClusterIP 34.118.230.123 <none> 80/TCP 4m46s
kubernetes ClusterIP 34.118.224.1 <none> 443/TCP 14m
Workflows
Deploy a workflow that executes when an object is updated in the Cloud Storage bucket:
# Create a workflow
resource "google_workflows_workflow" "default" {
 name = "storage-workflow-tf"
 region = "us-central1"
 description = "Workflow that returns information about storage events"
 service_account = google_service_account.eventarc.email
 deletion_protection = false # set to "true" in production
 # Note that $$ is needed for Terraform
 source_contents = <<EOF
 main:
 params: [event]
 steps:
 - log_event:
 call: sys.log
 args:
 text: $${event}
 severity: INFO
 - gather_data:
 assign:
 - bucket: $${event.data.bucket}
 - name: $${event.data.name}
 - message: $${"Received event " + event.type + " - " + bucket + ", " + name}
 - return_data:
 return: $${message}
 EOF
 depends_on = [
 google_project_service.workflows
 ]
}
Define an Eventarc trigger
An Eventarc trigger routes events from an event provider to an
event destination. Use the
google_eventarc_trigger
resource to specify CloudEvents attributes in the matching_criteria
and filter the events. For more information, follow the instructions when
creating a trigger for a specific provider, event type, and destination.
Events that match all the filters are sent to the destination.
Cloud Run
Create an Eventarc trigger that routes Cloud Storage events to the hello-events Cloud Run service.
# Create an Eventarc trigger, routing Cloud Storage events to Cloud Run
resource"google_eventarc_trigger""default"{
name="trigger-storage-cloudrun-tf"
location=google_cloud_run_v2_service.default.location
# Capture objects changed in the bucket
matching_criteria{
attribute="type"
value="google.cloud.storage.object.v1.finalized"
}
matching_criteria{
attribute="bucket"
value=google_storage_bucket.default.name
}
# Send events to Cloud Run
destination{
cloud_run_service{
service=google_cloud_run_v2_service.default.name
region=google_cloud_run_v2_service.default.location
}
}
service_account=google_service_account.eventarc.email
depends_on=[
google_project_service.eventarc,
google_project_iam_member.pubsubpublisher
]
}GKE
Create an Eventarc trigger that routes Cloud Storage
events to the hello-gke GKE service.
# Create an Eventarc trigger, routing Storage events to GKE
resource "google_eventarc_trigger" "default" {
 name = "trigger-storage-gke-tf"
 location = "us-central1"
 # Capture objects changed in the bucket
 matching_criteria {
 attribute = "type"
 value = "google.cloud.storage.object.v1.finalized"
 }
 matching_criteria {
 attribute = "bucket"
 value = google_storage_bucket.default.name
 }
 # Send events to GKE service
 destination {
 gke {
 cluster = "eventarc-cluster"
 location = "us-central1"
 namespace = "default"
 path = "/"
 service = "hello-gke"
 }
 }
 service_account = google_service_account.eventarc_gke_trigger_sa.email
}
Workflows
Create an Eventarc trigger that routes Cloud Storage
events to the workflow named storage-workflow-tf.
# Create an Eventarc trigger, routing Cloud Storage events to Workflows
resource"google_eventarc_trigger""default"{
name="trigger-storage-workflows-tf"
location=google_workflows_workflow.default.region
# Capture objects changed in the bucket
matching_criteria{
attribute="type"
value="google.cloud.storage.object.v1.finalized"
}
matching_criteria{
attribute="bucket"
value=google_storage_bucket.default.name
}
# Send events to Workflows
destination{
workflow=google_workflows_workflow.default.id
}
service_account=google_service_account.eventarc.email
depends_on=[
google_project_service.eventarc,
google_project_service.workflows,
]
}Apply Terraform
Use the Terraform CLI to provision infrastructure based on the configuration file.
To learn how to apply or remove a Terraform configuration, see Basic Terraform commands.
Initialize Terraform. You only need to do this once per directory.
terraform init
Optionally, to use the latest Google provider version, include the -upgrade option:
terraform init -upgrade
Review the configuration and verify that the resources that Terraform is going to create or update match your expectations:
terraform plan
Make corrections to the configuration as necessary.
Apply the Terraform configuration by running the following command and entering yes at the prompt:
terraform apply
Wait until Terraform displays the "Apply complete!" message.
Verify the creation of resources
Cloud Run
Confirm that the service has been created:
gcloud run services list --region us-central1
Confirm that the trigger has been created:
gcloud eventarc triggers list --location us-central1
The output should be similar to the following:
NAME: trigger-storage-cloudrun-tf
TYPE: google.cloud.storage.object.v1.finalized
DESTINATION: Cloud Run service: hello-events
ACTIVE: Yes
LOCATION: us-central1
GKE
Confirm that the service has been created:
kubectl get service hello-gke
Confirm that the trigger has been created:
gcloud eventarc triggers list --location us-central1
The output should be similar to the following:
NAME: trigger-storage-gke-tf
TYPE: google.cloud.storage.object.v1.finalized
DESTINATION: GKE: hello-gke
ACTIVE: Yes
LOCATION: us-central1
Workflows
Confirm that the workflow has been created:
gcloud workflows list --location us-central1
Confirm that the Eventarc trigger has been created:
gcloud eventarc triggers list --location us-central1
The output should be similar to the following:
NAME: trigger-storage-workflows-tf
TYPE: google.cloud.storage.object.v1.finalized
DESTINATION: Workflows: storage-workflow-tf
ACTIVE: Yes
LOCATION: us-central1
Generate and view an event
You can generate an event and confirm that the Eventarc trigger is working as expected.
Retrieve the name of the Cloud Storage bucket you previously created:
gcloud storage ls
Upload a text file to the Cloud Storage bucket:
echo "Hello World" > random.txt
gcloud storage cp random.txt gs://BUCKET_NAME/random.txt
Replace BUCKET_NAME with the Cloud Storage bucket name you retrieved in the previous step. For example:
gcloud storage cp random.txt gs://BUCKET_NAME/random.txt
The upload generates an event and the event receiver service logs the event's message.
Verify that an event is received:
Cloud Run
Filter the log entries created by your service:
gcloud logging read 'jsonPayload.message: "Received event of type google.cloud.storage.object.v1.finalized."'
Look for a log entry similar to the following:
Received event of type google.cloud.storage.object.v1.finalized.
Event data: {"kind":"storage#object","id":"trigger-cloudrun-BUCKET_NAME/random.txt", ...}
GKE
Find the pod ID:
POD_NAME=$(kubectl get pods -o custom-columns=":metadata.name" --no-headers)
This command uses kubectl's formatted output.
Check the logs of the pod:
kubectl logs $POD_NAME
Look for a log entry similar to the following:
{"severity":"INFO","eventType":"google.cloud.storage.object.v1.finalized","message":"Received event of type google.cloud.storage.object.v1.finalized. Event data: ...}
Workflows
Verify that a workflow execution is triggered by listing the last five executions:
gcloud workflows executions list storage-workflow-tf --limit=5
The output should include a list of executions with a NAME, STATE, START_TIME, and END_TIME.
Get the results for the most recent execution:
EXECUTION_NAME=$(gcloud workflows executions list storage-workflow-tf --limit=1 --format "value(name)")
gcloud workflows executions describe $EXECUTION_NAME
Confirm that the output is similar to the following:
...
result: '"Received event google.cloud.storage.object.v1.finalized - BUCKET_NAME, random.txt"'
startTime: '2024-12-13T17:23:50.451316533Z'
state: SUCCEEDED
...
Clean up
Remove resources previously applied with your Terraform configuration by running the following
command and entering yes at the prompt:
terraform destroy
You can also delete your Google Cloud project to avoid incurring charges. Deleting your Google Cloud project stops billing for all the resources used within that project.
- In the Google Cloud console, go to the Manage resources page.
- In the project list, select the project that you want to delete, and then click Delete.
- In the dialog, type the project ID, and then click Shut down to delete the project.