This solution sets up a log-streaming pipeline from Stackdriver Logging to Datadog.
1. Fill in the required variables in the `terraform.tfvars.sample` file located in this directory.
2. Verify the IAM roles for your Terraform service account:
   - `roles/logging.configWriter` on the project (to create the logsink)
   - `roles/iam.admin` on the project (to grant write permissions to the logsink service account)
   - `roles/serviceusage.admin` on the destination project (to enable the destination API)
   - `roles/pubsub.admin` on the destination project (to create a Pub/Sub topic)
   - `roles/iam.serviceAccountAdmin` on the destination project (to create a service account for the logsink subscriber)
3. Run the Terraform automation:
   ```shell
   terraform init
   terraform apply
   ```
4. Navigate to the Datadog Google Cloud Integration Tile.
5. On the Configuration tab, select **Upload Key File** and upload the JSON file located at the specified `key_output_path`.
6. Press **Install/Update**.
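The required variables from the steps above can be collected in a `terraform.tfvars` file. A minimal sketch, assuming hypothetical project IDs and a hypothetical push endpoint URL (only `key_output_path`'s default comes from this module; every other value below is a placeholder):

```hcl
# terraform.tfvars -- all values are illustrative placeholders; substitute your own.
project_id         = "my-logging-project"       # project in which the log export is created
parent_resource_id = "my-destination-project"   # project that will host the Pub/Sub topic
push_endpoint      = "https://example.com/datadog-intake"  # URL messages are pushed to

# Optional: where the Datadog service account key JSON is written (module default shown)
key_output_path    = "../datadog-sink/datadog-sa-key.json"
```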
## Inputs

| Name | Description | Type | Default | Required |
|------|-------------|------|---------|:--------:|
| key_output_path | The path to the file where the JSON private key of the new Datadog service account will be created. | string | `"../datadog-sink/datadog-sa-key.json"` | no |
| parent_resource_id | The ID of the project in which the Pub/Sub topic destination will be created. | string | n/a | yes |
| project_id | The ID of the project in which the log export will be created. | string | n/a | yes |
| push_endpoint | The URL locating the endpoint to which messages should be pushed. | string | n/a | yes |
## Outputs

| Name | Description |
|------|-------------|
| datadog_service_account | Datadog service account email |
| log_writer | n/a |
| pubsub_subscription_name | Pub/Sub topic subscription name |
| pubsub_topic_name | Pub/Sub topic name |
| pubsub_topic_project | Pub/Sub topic project id |
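After `terraform apply` completes, the values in the table above can be read back with the standard `terraform output` command; a short sketch (output names are taken from the table, printed values are illustrative only):

```shell
# Print a single output, e.g. the topic the logsink publishes to
terraform output pubsub_topic_name

# Print all outputs as JSON, convenient for scripting
terraform output -json
```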