From e1eb88c36f73b089e5c19a684b1282bbab4c586d Mon Sep 17 00:00:00 2001
From: jcountsNR <94138069+jcountsNR@users.noreply.github.com>
Date: Wed, 24 Jan 2024 08:10:37 -0800
Subject: [PATCH] Chore: update the confluent collector docs

---
 ...lemetry-collector-kafka-confluentcloud.mdx | 211 +++++++++---------
 1 file changed, 103 insertions(+), 108 deletions(-)

diff --git a/src/content/docs/more-integrations/open-source-telemetry-integrations/opentelemetry/collector/collector-configuration-examples/opentelemetry-collector-kafka-confluentcloud.mdx b/src/content/docs/more-integrations/open-source-telemetry-integrations/opentelemetry/collector/collector-configuration-examples/opentelemetry-collector-kafka-confluentcloud.mdx
index d3e76f01e6f..0c56a12e383 100644
--- a/src/content/docs/more-integrations/open-source-telemetry-integrations/opentelemetry/collector/collector-configuration-examples/opentelemetry-collector-kafka-confluentcloud.mdx
+++ b/src/content/docs/more-integrations/open-source-telemetry-integrations/opentelemetry/collector/collector-configuration-examples/opentelemetry-collector-kafka-confluentcloud.mdx
@@ -11,7 +11,9 @@ metaDescription: You can collect Kafka metrics from Confluent using the OpenTele

 You can collect metrics about your Confluent Cloud-managed Kafka deployment with the OpenTelemetry Collector. The collector is a component of OpenTelemetry that collects, processes, and exports telemetry data to New Relic (or any observability backend).

-Complete the steps below to collect Kafka metrics from Confluent using an OpenTelemetry collector running in docker.
+This integration works by running a Prometheus receiver configuration inside the OpenTelemetry collector, which scrapes [Confluent Cloud's metrics API](https://api.telemetry.confluent.cloud/docs/descriptors/datasets/cloud) and exports that data to New Relic.
+
+Complete the steps below to collect Kafka metrics from Confluent and export them to New Relic.
@@ -21,7 +23,8 @@ Complete the steps below to collect Kafka metrics from Confluent using an OpenTe
   * You have a docker daemon running
   * You have [Docker Compose](https://github.com/newrelic/newrelic-opentelemetry-examples/tree/main/other-examples/collector/confluentcloud) installed
-  * You have a [Confluent Cloud account](https://www.confluent.io/get-started/)
+  * You have a [Confluent Cloud account](https://www.confluent.io/get-started/)
+  * You have your [Confluent Cloud API key & secret](https://docs.confluent.io/confluent-cli/current/command-reference/api-key/confluent_api-key_create.html) available
@@ -31,38 +34,18 @@ Complete the steps below to collect Kafka metrics from Confluent using an OpenTe
   Download [New Relic's OpenTelemetry Examples repo](https://github.com/newrelic/newrelic-opentelemetry-examples) as this setup uses its example collector configuration. Once installed, open the [Confluent Cloud example](https://github.com/newrelic/newrelic-opentelemetry-examples/tree/main/other-examples/collector/confluentcloud) directory. For more information, you can check the `README` there as well.
-
-  ## Add the authentication files
-
-  This example setup uses TLS to authenticate the request to Confluent Cloud. There are multiple methods to authenticate, so you should follow your company best practices and authentication methods.
-
-  * TLS/SSL requires you to create keys and certificates, create your own Certificate Authority (CA), and sign the certificate.
-  * Doing this should leave you with three files which need to be added to this directory.
-  * Those files are referenced in this example as the follwing files: `key.pem`, `cert.pem`, `ca.pem`.
-
-  For more information about TLS authentication with Confluent Cloud, check the [documentation on authenticating with TLS](https://docs.confluent.io/platform/current/kafka/authentication_ssl.html) as well as the [security tutorial](https://docs.confluent.io/platform/current/security/security_tutorial.html).
-  For dev/test Confluent environments, you can simplify this by using plain text authentication.
-
 ## Set environment variables and run the collector

-  Export the following variables or add them in a `.env` file, then run the `docker compose up` command.
+  * Set the API key & secret variables for both Confluent Cloud and New Relic in the `.env` file
+  * Set the `CLUSTER_ID` variable with the target Kafka cluster ID
+  * (Optional) To monitor connectors or Schema Registry instances managed by Confluent Cloud, uncomment the configuration in the `collector.yaml` file and set the corresponding ID in the `.env` file

 ```bash
 # Open the confluent cloud example directory
 cd newrelic-opentelemetry-examples/other-examples/collector/confluentcloud

 # Set environment variables.
-export NEW_RELIC_API_KEY=
-export NEW_RELIC_OTLP_ENDPOINT=https://otlp.nr-data.net
-export CLUSTER_ID=
-export CLUSTER_API_KEY=
-export CLUSTER_API_SECRET=
-export CLUSTER_BOOTSTRAP_SERVER=

 # run the collector in docker
 docker compose up
 ```
@@ -70,89 +53,101 @@ docker compose up

 ### Local Variable information
-
-  <table>
-    <thead>
-      <tr>
-        <th>Variable</th>
-        <th>Description</th>
-        <th>Docs</th>
-      </tr>
-    </thead>
-    <tbody>
-      <tr>
-        <td>`NEW_RELIC_API_KEY`</td>
-        <td>New Relic Ingest API Key</td>
-        <td>[API Key docs](https://docs.newrelic.com/docs/apis/intro-apis/new-relic-api-keys/)</td>
-      </tr>
-      <tr>
-        <td>`NEW_RELIC_OTLP_ENDPOINT`</td>
-        <td>New Relic OTLP endpoint is https://otlp.nr-data.net:4318</td>
-        <td>[OTLP endpoint config docs](/docs/more-integrations/open-source-telemetry-integrations/opentelemetry/get-started/opentelemetry-set-up-your-app/#review-settings)</td>
-      </tr>
-      <tr>
-        <td>`CLUSTER_ID`</td>
-        <td>ID of cluster from Confluent Cloud</td>
-        <td>Available in your Confluent cluster settings</td>
-      </tr>
-      <tr>
-        <td>`CONFLUENT_API_ID`</td>
-        <td>Cloud API key</td>
-        <td>[Cloud API key docs](https://docs.confluent.io/cloud/current/monitoring/metrics-api.html#metrics-quick-start)</td>
-      </tr>
-      <tr>
-        <td>`CONFLUENT_API_SECRET`</td>
-        <td>Cloud API secret</td>
-        <td>[Cloud API key docs](https://docs.confluent.io/cloud/current/monitoring/metrics-api.html#metrics-quick-start)</td>
-      </tr>
-      <tr>
-        <td>`CLUSTER_BOOTSTRAP_SERVER`</td>
-        <td>Bootstrap Server for cluster</td>
-        <td>Available in your cluster settings</td>
-      </tr>
-    </tbody>
-  </table>
+
+  <table>
+    <thead>
+      <tr>
+        <th>Variable</th>
+        <th>Description</th>
+        <th>Docs</th>
+      </tr>
+    </thead>
+    <tbody>
+      <tr>
+        <td>`NEW_RELIC_API_KEY`</td>
+        <td>New Relic Ingest API Key</td>
+        <td>[API Key docs](/docs/apis/intro-apis/new-relic-api-keys/)</td>
+      </tr>
+      <tr>
+        <td>`NEW_RELIC_OTLP_ENDPOINT`</td>
+        <td>Default US New Relic OTLP endpoint is https://otlp.nr-data.net:4318</td>
+        <td>[OTLP endpoint config docs](/docs/more-integrations/open-source-telemetry-integrations/opentelemetry/get-started/opentelemetry-set-up-your-app/#review-settings)</td>
+      </tr>
+      <tr>
+        <td>`CLUSTER_ID`</td>
+        <td>ID of the cluster from Confluent Cloud</td>
+        <td>[Docs for list cluster ID command](https://docs.confluent.io/confluent-cli/current/command-reference/kafka/cluster/confluent_kafka_cluster_list.html#description)</td>
+      </tr>
+      <tr>
+        <td>`CONFLUENT_API_KEY`</td>
+        <td>Cloud API key</td>
+        <td>[Cloud API key docs](https://docs.confluent.io/cloud/current/monitoring/metrics-api.html#metrics-quick-start)</td>
+      </tr>
+      <tr>
+        <td>`CONFLUENT_API_SECRET`</td>
+        <td>Cloud API secret</td>
+        <td>[Cloud API key docs](https://docs.confluent.io/cloud/current/monitoring/metrics-api.html#metrics-quick-start)</td>
+      </tr>
+      <tr>
+        <td>`CONNECTOR_ID`</td>
+        <td>(Optional) You can monitor your Confluent connectors by specifying the ID here</td>
+        <td>[Docs for list connector ID command](https://docs.confluent.io/confluent-cli/current/command-reference/connect/cluster/confluent_connect_cluster_list.html)</td>
+      </tr>
+      <tr>
+        <td>`SCHEMA_REGISTRY_ID`</td>
+        <td>(Optional) You can monitor your Confluent Schema Registry by specifying the ID here</td>
+        <td>[Docs for list Schema Registry schema command](https://docs.confluent.io/confluent-cli/current/command-reference/schema-registry/schema/confluent_schema-registry_schema_list.html)</td>
+      </tr>
+    </tbody>
+  </table>
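
For reviewers: the updated page describes the integration as a Prometheus receiver running inside the OpenTelemetry collector, scraping Confluent Cloud's metrics API and exporting the data to New Relic. Below is a minimal sketch of that pattern. It is not the exact contents of the example repo's `collector.yaml`; the export endpoint path, the `resource.kafka.id` parameter, and the `otlphttp` exporter wiring are assumptions to verify against the repo.

```yaml
receivers:
  prometheus:
    config:
      scrape_configs:
        - job_name: confluent-cloud
          scrape_interval: 60s                      # how often to hit the export endpoint
          scheme: https
          metrics_path: /v2/metrics/cloud/export    # assumed Confluent Cloud Metrics API export path
          static_configs:
            - targets:
                - api.telemetry.confluent.cloud
          basic_auth:                               # Cloud API key and secret from the .env file
            username: ${env:CONFLUENT_API_KEY}
            password: ${env:CONFLUENT_API_SECRET}
          params:
            resource.kafka.id:                      # scope the scrape to the target cluster
              - ${env:CLUSTER_ID}

exporters:
  otlphttp:                                         # assumed OTLP/HTTP exporter pointed at New Relic
    endpoint: ${env:NEW_RELIC_OTLP_ENDPOINT}
    headers:
      api-key: ${env:NEW_RELIC_API_KEY}

service:
  pipelines:
    metrics:
      receivers: [prometheus]
      exporters: [otlphttp]
```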
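The optional connector and Schema Registry monitoring (the `CONNECTOR_ID` and `SCHEMA_REGISTRY_ID` variables, and the commented-out section of `collector.yaml` that the new bullet mentions) presumably adds further resource filters to the same scrape job. A sketch of that `params` block, assuming the export endpoint accepts `resource.connector.id` and `resource.schema_registry.id` parameters:

```yaml
params:
  resource.kafka.id:
    - ${env:CLUSTER_ID}
  resource.connector.id:             # only if you monitor Confluent-managed connectors
    - ${env:CONNECTOR_ID}
  resource.schema_registry.id:       # only if you monitor Schema Registry
    - ${env:SCHEMA_REGISTRY_ID}
```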
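The run step reduces to filling in `.env` and running `docker compose up`. A minimal compose sketch of how those values typically reach the collector container; the image tag, mount path, and service name are illustrative rather than the example repo's exact compose file:

```yaml
services:
  otel-collector:
    image: otel/opentelemetry-collector-contrib:latest
    command: ["--config=/etc/otelcol/collector.yaml"]
    volumes:
      - ./collector.yaml:/etc/otelcol/collector.yaml   # the receiver/exporter config sketched above
    environment:                                       # values are substituted from .env by docker compose
      - NEW_RELIC_API_KEY=${NEW_RELIC_API_KEY}
      - NEW_RELIC_OTLP_ENDPOINT=${NEW_RELIC_OTLP_ENDPOINT}
      - CONFLUENT_API_KEY=${CONFLUENT_API_KEY}
      - CONFLUENT_API_SECRET=${CONFLUENT_API_SECRET}
      - CLUSTER_ID=${CLUSTER_ID}
```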