Merge pull request #2 from jcountsNR/confluent
chore: update confluent docs
jcountsNR authored Oct 18, 2023
2 parents 8807e75 + 078bc98 commit 3e1fb09
Showing 1 changed file with 22 additions and 99 deletions.
@@ -10,11 +10,7 @@ metaDescription: You can collect Kafka metrics from Confluent using the OpenTele
---
You can collect metrics about your Confluent Cloud-managed Kafka deployment with the OpenTelemetry Collector. The collector is a component of OpenTelemetry that collects, processes, and exports telemetry data to New Relic (or any observability backend).

<Callout variant="tip">
If you're looking for help with other collector use cases, see the [newrelic-opentelemetry-examples](https://github.com/newrelic/newrelic-opentelemetry-examples) repository.
</Callout>

Complete the steps below to collect Kafka metrics from Confluent.
Complete the steps below to collect Kafka metrics from Confluent using an OpenTelemetry Collector running in Docker.

## Step 1: Sign up for New Relic! [#signup]

@@ -23,37 +19,34 @@ Complete the steps below to collect Kafka metrics from Confluent.

## Step 2: Prerequisites [#prerequisites]

* Ensure [Go](https://go.dev/doc/install) is installed before proceeding.
* Set [GOPATH](https://go.dev/doc/gopath_code) and add `$GOPATH/bin` to your `PATH` variable.

## Step 3: Compile from PR source repo [#compile]
* You must have a Docker daemon running (a quick check is sketched below this list).
* You must have [Docker Compose](https://docs.docker.com/compose/install/) installed.
* You will need to generate a [TLS authentication key](https://docs.confluent.io/platform/current/kafka/encryption.html) from your [Confluent Cloud account](https://www.confluent.io/get-started/).
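
Before moving on, you can confirm the Docker prerequisites from a terminal. This is a minimal sketch assuming the `docker` CLI with the Compose plugin is installed; exact output varies by version.

```bash
# Confirm the Docker daemon is reachable (this errors out if it isn't running)
docker info

# Confirm the Compose plugin is available
docker compose version
```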

<Callout variant="important">
New Relic supports the OpenTelemetry community by contributing our work upstream to both the [Core](https://github.com/open-telemetry/opentelemetry-collector) and [Contrib](https://github.com/open-telemetry/opentelemetry-collector-contrib) repos.

When [PR22839](https://github.com/open-telemetry/opentelemetry-collector-contrib/pull/22839) on the [OpenTelemetry Collector Contrib repo](https://github.com/open-telemetry/opentelemetry-collector-contrib) has been merged, the documentation below will be updated to reflect the main branch of the Contrib repo.
</Callout>
## Step 3: Download or clone the examples repo [#example]

See [https://github.com/4demos/opentelemetry-collector-contrib.git](https://github.com/4demos/opentelemetry-collector-contrib) for the latest installation instructions.
This setup uses the example collector configuration in [New Relic's OpenTelemetry Examples repo](https://github.com/newrelic/newrelic-opentelemetry-examples).

```bash
git clone https://github.com/4demos/opentelemetry-collector-contrib.git
cd opentelemetry-collector-contrib
make otelcontribcol

```
Download or clone the repo above, then navigate to the [Confluent Cloud example](https://github.com/newrelic/newrelic-opentelemetry-examples/tree/main/other-examples/collector/confluentcloud) directory. For more information, check the `README` there as well.
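
For example, a minimal sketch of that step using `git` (the repository URL and directory path are taken from the links above):

```bash
# Clone New Relic's OpenTelemetry examples repo and enter the Confluent Cloud collector example
git clone https://github.com/newrelic/newrelic-opentelemetry-examples.git
cd newrelic-opentelemetry-examples/other-examples/collector/confluentcloud
```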

The binary will be installed under `./bin`.
## Step 4: Add the pem key files and set variables [#configure-opentelemetry-collectors]

## Step 4: Configure OpenTelemetry collectors [#configure-opentelemetry-collectors]
* Add the TLS authentication keys (the key.pem, cert.pem, and ca.pem files) created in the prerequisites to the `confluentcloud` directory.

Create a new file called `config.yaml` from the example below.
* "Export the following variables or add them in a `.env` file, then run the `docker compose up` command."

Replace the following keys in the file with your own values:
```bash
export NEW_RELIC_API_KEY=<your_api_key>
export NEW_RELIC_OTLP_ENDPOINT=https://otlp.nr-data.net
export CLUSTER_ID=<your_cluster_id>
export CLUSTER_API_KEY=<your_cluster_api_key>
export CLUSTER_API_SECRET=<your_cluster_api_secret>
export CLUSTER_BOOTSTRAP_SERVER=<your_cluster_bootstrap_server>

docker compose up
```
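
If you go the `.env` route instead of exporting variables in your shell, here is a minimal sketch, assuming you are in the `confluentcloud` directory; Docker Compose reads a `.env` file in the working directory automatically, and the values below are placeholders:

```bash
# Write a .env file with placeholder values (keep real credentials out of version control)
cat > .env <<'EOF'
NEW_RELIC_API_KEY=<your_api_key>
NEW_RELIC_OTLP_ENDPOINT=https://otlp.nr-data.net
CLUSTER_ID=<your_cluster_id>
CLUSTER_API_KEY=<your_cluster_api_key>
CLUSTER_API_SECRET=<your_cluster_api_secret>
CLUSTER_BOOTSTRAP_SERVER=<your_cluster_bootstrap_server>
EOF

docker compose up
```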

* [Cloud API key](https://docs.confluent.io/cloud/current/monitoring/metrics-api.html#metrics-quick-start)
  * CONFLUENT_API_ID
  * CONFLUENT_API_SECRET
* [Kafka Client API key](https://docs.confluent.io/cloud/current/access-management/authenticate/api-keys/api-keys.html#resource-specific-api-keys)
  * CLUSTER_API_KEY
  * CLUSTER_API_SECRET
@@ -66,82 +59,12 @@ Replace the following keys in the file with your own values:
  * bootstrap server provided by Confluent for the cluster
  * example: xxx-xxxx.us-east-2.aws.confluent.cloud:9092
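
If you'd rather gather these values from a terminal than from the Confluent Cloud UI, the sketch below uses the Confluent CLI. It assumes the CLI is installed and you are already logged in; command names and flags can vary between CLI versions, so treat it as a starting point.

```bash
# List Kafka clusters to find the cluster ID (lkc-...), then describe it to see the bootstrap endpoint
confluent kafka cluster list
confluent kafka cluster describe <your_cluster_id>

# Create a resource-scoped API key for the Kafka cluster (CLUSTER_API_KEY / CLUSTER_API_SECRET)
confluent api-key create --resource <your_cluster_id>

# Create a Cloud API key for the Metrics API (CONFLUENT_API_ID / CONFLUENT_API_SECRET)
confluent api-key create --resource cloud
```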

```yaml
receivers:
  kafkametrics:
    brokers:
      - CLUSTER_BOOTSTRAP_SERVER
    protocol_version: 2.0.0
    scrapers:
      - brokers
      - topics
      - consumers
    auth:
      sasl:
        username: CLUSTER_API_KEY
        password: CLUSTER_API_SECRET
        mechanism: PLAIN
      tls:
        insecure_skip_verify: false
    collection_interval: 30s

  prometheus:
    config:
      scrape_configs:
        - job_name: "confluent"
          scrape_interval: 60s # Do not go any lower than this or you'll hit rate limits
          static_configs:
            - targets: ["api.telemetry.confluent.cloud"]
          scheme: https
          basic_auth:
            username: CONFLUENT_API_ID
            password: CONFLUENT_API_SECRET
          metrics_path: /v2/metrics/cloud/export
          params:
            "resource.kafka.id":
              - CLUSTER_ID

exporters:
  otlp:
    endpoint: https://otlp.nr-data.net:4317
    headers:
      api-key: NEW_RELIC_LICENSE_KEY

processors:
  batch:
  memory_limiter:
    limit_mib: 400
    spike_limit_mib: 100
    check_interval: 5s

service:
  telemetry:
    logs:
  pipelines:
    metrics:
      receivers: [prometheus]
      processors: [batch]
      exporters: [otlp]
    metrics/kafka:
      receivers: [kafkametrics]
      processors: [batch]
      exporters: [otlp]
```
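
One way to fill in those placeholders is a quick shell substitution over `config.yaml`. This is a hedged sketch using GNU `sed` (macOS/BSD `sed -i` needs an empty suffix argument); the bracketed values are placeholders for your own credentials:

```bash
# Substitute the placeholder keys in config.yaml in place (GNU sed shown)
sed -i \
  -e 's|CLUSTER_BOOTSTRAP_SERVER|xxx-xxxx.us-east-2.aws.confluent.cloud:9092|' \
  -e 's|CLUSTER_API_KEY|<your_cluster_api_key>|' \
  -e 's|CLUSTER_API_SECRET|<your_cluster_api_secret>|' \
  -e 's|CONFLUENT_API_ID|<your_cloud_api_key>|' \
  -e 's|CONFLUENT_API_SECRET|<your_cloud_api_secret>|' \
  -e 's|CLUSTER_ID|<your_cluster_id>|' \
  -e 's|NEW_RELIC_LICENSE_KEY|<your_new_relic_license_key>|' \
  config.yaml
```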

## Step 5: Run the collector [#run-collector]

Execute the following, making sure to insert the operating system (for example, `darwin` or `linux`):

```bash
./bin/otelcontribcol_INSERT_THE_OPERATING_SYSTEM_amd64 --config config.yaml
```


## Step 6: Set up dashboards in New Relic
## Step 5: Set up dashboards in New Relic

Check out this New Relic [example dashboard](https://github.com/newrelic/newrelic-quickstarts/blob/main/dashboards/confluent-cloud/confluent-cloud.json) that uses these metrics:

### Kafka instance metrics [#instance-metrics]
### (COMING SOON) Kafka instance metrics [#instance-metrics]

<table>
<thead>
