As a developer of applications and services, you can use OpenShift Connectors to create and configure connections between OpenShift Streams for Apache Kafka and third-party systems.
In this example, you connect a data source (a data generator) that creates Kafka messages and a data sink (an HTTP endpoint) that consumes the Kafka messages.
Before you use Connectors, you must complete the following prerequisites:
- Determine which OpenShift environment to use for your Connectors namespace. The Connectors namespace is where your Connectors instances are deployed.
- Configure OpenShift Streams for Apache Kafka for use with Connectors.
Determining which OpenShift environment to use for your Connectors namespace
You have the following choices:
- The hosted preview environment
  - The Connectors instances are hosted on a multitenant OpenShift Dedicated cluster that is owned by Red Hat.
  - You can create up to four Connectors instances at a time.
  - The preview environment applies 48-hour expiration windows, as described in Red Hat OpenShift Connectors Preview guidelines.
- Your own OpenShift Dedicated Trial environment
  - You have access to your own OpenShift Dedicated Trial environment.
  - You can create an unlimited number of Connectors instances.
  - Your OpenShift Dedicated Trial cluster expires after 60 days.
  - A cluster administrator must install the Connectors add-on as described in Adding the Red Hat OpenShift Connectors add-on to your OpenShift cluster.
Configuring OpenShift Streams for Apache Kafka for use with Connectors
Complete the steps in Getting started with OpenShift Streams for Apache Kafka to set up the following components:
- A Kafka instance that you can use for Connectors. For this example, the Kafka instance is test-connect.
- A Kafka topic to store messages sent by data sources and make the messages available to data sinks. For this example, the Kafka topic is test-topic.
- A service account that allows you to connect and authenticate your Connectors instances with your Kafka instance.
- Access rules for the service account that define how your Connectors instances can access and use the topics in your Kafka instance.
After you complete the setup, verify the following:
- Verify that the Kafka instance is listed on the Kafka Instances page and that the state of the Kafka instance is shown as Ready.
- Verify that your service account was successfully created on the Service Accounts page.
- Verify that you saved your service account credentials to a secure location. You can also use them to connect a Kafka client, as shown in the sketch after this list.
- Verify that the permissions for your service account are listed on the Access page of the Kafka instance.
- Verify that the Kafka topic that you created for Connectors is listed on the Topics page of the Kafka instance.
- If you plan to use your own OpenShift Dedicated Trial cluster to deploy your Connectors instances, verify that a cluster administrator added the Connectors add-on to your Trial cluster.
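If you want to double-check the setup from outside the web console, a short client script can authenticate with the service account and list the topics it can see. The following is a minimal sketch using the Python kafka-python package; the bootstrap server address, client ID, and client secret are placeholders for your own values, and it assumes the Kafka instance accepts SASL/PLAIN authentication with the service account credentials.

    # check_kafka.py - confirm that the service account can reach the Kafka instance.
    # Placeholders: replace the bootstrap server and credentials with your own values.
    from kafka import KafkaConsumer  # pip install kafka-python

    consumer = KafkaConsumer(
        bootstrap_servers="<your-instance-bootstrap-url>:443",  # from the Kafka instance's connection details
        security_protocol="SASL_SSL",
        sasl_mechanism="PLAIN",
        sasl_plain_username="<client-id>",      # service account client ID
        sasl_plain_password="<client-secret>",  # service account client secret
    )

    # topics() returns the set of topic names this service account is allowed to see.
    print("Visible topics:", consumer.topics())  # expect test-topic in the output
    consumer.close()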
Creating a source connector
A source connector consumes events from an external data source and produces Kafka messages.
You configure your Connectors instance to listen for events from the data source and produce a Kafka message for each event. Your Connectors instance sends the messages at regular intervals to the Kafka topic that you created for Connectors.
For this example, you create an instance of the Data Generator source connector. The Data Generator is provided for development and testing purposes. You specify the text for a message and how often to send the message.
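To make the Data Generator's behavior concrete, the following sketch shows roughly what such a source does: produce a fixed text message to a Kafka topic once per period. This is an illustration in Python (kafka-python), not the connector's actual implementation, and the connection settings are the same placeholders as before.

    # data_generator_sketch.py - illustrates what a Data Generator source does:
    # send a fixed message to a Kafka topic every <period> milliseconds.
    import time
    from kafka import KafkaProducer  # pip install kafka-python

    producer = KafkaProducer(
        bootstrap_servers="<your-instance-bootstrap-url>:443",  # placeholder
        security_protocol="SASL_SSL",
        sasl_mechanism="PLAIN",
        sasl_plain_username="<client-id>",      # service account credentials
        sasl_plain_password="<client-secret>",
    )

    period_ms = 10000  # matches the Period field: one message every 10 seconds
    while True:        # press Ctrl+C to stop
        producer.send("test-topic", value=b"Hello World!!")  # plain-text payload
        producer.flush()  # block until the message is acknowledged
        time.sleep(period_ms / 1000)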
Prerequisites:
- If you want to use a dead letter queue (DLQ) to handle any messaging errors, create a Kafka topic for the DLQ.
- You're logged in to the OpenShift Connectors web console at https://console.redhat.com/application-services/connectors.
- In the OpenShift Connectors web console, click Create a Connectors instance.
- Select the connector that you want to use for connecting to a data source.
  You can browse through the catalog of available connectors. You can also search for a particular connector by name, and filter for sink or source connectors.
  For example, to find the Data Generator source connector, type data in the search box. The list is filtered to show only the Data Generator source card. Click the card to select the connector, and then click Next.
- On the Kafka Instance page, click the card for the Streams for Apache Kafka instance that you configured for Connectors. For example, click the test-connect card.
  Click Next.
- On the Deployment page, the namespace that you select depends on your OpenShift environment.
  If you're using your own OpenShift environment, select the card for the namespace that was created when a cluster administrator added the Connectors service to your cluster, as described in Adding the Red Hat OpenShift Connectors add-on to your OpenShift cluster.
  If you're using the hosted preview environment, click Create preview namespace to provision a namespace for hosting the Connectors instances that you create.
  Click Next.
- Specify the core configuration for your Connectors instance:
  - Type a name for your Connectors instance. For example, type hello world generator.
  - In the Client ID and Client Secret fields, type the credentials for the service account that you created for Connectors and then click Next.
- Provide connector-specific configuration. For the Data Generator, provide the following information:
  - Topic Name: Type the name of the Kafka topic that you created for Connectors. For example, type test-topic.
  - Content Type: Accept the default, text/plain.
  - Message: Type the content of the message that you want the Connectors instance to send to the Kafka topic. For example, type Hello World!!.
  - Period: Specify the interval (in milliseconds) at which you want the Connectors instance to send messages to the Kafka topic. For example, to send a message every 10 seconds, specify 10000.
  - Data Shape Produces Format: Accept the default, application/octet-stream.
  Click Next.
- Select one of the following error handling policies for your Connectors instance:
  - Stop: If a message fails to send, the Connectors instance stops running and changes its status to the Failed state. You can view the error message.
  - Ignore: If a message fails to send, the Connectors instance ignores the error and continues to run. No error message is logged.
  - Dead letter queue: If a message fails to send, the Connectors instance sends error details to the Kafka topic that you created for the DLQ. You can inspect that topic later, as shown in the sketch after this step.
  Click Next.
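If you select the dead letter queue policy, you can inspect the error records by consuming from the DLQ topic. The sketch below assumes a DLQ topic named dlq-topic (a hypothetical name; use whatever topic you created) and the same placeholder connection settings; the exact layout of the error details in the record value and headers depends on the connector runtime.

    # dlq_reader.py - read error records from the dead letter queue topic.
    # Assumes a DLQ topic named "dlq-topic"; connection placeholders as before.
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "dlq-topic",
        bootstrap_servers="<your-instance-bootstrap-url>:443",
        security_protocol="SASL_SSL",
        sasl_mechanism="PLAIN",
        sasl_plain_username="<client-id>",
        sasl_plain_password="<client-secret>",
        auto_offset_reset="earliest",  # read from the beginning of the DLQ
    )

    for record in consumer:
        # Error metadata is typically carried in the record headers;
        # the exact keys depend on the connector runtime.
        print(record.headers, record.value)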
- Review the summary of the configuration properties and then click Create Connectors instance.
  Your Connectors instance is listed on the Connectors Instances page. After a couple of seconds, the status of your Connectors instance changes to the Ready state and it starts producing messages and sending them to its associated Kafka topic.
  From the Connectors Instances page, you can stop, start, duplicate, and delete your Connectors instance, as well as edit its configuration, by clicking the Options icon (three vertical dots).
- Verify that your source Connectors instance generates messages:
  - In the OpenShift Application Services web console, select Streams for Apache Kafka > Kafka Instances.
  - Click the Kafka instance that you created for Connectors. For example, click test-connect.
  - Click the Topics tab and then click the topic that you specified for your source Connectors instance. For example, click test-topic.
  - Click the Messages tab to see a list of Hello World!! messages. You can also check the messages with a short script, as shown below.
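As an alternative to the web console, you can read the topic programmatically. The following is a minimal sketch, again with placeholder connection settings; the group_id is a hypothetical value chosen so that the script does not interfere with any connector's consumer group.

    # verify_messages.py - print the messages that the Data Generator produced.
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "test-topic",
        bootstrap_servers="<your-instance-bootstrap-url>:443",  # placeholder
        security_protocol="SASL_SSL",
        sasl_mechanism="PLAIN",
        sasl_plain_username="<client-id>",
        sasl_plain_password="<client-secret>",
        auto_offset_reset="earliest",   # start from the oldest retained message
        group_id="verify-messages",     # hypothetical group, separate from the connectors
        consumer_timeout_ms=15000,      # stop iterating if no message arrives for 15 s
    )

    for record in consumer:
        print(record.offset, record.value.decode())  # expect lines of "Hello World!!"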
Creating a sink connector
A sink connector consumes messages from a Kafka topic and sends them to an external system.
For this example, you use the HTTP Sink connector, which consumes the Kafka messages (produced by your Data Generator source Connectors instance) and sends the messages to an HTTP endpoint.
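To picture what the HTTP sink does, it helps to look at the receiving side: each Kafka message arrives at the endpoint as one HTTP POST request. The following standard-library sketch is a local stand-in for webhook.site that prints each POST body it receives; the port number is arbitrary. Note that the hosted sink needs a publicly reachable URL, so a local server like this is only useful for local experimentation.

    # http_sink_target.py - a local stand-in for webhook.site: logs each HTTP POST.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class EchoHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            body = self.rfile.read(length)
            print("Received POST:", body.decode(errors="replace"))  # e.g. Hello World!!
            self.send_response(200)  # acknowledge so the sender does not retry
            self.end_headers()

    # Port 8080 is arbitrary; choose any free local port.
    HTTPServer(("", 8080), EchoHandler).serve_forever()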
Prerequisites:
- You're logged in to the OpenShift Connectors web console at https://console.redhat.com/application-services/connectors.
- You created a Data Generator source Connectors instance.
- For the data sink example, open the free webhook.site page in a browser window. The webhook.site page provides a unique URL that you copy for use as an HTTP data sink.
- If you want to use a dead letter queue (DLQ) to handle any messaging errors, create a Kafka topic for the DLQ.
- In the OpenShift Connectors web console, click Create a Connectors instance.
- Select the sink connector that you want to use:
  - For example, type http in the search field. The list of connectors is filtered to show the HTTP sink connector.
  - Click the HTTP sink card and then click Next.
- On the Kafka Instance page, select the Streams for Apache Kafka instance for the connector to work with. For example, select test-connect.
  Click Next.
- On the Deployment page, the namespace that you select depends on your OpenShift environment.
  If you're using your own OpenShift environment, select the card for the namespace that was created when a cluster administrator added the Connectors service to your cluster.
  If you're using the hosted preview environment, click the preview namespace that you provisioned when you created the source connector.
  Click Next.
- Provide the core configuration for your connector:
  - Type a unique name for the connector. For example, type hello world receiver.
  - In the Client ID and Client Secret fields, type the credentials for the service account that you created for Connectors and then click Next.
- Provide the connector-specific configuration for your HTTP sink Connectors instance:
  - Topic Names: Type the name of the topic that you used for the source Connectors instance. For example, type test-topic.
  - Method: Accept the default, POST.
  - URL: Type your unique URL from webhook.site.
  - Data Shape Consumes Format: Accept the default, application/octet-stream.
  Click Next.
- Select an error handling policy for your Connectors instance. For example, select Stop.
  Click Next.
- Review the summary of the configuration properties and then click Create Connectors instance.
  Your Connectors instance is added to the Connectors Instances page.
  After a couple of seconds, the status of your Connectors instance changes to the Ready state. It consumes messages from the associated Kafka topic and sends them to the data sink (for this example, the data sink is the HTTP URL that you provided).
- Verify that your sink Connectors instance sends messages to the data sink: open a web browser tab to your unique webhook.site URL and confirm that you see HTTP POST calls with "Hello World!!" messages.