Using Quarkus applications with Kafka instances and OpenShift Service Registry

As a developer of applications and services, you can connect Quarkus applications to Kafka instances in OpenShift Streams for Apache Kafka and Service Registry instances in OpenShift Service Registry. This feature makes it easy for development teams to store and reuse schemas in event-streaming architectures.

Quarkus is a Kubernetes-native Java framework made for Java virtual machines (JVMs) and native compilation, and optimized for serverless, cloud, and Kubernetes environments. Quarkus is designed to work with popular Java standards, frameworks, and libraries such as Eclipse MicroProfile and Spring, as well as Apache Kafka, RESTEasy (JAX-RS), Hibernate ORM (JPA), Infinispan, Camel, and many more.

In this quick start, you manually configure connections from an example Quarkus application to Kafka and Service Registry instances. The application uses the Kafka instance to produce and consume messages, and it uses a schema stored in the Service Registry instance to serialize and deserialize those messages.

Note
When you’ve completed this quick start and understand the required connection configurations for Kafka and Service Registry instances, you can use the OpenShift Application Services command-line interface (CLI) to generate these types of configurations in a more automated way. To learn more, see Connecting client applications to OpenShift Application Services using the rhoas CLI.
Prerequisites

Importing the Quarkus sample code

For this quick start, you use the Quarkus Service Registry sample code from the App Services Guides and Samples repository in GitHub. After you understand the concepts and tasks in this quick start, you can use your own Quarkus applications with Streams for Apache Kafka and Service Registry in the same way.

Procedure
  1. On the command line, clone the App Services Guides and Samples repository from GitHub.

    git clone https://github.com/redhat-developer/app-services-guides app-services-guides
  2. In your IDE, open the code-examples/quarkus-service-registry-quickstart directory from the repository that you cloned.

Configuring the Quarkus application to connect to Kafka and Service Registry instances

To enable your Quarkus applications to access a Kafka instance, configure the connection properties using the Kafka bootstrap server endpoint. To access a Service Registry instance, configure the registry endpoint connection property with the Core Registry API value supported by the Apicurio serializer/deserializer (SerDes).

Access to the Service Registry and Kafka instances is managed using the same service account and SASL/OAUTHBEARER token endpoint. For Quarkus, you can configure all connection properties using the application.properties file. The example in this task sets environment variables and then references them in the application.properties file.

Quarkus applications use MicroProfile Reactive Messaging to produce messages to and consume messages from your Kafka instances in Streams for Apache Kafka. For details on configuration options, see the Apache Kafka Reference Guide in the Quarkus documentation.

This Quarkus example application includes producer and consumer processes that serialize/deserialize Kafka messages using a schema stored in Service Registry.
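
For illustration, Reactive Messaging channels are mapped to Kafka topics with properties like the following. This is a minimal sketch; the channel names generated-quotes and quotes are placeholders and may not match the names used in the sample application.

    Example channel-to-topic mapping (sketch)
    # Producer side: records emitted on the generated-quotes channel go to the quotes topic
    mp.messaging.outgoing.generated-quotes.connector=smallrye-kafka
    mp.messaging.outgoing.generated-quotes.topic=quotes
    # Consumer side: records from the quotes topic arrive on the quotes channel
    mp.messaging.incoming.quotes.connector=smallrye-kafka
    mp.messaging.incoming.quotes.topic=quotes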

Prerequisites
  • You have the bootstrap server endpoint for your Kafka instance. To get this information, find your Kafka instance in the Streams for Apache Kafka web console, click the options icon (three vertical dots), and click Connection. Copy the Bootstrap server value.

  • You have the SASL/OAUTHBEARER token endpoint for your Kafka and Service Registry instances. To get this information, find your Service Registry instance in the Service Registry web console, click the options icon (three vertical dots), and click Connection. Copy the Token endpoint URL value.

  • You have the Core Registry API endpoint for your Service Registry instance. To get this information, find your Service Registry instance in the Service Registry web console, click the options icon (three vertical dots), and click Connection. Copy the Core Registry API value.

  • You have the generated credentials for your service account. To reset the credentials, use the Service Accounts page in the Application Services section of the Red Hat Hybrid Cloud Console. Copy the Client ID and Client secret values.

  • You’ve set the permissions for your service account to access the Kafka instance resources. To verify the current permissions, click your Kafka instance in the Streams for Apache Kafka web console and use the Access page to find your service account permission settings.

  • You’ve set the permissions for your service account to access the Service Registry instance resources. To verify the current permissions, click your Service Registry instance in the Service Registry web console, and use the Access page to find your service account role settings.

Procedure
  1. On the command line, set the following environment variables to use your Kafka and Service Registry instances with Quarkus or other applications.

    Replace the values in angle brackets (< >) with your own server and credential information, as follows:

    • The <bootstrap_server> value is the Bootstrap server endpoint for your Kafka instance.

    • The <core_registry_api_url> value is the Core Registry API URL for your Service Registry instance.

    • The SERVICE_REGISTRY_CORE_PATH variable is a constant value used to set the API path for Service Registry.

    • The <oauth_token_endpoint_url> value is the SASL/OAUTHBEARER Token endpoint URL for your Kafka and Service Registry instances.

    • The <client_id> and <client_secret> values are the generated credentials for your service account.

      Setting environment variables for server and credentials
      $ export KAFKA_HOST=<bootstrap_server>
      $ export SERVICE_REGISTRY_URL=<core_registry_api_url>
      $ export SERVICE_REGISTRY_CORE_PATH=/apis/registry/v2
      $ export RHOAS_SERVICE_ACCOUNT_OAUTH_TOKEN_URL=<oauth_token_endpoint_url>
      $ export RHOAS_SERVICE_ACCOUNT_CLIENT_ID=<client_id>
      $ export RHOAS_SERVICE_ACCOUNT_CLIENT_SECRET=<client_secret>
  2. In the Quarkus example application, review the /src/main/resources/application.properties files in the consumer and producer subfolders to understand how the environment variables you set in the previous step are used. This example uses the dev configuration profile in the application.properties files.
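
    For reference, the connection-related entries in these application.properties files typically look similar to the following sketch. The property names shown are standard Quarkus Kafka client settings and the Apicurio SerDes registry URL setting; the exact entries and profile prefixes in the sample may differ. The sample also configures the SerDes to authenticate to Service Registry with the same service account credentials (those property names vary by Apicurio version).

      Example connection properties (sketch)
      # Kafka bootstrap server and SASL/OAUTHBEARER authentication using the service account
      %dev.kafka.bootstrap.servers=${KAFKA_HOST}
      %dev.kafka.security.protocol=SASL_SSL
      %dev.kafka.sasl.mechanism=OAUTHBEARER
      %dev.kafka.sasl.jaas.config=org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required \
        oauth.client.id="${RHOAS_SERVICE_ACCOUNT_CLIENT_ID}" \
        oauth.client.secret="${RHOAS_SERVICE_ACCOUNT_CLIENT_SECRET}" \
        oauth.token.endpoint.uri="${RHOAS_SERVICE_ACCOUNT_OAUTH_TOKEN_URL}";
      %dev.kafka.sasl.login.callback.handler.class=io.strimzi.kafka.oauth.client.JaasClientOauthLoginCallbackHandler
      # Apicurio SerDes: Core Registry API endpoint assembled from the environment variables
      %dev.mp.messaging.connector.smallrye-kafka.apicurio.registry.url=${SERVICE_REGISTRY_URL}${SERVICE_REGISTRY_CORE_PATH}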

Creating the quotes Kafka topic in Streams for Apache Kafka

The Quarkus application in this quick start uses a Kafka topic called quotes to produce and consume messages. In this task, you create the quotes topic in your Kafka instance.

Prerequisites
  • You have a running Kafka instance in OpenShift Streams for Apache Kafka.

Procedure
  1. In the Streams for Apache Kafka web console, click Kafka Instances and then click the name of the Kafka instance that you want to add a topic to.

  2. Click the Topics tab.

  3. Click Create topic and specify the following topic properties:

    1. Topic name: For this quick start, enter quotes as the topic name. Click Next.

    2. Partitions: Set the number of partitions for the topic. For this quick start, set the value to 1. Click Next.

    3. Message retention: Set the message retention time and size. For this quick start, set the retention time to A week and the retention size to Unlimited. Click Next.

    4. Replicas: For this release of Streams for Apache Kafka, the replica values are preconfigured. The number of partition replicas for the topic is set to 3 and the minimum number of follower replicas that must be in sync with a partition leader is set to 2. For a trial Kafka instance, the number of replicas and the minimum in-sync replica factor are both set to 1. Click Finish.

After you complete the setup, the new topic appears on the Topics page. You can now run the Quarkus application to start producing and consuming messages to and from this topic.

Verification
  • Verify that the quotes topic is listed on the Topics page.

Running the Quarkus example application

After you configure your Quarkus application to connect to Kafka and Service Registry instances, and you create the Kafka topic, you can run the Quarkus application to start producing and consuming messages to and from this topic.

The Quarkus application in this quick start consists of the following processes:

  • A consumer process that is implemented by the QuotesResource class. This class exposes the /quotes REST endpoint that streams quotes from the quotes topic. This process also has a minimal frontend that uses Server-Sent Events to stream the quotes to a web page.

  • A producer process that is implemented by the QuotesProducer class. This class produces a new quote with a random value every 5 seconds and publishes it to the quotes topic. A simplified sketch of both processes follows this list.

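The following is a simplified Java sketch of these two classes. It assumes a Quote type generated from the Avro schema and uses placeholder channel names; the actual classes in the sample contain more detail, and older Quarkus versions use javax.* packages where newer versions use jakarta.*.

    Simplified sketch of the producer and consumer classes
    // QuotesProducer.java (sketch): emits a quote with a random value every 5 seconds
    import java.time.Duration;
    import java.util.Random;
    import java.util.UUID;
    import javax.enterprise.context.ApplicationScoped;            // jakarta.* on newer Quarkus versions
    import org.eclipse.microprofile.reactive.messaging.Outgoing;
    import io.smallrye.mutiny.Multi;

    @ApplicationScoped
    public class QuotesProducer {

        private final Random random = new Random();

        @Outgoing("generated-quotes")                              // channel mapped to the quotes topic
        public Multi<Quote> generate() {
            // Quote is assumed to be generated from the Avro schema; fields and value range are illustrative
            return Multi.createFrom().ticks().every(Duration.ofSeconds(5))
                    .map(tick -> new Quote(UUID.randomUUID().toString(), random.nextInt(100)));
        }
    }

    // QuotesResource.java (sketch): re-exposes consumed quotes over Server-Sent Events
    import javax.inject.Inject;
    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.MediaType;
    import org.eclipse.microprofile.reactive.messaging.Channel;
    import io.smallrye.mutiny.Multi;

    @Path("/quotes")
    public class QuotesResource {

        @Inject
        @Channel("quotes")                                         // channel mapped to the quotes topic
        Multi<Quote> quotes;

        @GET
        @Produces(MediaType.SERVER_SENT_EVENTS)
        public Multi<Quote> stream() {
            return quotes;
        }
    }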
Prerequisites
Procedure
  1. On the command line, change to the code-examples/quarkus-service-registry-quickstart/consumer directory that you imported and run the consumer process.

    Running the example consumer process
    $ cd ~/app-services-guides/code-examples/quarkus-service-registry-quickstart/consumer
    $ mvn quarkus:dev
  2. After the consumer process is running, in a web browser, go to http://localhost:8080/quotes.html and verify that this process is available.

  3. Leave the consumer process running, and run the producer process in a different terminal.

    Running the example producer process
    $ cd ~/app-services-guides/code-examples/quarkus-service-registry-quickstart/producer
    $ mvn quarkus:dev
  4. When both the consumer and producer processes are running, view the generated quotes in the web browser at http://localhost:8080/quotes.html.

  5. In the web console, go to Service Registry > Service Registry Instances, click your Service Registry instance, and view the automatically generated schema for your application.

What just happened?
  • The Quarkus application is configured to use the io.apicurio.registry.serde.avro.AvroKafkaSerializer Java class for serializing messages to Avro format and the io.apicurio.registry.serde.avro.AvroKafkaDeserializer class for deserializing them. These SerDes classes are configured to use remote schemas in OpenShift Service Registry rather than local schemas bundled with the application.

  • Because the Service Registry instance did not yet contain any schemas, the serializer automatically registered the schema for the quotes topic. The name of the schema is managed by the TopicRecordIdStrategy class, which uses the topic_name-value convention. You can find this schema in the Service Registry instance and configure compatibility rules to govern how the schema can evolve in future versions. A sketch of the SerDes settings involved follows this list.

  • If the Quarkus application fails to run, review the error log in the terminal and address any problems. Also review the steps in this quick start to ensure that the Quarkus application and Kafka topic are configured correctly.
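
The SerDes behavior described in the first two items is controlled by a few Apicurio SerDes settings on the Kafka channels. The following is a minimal sketch using standard Apicurio Registry 2.x property names and the same placeholder channel names as above; the entries in the sample may differ.

    Example Apicurio SerDes settings (sketch)
    # Producer channel: serialize values with the Apicurio Avro serializer and
    # register the schema automatically on first use
    mp.messaging.outgoing.generated-quotes.value.serializer=io.apicurio.registry.serde.avro.AvroKafkaSerializer
    mp.messaging.outgoing.generated-quotes.apicurio.registry.auto-register=true
    # Consumer channel: deserialize values with the Apicurio Avro deserializer
    mp.messaging.incoming.quotes.value.deserializer=io.apicurio.registry.serde.avro.AvroKafkaDeserializer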