Kevin Boone's website
https://kevinboone.me
https://kevinboone.me/img/favicon.ico
-Wed, January 17 2024
+Fri, January 19 2024
+
+Getting started with Kafka Streams, part 2
+https://kevinboone.me/kafka_streams_hello_2.md
+https://kevinboone.me/kafka_streams_hello_2.md
+Following on from my article on the rudiments of the Kafka Streams API, this one introduces stateful operations like counting and aggregation.
+
+
+Fri, January 19 2024
+
+
+
+Getting started with Kafka Streams
+https://kevinboone.me/kafka_streams_hello.md
+https://kevinboone.me/kafka_streams_hello.md
+Kafka Streams is a Java library and framework for creating applications that consume, process, and produce Apache Kafka messages. This article provides a tutorial on implementing a very basic Streams application.
+
+
+Fri, January 19 2024
+
+
They don't make them like that any more: Garmin Nuvi 300
https://kevinboone.me/nuvi300.md
diff --git a/kafka_streams_hello_2.html b/kafka_streams_hello_2.html
index 02e43bc..c4e48d2 100644
--- a/kafka_streams_hello_2.html
+++ b/kafka_streams_hello_2.html
@@ -273,7 +273,7 @@ A technical digression
count() what we actually get is an instance of
KStream<?, Long>.
This makes perfect sense – a count can only be a number. But when we
-write to the output stream, we’re writing messages whose payloads is a
+write to the output stream, we’re writing messages whose payload is a
Long, and kafka-console-consumer.sh assumes by
default that the values are strings.
So to see the counts, we need to run the consumer like this:
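The command itself lies outside this hunk. As a rough sketch of the kind of invocation the article is describing, assuming a hypothetical output topic called streams-count-output and a broker on localhost:9092, the consumer can be told to decode the values as longs using the standard deserializer properties of kafka-console-consumer.sh:

    # Topic name and broker address are assumptions; the deserializer
    # properties are the stock kafka-console-consumer.sh options for
    # reading non-string keys and values.
    ./bin/kafka-console-consumer.sh \
      --bootstrap-server localhost:9092 \
      --topic streams-count-output \
      --from-beginning \
      --property print.key=true \
      --property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
      --property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer

With value.deserializer set this way, the consumer prints each count as a readable number rather than attempting to interpret the eight-byte Long payload as a UTF-8 string.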