Get started: https://cnfl.io/apache-kafka-101-learn-more | Want to read data out of Kafka topics? Learn how it works in this episode on consumers in Confluent's flagship course, Apache Kafka® 101.
In this episode, you’ll learn how to use the Consumer API to connect to a Kafka cluster and read data out of its topics.
Tim Berglund walks you through how to configure consumers to subscribe to specific topics, what the key-value pairs in a consumer record contain, and what happens when a consumer dies, so you're ready to get your hands on some code and see how these components work together.
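To make the ideas concrete before you watch, here is a minimal sketch of a consumer built with the Kafka Consumer API: configure a connection, subscribe to a topic, and poll in a loop (streaming data has no "last" message, so the loop never ends). The bootstrap server, topic name, and group id are placeholders, and running it assumes a reachable Kafka cluster and the kafka-clients library on the classpath.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ConsumerSketch {
    public static void main(String[] args) {
        // Placeholder connection details; substitute your own cluster and topic.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "demo-group"); // consumer group membership
        props.put("key.deserializer",
            "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
            "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("demo-topic")); // one or more topics
            while (true) { // no "last" message in a stream: keep polling
                ConsumerRecords<String, String> records =
                    consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> r : records) {
                    // Each record carries a key, a value, and metadata:
                    // topic, partition, offset, and timestamp.
                    System.out.printf("%s [%d] @%d: %s=%s%n",
                        r.topic(), r.partition(), r.offset(), r.key(), r.value());
                }
            }
        }
    }
}
```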
CHAPTERS
0:00 - Intro to Client Applications: Consumers
1:13 - How to Subscribe to Topics
1:47 - No “Last” Message With Streaming Data
2:39 - Detecting New Messages to Consume
3:09 - What’s in a Consumer Record?
4:27 - Persistence: Kafka Is a Log, Not a Queue
5:04 - What If a Consumer Dies? Offset Commits
6:31 - Scaling With Consumer Groups
8:04 - Rebalancing Consumers & Summary
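The later chapters on offset commits and consumer groups can also be sketched in code. In this hedged example, turning off auto-commit and calling commitSync() after processing records progress explicitly, so if this consumer dies, another member of the same group resumes from the last committed offset after a rebalance. The group id, topic, and handle() method are illustrative placeholders, and running it assumes a live cluster plus the kafka-clients library.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class CommitSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder address
        // Every instance started with this group.id joins the same consumer
        // group; Kafka splits the topic's partitions across the members and
        // rebalances them when an instance joins or dies.
        props.put("group.id", "orders-service");
        props.put("enable.auto.commit", "false"); // commit explicitly below
        props.put("key.deserializer",
            "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
            "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("orders")); // placeholder topic
            while (true) {
                ConsumerRecords<String, String> records =
                    consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> r : records) {
                    handle(r); // your processing logic
                }
                // Commit offsets for everything just processed: a restarted
                // consumer resumes here instead of rereading the whole log.
                consumer.commitSync();
            }
        }
    }

    // Hypothetical handler standing in for real business logic.
    static void handle(ConsumerRecord<String, String> r) {
        System.out.printf("processing offset %d: %s%n", r.offset(), r.value());
    }
}
```

Because Kafka is a log rather than a queue, committing an offset does not delete anything; it only records how far this group has read.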
LEARN MORE
► Take the Free Course: https://cnfl.io/apache-kafka-101-learn-more
► Confluent Developer: https://developer.confluent.io
CONNECT
Subscribe, if you dare: https://www.youtube.com/@ConfluentDeveloper?sub_confirmation=1
Community Slack: https://confluentcommunity.slack.com
X: https://x.com/confluentinc
LinkedIn: https://www.linkedin.com/company/confluent
GitHub: https://github.com/confluentinc
Site: https://developer.confluent.io
ABOUT CONFLUENT DEVELOPER
Confluent Developer provides comprehensive resources for developers looking to learn about Apache Kafka®, Apache Flink®, Confluent Cloud, Confluent Platform, and any other technology related to the broader Data Streaming Platform. Content on Confluent Developer includes courses, getting started guides, topical deep-dives, patterns, tutorials, and listings of community events. Learn more at https://developer.confluent.io.
#apachekafka #kafka #confluent