Produce Apache Kafka Messages using Apache Flink and Java

Getting started with Apache Flink® can be challenging. In this short video, Wade Waldron will walk you through a complete example of how to produce Apache Kafka® messages using Apache Flink and Java.

Take the "Building Apache Flink Applications in Java" course now:

► Check out the "Flink 101" course for more information:
► Learn about "Dataflow Programming with Apache Flink and Apache Kafka":


00:00 - Intro

01:36 - The Flink Job

01:41 - The Entry Point

02:15 - The Stream Execution Environment

02:30 - Kafka Configuration

03:09 - The Data Generator Source

04:27 - The Data Stream

04:52 - The Kafka Record Serialization Schema

06:58 - The Kafka Sink

07:55 - Putting it Together

08:15 - Executing the Stream

08:22 - Compiling and Running

08:44 - Verifying it Works

10:07 - Next Steps
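The chapters above walk through the pieces of a Flink producer job in order. As a rough sketch of how those pieces fit together (topic name, broker address, and message format here are placeholders, not values from the video), the job might look like this:

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.connector.base.DeliveryGuarantee;
import org.apache.flink.connector.datagen.source.DataGeneratorSource;
import org.apache.flink.connector.datagen.source.GeneratorFunction;
import org.apache.flink.connector.kafka.sink.KafkaRecordSerializationSchema;
import org.apache.flink.connector.kafka.sink.KafkaSink;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class ProducerJob {
    // The entry point for the Flink job.
    public static void main(String[] args) throws Exception {
        // The Stream Execution Environment: the context the job runs in.
        StreamExecutionEnvironment env =
            StreamExecutionEnvironment.getExecutionEnvironment();

        // The Data Generator Source: emits 100 simple string messages.
        GeneratorFunction<Long, String> generator = index -> "message-" + index;
        DataGeneratorSource<String> source =
            new DataGeneratorSource<>(generator, 100, Types.STRING);

        // The Data Stream, built from the source.
        DataStream<String> stream =
            env.fromSource(source, WatermarkStrategy.noWatermarks(), "data_generator");

        // The Kafka Record Serialization Schema: maps each element
        // to a Kafka record on a topic. Topic name is hypothetical.
        KafkaRecordSerializationSchema<String> serializer =
            KafkaRecordSerializationSchema.<String>builder()
                .setTopic("my-topic")
                .setValueSerializationSchema(new SimpleStringSchema())
                .build();

        // The Kafka Sink. Broker address is hypothetical; in practice the
        // Kafka configuration is usually loaded from a properties file.
        KafkaSink<String> sink = KafkaSink.<String>builder()
            .setBootstrapServers("localhost:9092")
            .setRecordSerializer(serializer)
            .setDeliveryGuarantee(DeliveryGuarantee.AT_LEAST_ONCE)
            .build();

        // Putting it together and executing the stream.
        stream.sinkTo(sink);
        env.execute("kafka-producer-job");
    }
}
```

This is a minimal sketch, not the exact code from the video; watch the walkthrough for the full build setup and the configuration details Wade uses.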

Confluent is pioneering a fundamentally new category of data infrastructure focused on data in motion. Confluent’s cloud-native offering is the foundational platform for data in motion – designed to be the intelligent connective tissue enabling real-time data, from multiple sources, to constantly stream across the organization. With Confluent, organizations can meet the new business imperative of delivering rich, digital front-end customer experiences and transitioning to sophisticated, real-time, software-driven backend operations. To learn more, please visit

#flink #java #streamprocessing #confluent