Spring Cloud Stream With Apache Kafka [Spring Boot]

Spring Cloud Stream is a framework for building highly scalable, event-driven microservices connected through shared messaging systems.

Event-driven simply means services share messages (data) in the form of events.

Suppose we have a small product delivery system architecture.

Refer to the image below.

The framework provides a flexible programming model built on already established and familiar Spring idioms and best practices, including support for persistent pub/sub semantics (publisher-subscriber systems), consumer groups, and stateful partitions.

Spring Cloud Stream supports a variety of binder implementations:

Many binders are available to achieve pub/sub, event-driven systems. Some popular ones are Apache Kafka and RabbitMQ.

In this implementation example, I am using Apache Kafka.

For creating loosely coupled pub/sub applications, we use Spring Cloud Stream.

Yes, of course, we can also implement Apache Kafka directly (we will see that in a future blog for sure).

Step 1 : Create two different Spring Boot apps (Producer and Consumer) including the below dependencies.

1. Spring boot web starter

2. Spring cloud stream

3. Apache Kafka

4. Lombok
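If you are using Maven, the dependency section might look like this sketch (versions come from your Spring Boot / Spring Cloud BOM; double-check the coordinates against start.spring.io):

```xml
<dependencies>
    <!-- Spring Boot web starter: gives us the REST endpoint support -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <!-- Spring Cloud Stream together with the Apache Kafka binder -->
    <dependency>
        <groupId>org.springframework.cloud</groupId>
        <artifactId>spring-cloud-starter-stream-kafka</artifactId>
    </dependency>
    <!-- Lombok: avoids getter/setter boilerplate -->
    <dependency>
        <groupId>org.projectlombok</groupId>
        <artifactId>lombok</artifactId>
        <optional>true</optional>
    </dependency>
</dependencies>
```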

Step 2 : Create one dummy Spring bean, Book.

// simple Book POJO with getters and setters
// here I used the Lombok library to avoid boilerplate code
@Data
public class Book {

    private int id;

    private String bookName;
}


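If you are not using Lombok, a plain-Java sketch of what @Data generates for this class looks like the following (toString format is illustrative):

```java
// Plain-Java equivalent of the Lombok-annotated Book class above.
// @Data generates getters, setters, equals/hashCode and toString.
class Book {
    private int id;
    private String bookName;

    public int getId() { return id; }
    public void setId(int id) { this.id = id; }

    public String getBookName() { return bookName; }
    public void setBookName(String bookName) { this.bookName = bookName; }

    @Override
    public String toString() {
        return "Book(id=" + id + ", bookName=" + bookName + ")";
    }
}
```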
Step 3 : In the Producer Spring Boot app,

Enable the binding with the Kafka producer using the @EnableBinding(Source.class) annotation on the main Java class.

We declare it as a Source because it is the producer that produces the data.

Create an application.yml file to set the topic name:

spring:
  cloud:
    stream:
      bindings:
        output:
          destination: myfirst

myfirst is my topic name, and output is the default channel name exposed by the Source binding.

In the main class, autowire the output MessageChannel, which is made available by the Spring Cloud Stream dependency:

@Autowired
private MessageChannel output;
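Putting it together, the producer's main class might look like this sketch using the legacy annotation-based model (the class name is my assumption):

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.messaging.Source;

// Source.class binds this app to the "output" channel declared in application.yml
@SpringBootApplication
@EnableBinding(Source.class)
public class ProducerApplication {

    public static void main(String[] args) {
        SpringApplication.run(ProducerApplication.class, args);
    }
}
```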

Now we create a REST endpoint to produce data.

Use the @RestController annotation to make it a REST controller.

Create a POST endpoint to publish data:

@PostMapping("/publish") // the path name here is illustrative
public Book publishEvents(@RequestBody Book book) {
    this.output.send(MessageBuilder.withPayload(book).build());
    return book;
}

this.output is our autowired field from above; its send method is what produces the data.

Use MessageBuilder.withPayload to build the payload to publish.

Using this output object we can send/publish data.
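A complete controller class under those assumptions (the class name and endpoint path are illustrative):

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.support.MessageBuilder;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class BookController {

    // the "output" channel bean comes from the Source binding on the main class
    @Autowired
    private MessageChannel output;

    // POST a Book as JSON; it is published to the bound Kafka topic
    @PostMapping("/publish")
    public Book publishEvents(@RequestBody Book book) {
        output.send(MessageBuilder.withPayload(book).build());
        return book;
    }
}
```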


The difference between a normal Kafka implementation and this one is that we don't need to configure serialization and deserialization ourselves; Spring Cloud Stream does it automatically.
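For example, assuming the default application/json content type, the message body that lands on the myfirst topic for a Book would look roughly like this (field values are illustrative):

```json
{"id": 1, "bookName": "Spring in Action"}
```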

Our Producer app is ready. Change the server port and run it.

Now time to create a Consumer App.

Just like the previous producer app, set up a Spring Boot project.

Inside the main Java class file:

Put the @EnableBinding(Sink.class) annotation on the consumer main class to enable binding with Apache Kafka.

We declare it as a consumer app; that's why we use Sink here.

Inside application.yml:

spring:
  cloud:
    stream:
      bindings:
        input:
          destination: myfirst

myfirst is our topic name. Don't forget to mention the same topic name in both the producer and consumer apps.

Now it's time to consume the messages produced by the Producer app.

But before that, we need to create a listener that listens to every message published by the producer app.


@StreamListener(Sink.INPUT)
public void consumerMsg(Book book) {
    logger.info("Consume Payload: " + book);
}
Here I am using the input channel inside the stream listener; see the .yml file above.

We are using input to bind the listener to the same topic.

Use the @StreamListener annotation to listen to/consume data from the producer.

*I am using a logger here to show the messages on the console.
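Put together, the consumer's main class might look like this sketch (the class name is my assumption):

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Sink;

// Sink.class binds this app to the "input" channel declared in application.yml
@SpringBootApplication
@EnableBinding(Sink.class)
public class ConsumerApplication {

    private static final Logger logger = LoggerFactory.getLogger(ConsumerApplication.class);

    public static void main(String[] args) {
        SpringApplication.run(ConsumerApplication.class, args);
    }

    // Invoked for every message arriving on the bound Kafka topic;
    // Spring Cloud Stream deserializes the payload into a Book automatically
    @StreamListener(Sink.INPUT)
    public void consumerMsg(Book book) {
        logger.info("Consume Payload: " + book);
    }
}
```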

Run the consumer app on a different server port.

Before running the Producer and Consumer apps, the ZooKeeper and Kafka servers must be running.

  • If you don't know how to set up Kafka and ZooKeeper, we will see it in the next blog.

See the output in Postman.

Produce messages using the REST API.

However much data we produce, it will all be consumed by the consumer app running on a different port.
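For example, with curl instead of Postman (the /publish path and port 8080 are illustrative; adjust them to your setup):

```shell
# POST a Book as JSON to the producer's REST endpoint
curl -X POST http://localhost:8080/publish \
     -H "Content-Type: application/json" \
     -d '{"id": 1, "bookName": "Kafka Basics"}'
```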

Consumed messages appear in the console.

Thanks for reading.

Keep sharing and following.




Jahid Momin
Software Engineer | Java Spring boot | JavaScript | CodeIgniter | Angular | HTML | CSS | Bootstrap | Youtuber | Writer
