Spring Cloud Stream With Apache Kafka [Spring Boot]

Jahid Momin
4 min read · Apr 9, 2022

Spring Cloud Stream is a framework/module for building highly scalable event-driven microservices connected with shared messaging systems.

Event-driven — it's simply the sharing of messages (data) between services in the form of events.

Suppose we have a small product delivery system architecture. Refer to the image below.

The framework provides a flexible programming model built on already established and familiar Spring idioms and best practices, including support for persistent pub/sub semantics (publisher-subscriber systems), consumer groups, and stateful partitions.

Spring Cloud Stream supports a variety of binder implementations:

Many binders are available to build pub/sub, event-driven systems; popular ones include Apache Kafka and RabbitMQ.

In this implementation example, I am using Apache Kafka.

To create a loosely coupled pub/sub application, we use Spring Cloud Stream.

Of course, we can also implement Apache Kafka directly (we will see that in a future blog).

Step 1: Create two different Spring Boot apps (Producer and Consumer) with the dependencies below.

1. Spring Boot Web Starter

2. Spring Cloud Stream

3. Apache Kafka (the Kafka binder for Spring Cloud Stream)

4. Lombok
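If you are using Maven, the dependencies look roughly like this (a sketch; the exact versions come from the Spring Boot and Spring Cloud BOMs you import):

<dependency>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
  <groupId>org.springframework.cloud</groupId>
  <artifactId>spring-cloud-stream</artifactId>
</dependency>
<dependency>
  <groupId>org.springframework.cloud</groupId>
  <artifactId>spring-cloud-stream-binder-kafka</artifactId>
</dependency>
<dependency>
  <groupId>org.projectlombok</groupId>
  <artifactId>lombok</artifactId>
  <optional>true</optional>
</dependency>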

Step 2: Create a dummy Book POJO.

// Simple Book POJO with getters and setters.
// Here I used Lombok (@Data) to avoid boilerplate code.

import lombok.Data;

@Data
public class Book {

    private int id;

    private String bookName;
}

Step 3: In the Producer Spring Boot app, enable the binding with the Kafka producer by putting this annotation on the main Java class:

@EnableBinding(Source.class)

We declare it as a Source because the producer is the side that produces the data.
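For reference, a minimal sketch of the producer's main class (the class name is just an example):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.messaging.Source;

// Binds this application to the "output" channel declared by Source.
@SpringBootApplication
@EnableBinding(Source.class)
public class ProducerApplication {

    public static void main(String[] args) {
        SpringApplication.run(ProducerApplication.class, args);
    }
}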

Create an application.yml file to set the topic name:

spring:
  cloud:
    stream:
      bindings:
        output:
          destination: myfirst

myfirst is my topic name.

In the main class, autowire the MessageChannel, which is provided by the Spring Cloud Stream dependency:

@Autowired
private MessageChannel output;

Now we create a REST endpoint to produce data. Use the @RestController annotation to make it a REST controller, and create a POST endpoint to publish the data:

@PostMapping("/publish")
public Book publishEvents(@RequestBody Book book) {
    this.output.send(MessageBuilder.withPayload(book).build());
    return book;
}

this.output is the autowired field from above; its send method produces the data. MessageBuilder.withPayload builds the payload to be published. Using this output object we can publish/produce the data.
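Putting those pieces together, the producer's REST controller might look like this (a sketch; the class name is illustrative):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.support.MessageBuilder;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class BookController {

    // Bound to the "output" binding from application.yml (topic "myfirst").
    @Autowired
    private MessageChannel output;

    // POST /publish with a Book JSON body publishes the book as an event.
    @PostMapping("/publish")
    public Book publishEvents(@RequestBody Book book) {
        this.output.send(MessageBuilder.withPayload(book).build());
        return book;
    }
}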

**********************************************************

The difference between a plain Kafka implementation and this one is that we don't need to configure serialization and deserialization ourselves; Spring Cloud Stream handles it automatically.

Our Producer App is Ready. Change the server port and run.

Now it's time to create the Consumer app.

Just like the producer app, set up a Spring Boot project.

Inside the main Java class file:

@EnableBinding(Sink.class)

Put this annotation on the consumer's main class to enable the binding with Apache Kafka.

We declare it as the consumer app; that's why we use Sink here.
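Again, a minimal sketch of the consumer's main class (the class name is just an example):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.messaging.Sink;

// Binds this application to the "input" channel declared by Sink.
@SpringBootApplication
@EnableBinding(Sink.class)
public class ConsumerApplication {

    public static void main(String[] args) {
        SpringApplication.run(ConsumerApplication.class, args);
    }
}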

Inside application.yml:

spring:
  cloud:
    stream:
      bindings:
        input:
          destination: myfirst

myfirst is our topic name. Don't forget to use the same topic name in both the producer and consumer apps.

Now it's time to consume the messages produced by the producer app.

But before that, we need to create a listener that listens to every message published by the producer app.

@StreamListener("input")
public void consumerMsg(Book book) {
    logger.info("Consume Payload " + book);
}

Here I am using input inside the stream listener, matching the input binding in the .yml file above.

We use input to bind the listener to the same topic.

The @StreamListener annotation listens to/consumes data coming from the producer.

* I am using a logger here to print the messages to the console.
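For completeness, here is a sketch of the listener as a Spring bean with the logger declared (the class name is illustrative; the consumer app also needs its own copy of the Book class):

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.stereotype.Component;

@Component
public class BookListener {

    private static final Logger logger = LoggerFactory.getLogger(BookListener.class);

    // Called for every message arriving on the "input" binding (topic "myfirst").
    @StreamListener("input")
    public void consumerMsg(Book book) {
        logger.info("Consume Payload " + book);
    }
}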

Run the consumer app on a different server port.

Before running the producer and consumer apps, ZooKeeper and the Kafka server must be running.

  • If you don't know how to set up Kafka and ZooKeeper, we will see that in the next blog.
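As a quick reference, with a local Kafka installation the standard quickstart commands are (run from the Kafka directory; use the .bat scripts on Windows):

bin/zookeeper-server-start.sh config/zookeeper.properties
bin/kafka-server-start.sh config/server.properties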

See the output in Postman.

Produce messages using the REST API.

However much data we produce, it will all be consumed by the consumer app running on the other port.
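For example, a request like this (assuming the producer runs on port 8081; the JSON body matches the Book POJO):

curl -X POST http://localhost:8081/publish \
     -H "Content-Type: application/json" \
     -d '{"id": 1, "bookName": "Spring Cloud Stream Basics"}'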

Consumed messages in the console.

Thanks for reading.

Keep sharing and following.
