Kickstart Your Spring Kafka Journey: Crafting a Kafka Producer with a Dash of Fun!

Gimhan Wijayawardana
3 min read · Feb 20, 2025

Ready to dive into the bustling bazaar of real-time data streaming? Picture Galle Face Green on a Sunday evening — vibrant, dynamic, and full of energy. That’s Apache Kafka for you! Today, we’re not just bystanders; we’re rolling up our sleeves to build our very own Kafka producer using Spring Boot. Let’s get this parippu started! 🍛

Prerequisites: Gear Up!

Before we hit the road, let’s ensure our toolkit is ready:

  • Java Development Kit (JDK): 17 or newer if you're on Spring Boot 3.x; JDK 8+ is enough for older 2.x setups.
  • Apache Kafka: download the latest binary from the official Kafka website.
  • Integrated Development Environment (IDE): IntelliJ IDEA, Eclipse, or your personal favorite.

Setting Up Apache Kafka: Let’s Get This Party Started!

First up, we need to set up our Kafka environment. Think of Kafka as the cricket pitch where all the action unfolds. 🏏

  1. Download Kafka: Snag the latest version from the official Kafka website.
  2. Extract the Archive: Unzip the downloaded file to your chosen directory.
  3. Start Zookeeper: Kafka’s trusty sidekick. Navigate to the Kafka directory and fire it up:

bin/zookeeper-server-start.sh config/zookeeper.properties

  4. Start the Kafka Broker: Now, let’s get Kafka grooving:

bin/kafka-server-start.sh config/server.properties

With Zookeeper and Kafka up and running, our pitch is ready for the big match! 🏆
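
Want a quick sanity check before the big match? From the Kafka directory, you can ask the broker to list its topics (on a fresh install the list will be empty, and that's fine; what matters is that the command comes back without errors):

bin/kafka-topics.sh --bootstrap-server localhost:9092 --list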

Creating a Spring Boot Project: Time to Code!

Now, let’s set up our Spring Boot application — the captain of our team. 🧢

Initialize the Project: Head over to the Spring Initializr and create a new project with these vibes:

  • Spring Web
  • Spring for Apache Kafka
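
If you'd rather wire the dependencies in by hand instead of using the Initializr, they map to roughly these Maven coordinates (no versions pinned here, since the Spring Boot parent manages them):

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>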

Configuring Kafka Properties: Tuning the Instruments

To ensure our application syncs perfectly with Kafka, we’ll set the right configurations. In src/main/resources/application.properties, add:

spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer

These settings ensure our producer is in perfect harmony with Kafka. 🎶
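
If you'd like the producer to be a bit more careful about deliveries, you can optionally layer on a couple of extra settings. Think of it as asking the band to double-check the sound levels; this is a suggestion, not a requirement for this walkthrough:

# wait for all in-sync replicas to acknowledge each record
spring.kafka.producer.acks=all
# retry transient send failures a few times before giving up
spring.kafka.producer.retries=3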

Defining a Kafka Topic: Setting the Stage

Let’s define the topic — our designated dance area. In your main application class, sprinkle in a NewTopic bean:

import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.config.TopicBuilder;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.SpringApplication;

@SpringBootApplication
public class KafkaProducerApplication {

    public static void main(String[] args) {
        SpringApplication.run(KafkaProducerApplication.class, args);
    }

    @Bean
    public NewTopic topic() {
        return TopicBuilder.name("my-lankan-topic")
                .partitions(1)
                .replicas(1)
                .build();
    }
}

Here, we’ve set up a topic named “my-lankan-topic” — the hotspot for our messages. 🥁
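
Later, once the application is up and running, you can optionally confirm the topic was created by describing it from the Kafka directory:

bin/kafka-topics.sh --bootstrap-server localhost:9092 --describe --topic my-lankan-topic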

Creating the Kafka Producer: Let’s Get This Party Started!

Time to craft a service that sends our messages to the topic:

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class KafkaProducerService {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public KafkaProducerService(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void sendMessage(String message) {
        kafkaTemplate.send("my-lankan-topic", message);
    }
}

Our KafkaProducerService is the maestro, ensuring our messages flow smoothly to "my-lankan-topic." 🎷
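
Fire-and-forget is fine for our little bash, but if you'd like to know whether each message actually reached the broker, here's an optional variation on the service method. It's a sketch assuming Spring Kafka 3.x, where send(...) returns a CompletableFuture (older 2.x versions return a ListenableFuture instead):

public void sendMessageWithCallback(String message) {
    kafkaTemplate.send("my-lankan-topic", message)
            .whenComplete((result, ex) -> {
                if (ex != null) {
                    // the send failed; log it (or retry) however you prefer
                    System.err.println("Failed to publish: " + ex.getMessage());
                } else {
                    // result carries the record metadata: partition, offset, and so on
                    System.out.println("Published to partition " + result.getRecordMetadata().partition()
                            + " at offset " + result.getRecordMetadata().offset());
                }
            });
}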

Exposing a REST Endpoint: Inviting Everyone to the Bash

Let’s roll out the red carpet with a REST controller, allowing external apps to send messages:

import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class MessageController {

    private final KafkaProducerService producerService;

    public MessageController(KafkaProducerService producerService) {
        this.producerService = producerService;
    }

    @PostMapping("/publish")
    public String publishMessage(@RequestParam("message") String message) {
        producerService.sendMessage(message);
        return "Message published successfully!";
    }
}

With this setup, anyone can join the fun and send messages via the /publish endpoint. 🎤

Testing the Producer: Moment of Truth!

Let’s see our producer in action:

  1. Run the Application: Start your Spring Boot app.
  2. Publish a Message: Use Postman, curl, or your tool of choice to send a POST request (the message is URL-encoded so the comma, space, and exclamation mark survive the trip):

curl -X POST "http://localhost:8080/publish?message=Hello%2C%20Kafka%21"

  3. Verify the Message: To ensure our message landed safely, start a Kafka console consumer:

bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic my-lankan-topic --from-beginning

You should see:

Hello, Kafka!

Apey! Our message made it to the party! 🎉

Conclusion: Hats Off to You!

Well done! You’ve successfully built a Spring Boot application that chats with Apache Kafka, producing messages like a seasoned pro. This is just the beginning; there’s a whole world of Kafka features to explore.

Keep experimenting, keep coding, and happy developing! 🎸

