Struts may be a classic Java web framework, but it pairs well with modern messaging systems like Apache Kafka. If you're building event-driven apps, real-time analytics, or streaming services, integrating Kafka into your Struts project brings major benefits. This guide covers how to configure Kafka using Docker for local development and Kubernetes for production: cleanly separated configs, consistent code, and secure deployment.
1. Add Dependencies to pom.xml
First, include Apache Kafka and optional Kubernetes support:
```xml
<dependencies>
    <!-- Struts framework -->
    <dependency>
        <groupId>org.apache.struts</groupId>
        <artifactId>struts2-core</artifactId>
        <version>${struts.version}</version>
    </dependency>
    <!-- Kafka client -->
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-clients</artifactId>
        <version>${kafka.version}</version>
    </dependency>
    <!-- Kubernetes client (optional) -->
    <dependency>
        <groupId>io.fabric8</groupId>
        <artifactId>kubernetes-client</artifactId>
        <version>${fabric8.version}</version>
    </dependency>
</dependencies>
```
You’ll now have Struts, the Kafka producer/consumer client, and Kubernetes support for live environments.
2. Run Kafka Locally with Docker
Create a docker-compose.yml for quick local setup:
```yaml
version: '2'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
    ports:
      - "2181:2181"
  kafka:
    image: confluentinc/cp-kafka:latest
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_AUTO_CREATE_TOPICS_ENABLE: "true"
      # Required for a single-broker setup; the image defaults to a replication factor of 3
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```
Run:

```shell
docker-compose up -d
```

This gives you a Kafka broker at localhost:9092 for development.
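Before wiring up a producer, it can save debugging time to confirm the broker port is actually open. The helper below is a small sketch of my own, not part of the Kafka API; it only probes TCP reachability:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class BrokerCheck {
    // Returns true if a TCP connection to host:port succeeds within timeoutMs.
    // This only proves the port is open, not that a healthy Kafka broker is behind it.
    public static boolean isReachable(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }
}
```

Calling `BrokerCheck.isReachable("localhost", 9092, 1000)` right after `docker-compose up -d` tells you whether the container has finished binding its port.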
3. Create Dev Config: kafka-dev.properties
Save this under src/main/resources:

```properties
kafka.bootstrapServers=localhost:9092
kafka.clientId=struts-dev-client
kafka.topic=dev-topic
```
Then use these properties in your Struts actions:
```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

// Load environment-specific settings from the classpath
Properties props = new Properties();
props.load(getClass().getResourceAsStream("/kafka-dev.properties"));

Properties kafkaProps = new Properties();
kafkaProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, props.getProperty("kafka.bootstrapServers"));
kafkaProps.put(ProducerConfig.CLIENT_ID_CONFIG, props.getProperty("kafka.clientId"));
kafkaProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
kafkaProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

// try-with-resources closes the producer even if send() throws
try (Producer<String, String> producer = new KafkaProducer<>(kafkaProps)) {
    producer.send(new ProducerRecord<>(props.getProperty("kafka.topic"), "Hello, Kafka!"));
}
```
That's all you need for local testing.
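To keep the same action code across environments, one option is to select the properties file from an environment variable. The `APP_ENV` variable and the file-selection logic below are my own convention, not anything mandated by Struts or Kafka; treat it as a minimal sketch:

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class KafkaConfigLoader {
    // Pick the properties file by environment; defaults to dev when APP_ENV is unset.
    // The APP_ENV name and the file paths are assumptions for this sketch.
    public static String configFileFor(String appEnv) {
        return "prod".equalsIgnoreCase(appEnv) ? "/kafka-prod.properties" : "/kafka-dev.properties";
    }

    // Load the chosen file from the classpath, failing loudly if it is missing.
    public static Properties load(String resourcePath) throws IOException {
        Properties props = new Properties();
        try (InputStream in = KafkaConfigLoader.class.getResourceAsStream(resourcePath)) {
            if (in == null) throw new IOException("Missing classpath resource: " + resourcePath);
            props.load(in);
        }
        return props;
    }
}
```

In a Struts action you would then call `KafkaConfigLoader.load(KafkaConfigLoader.configFileFor(System.getenv("APP_ENV")))` instead of hard-coding the dev file name.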
4. Prepare Kubernetes for Production
Keep configs out of code by using Kubernetes:
```shell
kubectl create configmap kafka-config \
  --from-literal=KAFKA_BOOTSTRAP_SERVERS=kafka-service:9092 \
  --from-literal=KAFKA_TOPIC=prod-topic \
  --from-literal=KAFKA_CLIENT_ID=struts-prod-client
```
5. Create Production Config: kafka-prod.properties
Put this in your JAR under resources:

```properties
kafka.bootstrapServers=${KAFKA_BOOTSTRAP_SERVERS}
kafka.clientId=${KAFKA_CLIENT_ID}
kafka.topic=${KAFKA_TOPIC}
```

The Kafka client code stays identical; only the properties source changes. Note that java.util.Properties does not expand ${...} placeholders on its own, so your app must resolve them against environment variables at startup.
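Since plain Properties loading leaves ${...} tokens unresolved, you need a small substitution step. The resolver below is a minimal sketch of one way to do it; the class name and method are my own, and in production the map argument would be System.getenv():

```java
import java.util.Map;
import java.util.Properties;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PlaceholderResolver {
    private static final Pattern VAR = Pattern.compile("\\$\\{([^}]+)\\}");

    // Replace ${NAME} tokens in every property value with entries from the given map.
    // Unknown names are left as-is so misconfiguration is visible rather than silent.
    public static Properties resolve(Properties raw, Map<String, String> env) {
        Properties out = new Properties();
        for (String key : raw.stringPropertyNames()) {
            Matcher m = VAR.matcher(raw.getProperty(key));
            StringBuffer sb = new StringBuffer();
            while (m.find()) {
                String value = env.getOrDefault(m.group(1), m.group(0));
                m.appendReplacement(sb, Matcher.quoteReplacement(value));
            }
            m.appendTail(sb);
            out.setProperty(key, sb.toString());
        }
        return out;
    }
}
```

In the Kubernetes deployment, calling `PlaceholderResolver.resolve(props, System.getenv())` right after `props.load(...)` turns the templated values into the real broker address, client id, and topic.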
6. Kubernetes Deployment Snippet
Inject environment variables in your pod spec:
```yaml
env:
  - name: KAFKA_BOOTSTRAP_SERVERS
    valueFrom:
      configMapKeyRef:
        name: kafka-config
        key: KAFKA_BOOTSTRAP_SERVERS
  - name: KAFKA_CLIENT_ID
    valueFrom:
      configMapKeyRef:
        name: kafka-config
        key: KAFKA_CLIENT_ID
  - name: KAFKA_TOPIC
    valueFrom:
      configMapKeyRef:
        name: kafka-config
        key: KAFKA_TOPIC
```
Then, when your Struts app runs in Kubernetes, it uses production-ready values automatically.
7. Why This Works Well
- Fast local feedback: Docker and dev-properties let you test Kafka easily.
- Secure production: Kubernetes holds your config, not your code.
- One codebase: Same producer/consumer logic works everywhere.
- Clear environment separation: Dev vs prod configs are distinct and isolated.
Pairing Struts with Docker for local Kafka testing and Kubernetes for production deployment offers a robust messaging setup. Whether you’re publishing events, logging activity, or feeding analytics, this configuration is clean, secure, and scalable. Now you’ve got seamless Kafka integration in your Struts application across development and production environments!