Integrating Apache Kafka into a Micronaut Application with Docker and Kubernetes
Integrating Apache Kafka into your Micronaut application can enhance its ability to handle real-time data streams and asynchronous communication. This guide walks you through setting up Kafka using Docker for development and Kubernetes for production, leveraging application.yml files tailored for each environment.

1. Adding Kafka and Kubernetes Dependencies

First, ensure your pom.xml includes the necessary dependencies:
<dependencies>
  <!-- Micronaut Kafka integration -->
  <dependency>
    <groupId>io.micronaut.kafka</groupId>
    <artifactId>micronaut-kafka</artifactId>
  </dependency>

  <!-- Micronaut Kubernetes client -->
  <dependency>
    <groupId>io.micronaut.kubernetes</groupId>
    <artifactId>micronaut-kubernetes-client</artifactId>
  </dependency>
</dependencies>
These dependencies enable Kafka communication and Kubernetes integration within your Micronaut application.
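With these in place, producers and consumers can be declared with annotations from the micronaut-kafka module. The following is a minimal sketch, assuming an illustrative orders topic and plain String payloads; the OrderProducer and OrderListener names are placeholders, not part of any library API:
import io.micronaut.configuration.kafka.annotation.KafkaClient;
import io.micronaut.configuration.kafka.annotation.KafkaKey;
import io.micronaut.configuration.kafka.annotation.KafkaListener;
import io.micronaut.configuration.kafka.annotation.OffsetReset;
import io.micronaut.configuration.kafka.annotation.Topic;

// OrderProducer.java -- Micronaut generates the implementation at compile time.
@KafkaClient
public interface OrderProducer {

    @Topic("orders")
    void send(@KafkaKey String orderId, String payload);
}

// OrderListener.java -- consumes the same topic; offsetReset controls where new consumer groups start.
@KafkaListener(offsetReset = OffsetReset.EARLIEST)
public class OrderListener {

    @Topic("orders")
    public void receive(@KafkaKey String orderId, String payload) {
        System.out.printf("Received order %s: %s%n", orderId, payload);
    }
}
Because the producer is just an interface, there is no boilerplate to write: injecting OrderProducer anywhere in the application gives you a ready-to-use Kafka client.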

2. Setting Up Kafka with Docker for Development

For development, you can use Docker to run Kafka and Zookeeper. Create a docker-compose.yml file with the following content:
version: '2'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper
    ports:
      - 2181:2181
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000

  kafka:
    image: confluentinc/cp-kafka
    depends_on:
      - zookeeper
    ports:
      - 9092:9092
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
Start Kafka and Zookeeper:
docker-compose up
This setup provides a local Kafka broker for development: other containers on the Compose network reach it at kafka:29092, while applications running on your host machine (such as your Micronaut app) connect through the advertised PLAINTEXT_HOST listener at localhost:9092.

3. Configuring application-dev.yml

Create a file named application-dev.yml in src/main/resources/ with the following content:
micronaut:
  application:
    name: kafka-app

kafka:
  bootstrap:
    servers: localhost:9092
This configuration sets up the Kafka client to connect to the local broker.
To run the application with the development profile:
./mvnw mn:run -Dmicronaut.environments=dev
Micronaut will automatically use the application-dev.yml file for configuration.
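To exercise the dev setup end to end, you could expose a small HTTP endpoint that publishes through the OrderProducer interface sketched in section 1. This is a sketch only, assuming the standard micronaut-http-server dependency is on the classpath; the route and class name are illustrative:
import io.micronaut.http.HttpStatus;
import io.micronaut.http.annotation.Body;
import io.micronaut.http.annotation.Controller;
import io.micronaut.http.annotation.Post;

@Controller("/orders")
public class OrderController {

    // Injected implementation of the @KafkaClient interface generated at compile time.
    private final OrderProducer producer;

    public OrderController(OrderProducer producer) {
        this.producer = producer;
    }

    @Post("/{id}")
    public HttpStatus publish(String id, @Body String payload) {
        producer.send(id, payload); // publish to the "orders" topic
        return HttpStatus.ACCEPTED;
    }
}
Posting a payload to /orders/{id} should then result in the message being logged by OrderListener, confirming the broker connection works.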

4. Preparing for Production with Kubernetes

In a production environment, it's advisable to externalize configuration such as the Kafka bootstrap servers into Kubernetes ConfigMaps, and to keep genuinely sensitive values (credentials, TLS keys) in Kubernetes Secrets.

4.1. Create Kubernetes ConfigMap for Kafka Bootstrap Servers

apiVersion: v1
kind: ConfigMap
metadata:
  name: kafka-config
data:
  KAFKA_BOOTSTRAP_SERVERS: kafka-broker:9092
This ConfigMap keeps the Kafka bootstrap address out of your application image and makes it available to your pods as an environment variable.
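ConfigMaps are stored in plain text, so anything genuinely sensitive, such as SASL credentials, belongs in a Secret instead. A minimal sketch, with illustrative key names and values:
apiVersion: v1
kind: Secret
metadata:
  name: kafka-credentials
type: Opaque
stringData:
  KAFKA_SASL_USERNAME: my-user      # replace with your real username
  KAFKA_SASL_PASSWORD: change-me    # replace with your real password
You can reference these keys from your Deployment with secretKeyRef, in the same way the ConfigMap is referenced below.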

5. Configuring application-prod.yml

Create a file named application-prod.yml in src/main/resources/ with the following content:
micronaut:
  application:
    name: kafka-app

kafka:
  bootstrap:
    servers: ${KAFKA_BOOTSTRAP_SERVERS}
This configuration uses environment variables provided by Kubernetes to set up the Kafka client for production.

6. Deploying Micronaut Application to Kubernetes

When deploying your Micronaut application to Kubernetes, ensure the environment variables are correctly set using the previously created ConfigMap:
env:
  - name: MICRONAUT_ENVIRONMENTS
    value: prod
  - name: KAFKA_BOOTSTRAP_SERVERS
    valueFrom:
      configMapKeyRef:
        name: kafka-config
        key: KAFKA_BOOTSTRAP_SERVERS
This setup ensures that your application retrieves the necessary configuration from Kubernetes resources during deployment.
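For context, here is roughly where that env block sits inside a Deployment manifest; the image reference and labels are placeholders for your own build:
apiVersion: apps/v1
kind: Deployment
metadata:
  name: kafka-app
spec:
  replicas: 1
  selector:
    matchLabels:
      app: kafka-app
  template:
    metadata:
      labels:
        app: kafka-app
    spec:
      containers:
        - name: kafka-app
          image: registry.example.com/kafka-app:latest   # placeholder image
          ports:
            - containerPort: 8080
          env:
            - name: MICRONAUT_ENVIRONMENTS
              value: prod
            - name: KAFKA_BOOTSTRAP_SERVERS
              valueFrom:
                configMapKeyRef:
                  name: kafka-config
                  key: KAFKA_BOOTSTRAP_SERVERS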
By separating your Kafka configuration into application-dev.yml and application-prod.yml, you can maintain clean and secure environments for development and production. Utilizing Docker for local development and Kubernetes for production allows for scalable and manageable deployments.
Remember to:
  • Use Docker to run a local Kafka broker during development.
  • Store configuration data in Kubernetes ConfigMaps.
  • Reference these Kubernetes resources in your application's environment variables.
This approach provides a robust and flexible setup for integrating Kafka into your Micronaut applications across different environments.
