Microservice Communication with Kafka

Kafka with Node.js


Overview

Communication between microservices can be categorized into two types: synchronous and asynchronous. While REST and gRPC are synchronous (the caller waits for a response), Kafka-based communication is asynchronous and does not wait for a response. Here we will set up Kafka locally with Node.js and pass messages between services through a Kafka topic.

Installation

To install Apache Kafka, Java is the only prerequisite. Make sure at least Java 11 is installed on your system.

  • Download the latest version of Apache Kafka from the official site under Binary Downloads.

  • Extract the contents (double-click in the Finder on macOS) to a directory of your choice.

  • Now you have to start two services, ZooKeeper and Kafka, from the downloaded binaries.

  • Suppose the extracted directory is kafka_2.13-3.6.0 (the directory name may differ depending on the Kafka version you downloaded). Go to the directory where you extracted Kafka and run the following command to start ZooKeeper.

      kafka_2.13-3.6.0/bin/zookeeper-server-start.sh kafka_2.13-3.6.0/config/zookeeper.properties
    
  • Open another Terminal window and run the following command to start the local Kafka server.

      kafka_2.13-3.6.0/bin/kafka-server-start.sh kafka_2.13-3.6.0/config/server.properties

  • Kafka is now running on localhost:9092. Keep both terminals open at all times; closing either one shuts down ZooKeeper or the Kafka broker.

  • You can also write a single executable script that runs both commands, as appropriate for your operating system.

  • You can read more about the downloaded Kafka binaries on the official site.
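As one option, the two startup commands above can be combined into a single launcher script. This is a sketch; the script name start-kafka.sh and the KAFKA_DIR default are assumptions, so adjust them to your install.

```shell
#!/usr/bin/env bash
# start-kafka.sh -- start ZooKeeper and the Kafka broker with one command
# (a sketch; adjust KAFKA_DIR to match your extracted directory)
KAFKA_DIR="${KAFKA_DIR:-kafka_2.13-3.6.0}"

if [ -d "$KAFKA_DIR" ]; then
  # Start ZooKeeper in the background, give it a moment, then start the broker
  "$KAFKA_DIR/bin/zookeeper-server-start.sh" "$KAFKA_DIR/config/zookeeper.properties" &
  sleep 5
  "$KAFKA_DIR/bin/kafka-server-start.sh" "$KAFKA_DIR/config/server.properties"
else
  echo "Kafka directory '$KAFKA_DIR' not found; set KAFKA_DIR to your install path" >&2
fi
```

Note that running ZooKeeper in the background replaces the two-terminal setup, so the logs of both services will interleave in a single terminal.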

Setup in Node.js

We must install a Kafka client library for Node.js to use our locally running Kafka server. For this tutorial, we will install kafkajs from npm:

      npm install kafkajs

To utilize Kafka for microservice communication, we need to write some boilerplate for the different roles in the Kafka client library: producer, consumer, and admin. The producer publishes messages to a topic, a consumer then consumes those messages, and the admin client lets users configure their Kafka cluster.

In the admin.js file, we need to provide the address of the running Kafka broker during initialization; in this case, our local Kafka instance is running at localhost:9092. We can also create a topic there and use it in the producer and consumer files for sending and receiving messages. For now, we just log received messages to the console.

Now that our basic template is ready, we just need driver code to use our sendMessage function. For that, we create index.js, which acts as either producer or consumer based on the argv passed from the command line. In producer mode, it continuously sends messages using sendMessage; in consumer mode, it subscribes to the topic and starts listening with initConsumer.

Now open two terminals and run the following commands.

  • npm start producer

  • npm start consumer

Note: you may also need to add a start command to the scripts section of your package.json file, matching your folder structure, so that it runs the index.js file.
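The scripts section could look like this (a sketch; the file layout is an assumption):

```json
{
  "scripts": {
    "start": "node index.js"
  }
}
```

Depending on your npm version, extra arguments may need to be passed after a double dash, i.e. npm start -- producer.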

First, start the consumer using the above command, then start the producer, and watch the messages pass between these two services.

Hope this helps you get some understanding of Kafka with Node.js. Let me know if you have any other questions.

Resources

  1. https://kafka.js.org/

  2. https://kafka.apache.org/