Install Kafka on Ubuntu
Kafka is a highly scalable, distributed, and fault-tolerant streaming platform and message broker. Kafka can help different applications, such as the Elastic Stack, work together by creating real-time data pipelines. Kafka also buffers incoming data, which ensures data safety: if the receiving system is down, Kafka holds the data and delivers it to the application once it is back up. Kafka works as a messaging system by reading and writing streams of data.
In this blog I am going to explain the installation steps for Kafka on Ubuntu and give you a brief introduction to Kafka and why it is important. I have picked Kafka as a topic because it is widely used for streaming and can also hold data in a buffer. We can use it to extend the capabilities of the Elastic Stack to the next level.
I will explain Kafka in more detail in coming blogs, but for now let's get to the Ubuntu installation. To install it on Ubuntu we need to follow these steps:
1) Update the package index by typing the following command:
sudo apt-get update
2) Install Java if it is not installed on your machine:
sudo apt-get install default-jre
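Once the installation finishes, you can quickly confirm that Java is available (the exact version string will vary with the default JRE of your Ubuntu release):
java -version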
3) After installing Java we need to install ZooKeeper, an open-source service used to synchronize configuration information across the nodes of a distributed system; it also helps to detect failed nodes and to elect leader nodes.
sudo apt-get install zookeeperd
ZooKeeper listens on port 2181 by default, so we should check whether we can connect on this port using the telnet command, as shown below.
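A minimal connectivity check could look like this (assuming ZooKeeper is running locally on the default port; if telnet is missing, it can be installed with sudo apt-get install telnet):
telnet localhost 2181
If the connection succeeds you will see a "Connected to localhost" message; press Ctrl+] and then type quit to exit telnet.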
4) Now install Kafka. For that we first need to download the Kafka binaries:
wget "http://mirror.bit.edu.cn/apache/kafka/1.0.0/kafka_2.11-1.0.0.tgz" -O ~/Downloads/kafka.tgz
5) After that we need to create a directory for Kafka:
sudo mkdir /opt/kafka
6) Now extract the downloaded Kafka archive into the kafka directory using the following command:
sudo tar -xvzf ~/Downloads/kafka.tgz --directory /opt/kafka --strip-components 1
Under the /opt/kafka/bin directory you can see many script files with which we can perform different operations, such as starting the Kafka server, creating a new topic, listing existing topics, publishing messages, and subscribing to those messages. Let's see how we can achieve this in the coming steps.
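For a quick look at what is available, you can simply list the directory (the exact set of scripts depends on the Kafka version you downloaded):
ls /opt/kafka/bin
You should see scripts such as kafka-server-start.sh, kafka-topics.sh, kafka-console-producer.sh, and kafka-console-consumer.sh, which we will use in the following steps.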
7) Once we have extracted the Kafka archive into the kafka directory, let's test the installation by starting the Kafka server:
sudo /opt/kafka/bin/kafka-server-start.sh /opt/kafka/config/server.properties
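If the broker starts correctly, the log output should end with a "started" message. From another terminal you can also confirm that it is listening on its default port 9092 (this assumes the netcat utility is installed):
nc -zv localhost 9092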
8) Now create a new topic using the following command:
/opt/kafka/bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test
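To verify that the topic was created with the expected settings, you can describe it using the same kafka-topics.sh script; the partition and replication details should match what we passed above:
/opt/kafka/bin/kafka-topics.sh --describe --zookeeper localhost:2181 --topic test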
9) To list all the topics, run the following command:
/opt/kafka/bin/kafka-topics.sh --list --zookeeper localhost:2181
10) After creating the topic, let's start publishing messages on the test topic which we created earlier:
/opt/kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test
11) Now we need to start the consumer as well, which will receive these messages:
/opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning
Now that we have started the producer and the consumer, let's type some text in the producer window and see it appear on the consumer screen.
Type messages in the producer window:
/kafka$ /opt/kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test
>How are you
>Hey welcome and lets send some messages through Kafka
See the messages on the consumer screen:
kafka$ /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning
How are you
Hey welcome and lets send some messages through Kafka
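When you are done experimenting, stop the consumer and the producer with Ctrl+C and shut down the broker cleanly. A possible cleanup, assuming the paths used above (note that the topic deletion only takes effect if delete.topic.enable is set to true in server.properties):
/opt/kafka/bin/kafka-topics.sh --delete --zookeeper localhost:2181 --topic test
sudo /opt/kafka/bin/kafka-server-stop.sh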
This is quite a basic blog on Kafka which covers the installation along with a basic introduction to Kafka. In coming blogs I will cover more advanced topics on Kafka. In case of any doubt, please leave a comment.