Logstash: 5 Easy Steps To Consume Data From RabbitMQ Into Elasticsearch

Photo by Mimi Thian on Unsplash




The log-processing system architecture

1- Logs Publisher

npm install
sudo chmod +x send.js

2- RabbitMQ

RabbitMQ is the most widely deployed open source message broker. — https://www.rabbitmq.com/

docker run -p 5672:5672 rabbitmq
RabbitMQ Docker Instance Running

3- Logstash With A Log Processing Queue

cd /etc/logstash/conf.d/
sudo nano logstash-rabbitmq.conf
cd /usr/share/logstash/
sudo bin/logstash -f /etc/logstash/conf.d/logstash-rabbitmq.conf
Logstash Pipeline Launched
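The logstash-rabbitmq.conf file edited above isn't reproduced in the article. A minimal sketch of such a pipeline, assuming a local broker, a queue named `logs`, and a local Elasticsearch node — all of these endpoints and names are assumptions:

```conf
input {
  rabbitmq {
    host    => "localhost"
    port    => 5672
    queue   => "logs"
    durable => false
    # the rabbitmq input decodes JSON payloads by default (codec => json)
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logs-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug } # also echo parsed events to the console
}
```

The date pattern in the index name makes Logstash write to a new daily index, which keeps old logs easy to delete or archive.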

4- Elasticsearch Index To Store The Processed Logs
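This step isn't detailed in the article. The simplest option is to let Logstash's elasticsearch output create the index on first write; to control field types yourself, you could create the index up front with an explicit mapping. A sketch of such a mapping body (to be sent with `curl -XPUT` to the index URL), assuming Elasticsearch 7.x and illustrative field names:

```json
{
  "mappings": {
    "properties": {
      "level":      { "type": "keyword" },
      "message":    { "type": "text" },
      "timestamp":  { "type": "date" },
      "@timestamp": { "type": "date" }
    }
  }
}
```

Mapping `level` as `keyword` rather than `text` allows exact-match filtering and aggregations on log levels.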

5- Now The Magic Can Happen

Logs Publisher Run
Logstash Pipeline Parsing Incoming Data From RabbitMQ
curl -XGET ""


Certified AWS Solutions Architect, Full-Stack Software Engineer & DevOps Engineer. I like solving challenging software engineering problems and building amazing solutions.