Logstash: 5 Easy Steps To Consume Data From RabbitMQ Into Elasticsearch


Introduction

Context

Description

The logs processing system architecture

1- Logs Publisher

npm install
sudo chmod +x send.js

2- RabbitMQ

RabbitMQ is the most widely deployed open source message broker. — https://www.rabbitmq.com/

docker run -p 5672:5672 rabbitmq
RabbitMQ Docker Instance Running

3- Logstash With A Log Processing Queue

cd /etc/logstash/conf.d/
sudo nano logstash-rabbitmq.conf
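The contents of `logstash-rabbitmq.conf` did not survive in this copy of the article, but a minimal pipeline for this setup might look like the sketch below. The queue name and the JSON filter are assumptions (they must match whatever the publisher sends); the index name matches the one queried in step 5.

```conf
input {
  rabbitmq {
    host => "localhost"
    port => 5672
    queue => "logs"       # assumed queue name; must match the publisher
    durable => false
  }
}

filter {
  # Assumes the publisher sends JSON strings; parse them into fields.
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "logstash_rabbit_mq_hello"
  }
}
```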
cd /usr/share/logstash/
sudo bin/logstash -f /etc/logstash/conf.d/logstash-rabbitmq.conf
Logstash Pipeline Launched

4- Elasticsearch Index To Store The Processed Logs
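Logstash's `elasticsearch` output will create the index automatically on first write, but you can create it up front with an explicit mapping. The field names below are assumptions matching whatever JSON the publisher emits, not taken from the article:

```sh
# Hypothetical mapping; adjust the fields to your actual log payload.
curl -XPUT "127.0.0.1:9200/logstash_rabbit_mq_hello" \
  -H 'Content-Type: application/json' -d'
{
  "mappings": {
    "properties": {
      "level":     { "type": "keyword" },
      "message":   { "type": "text" },
      "timestamp": { "type": "date" }
    }
  }
}'
```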

5- Now The Whole Magic Can Happen

./send.js
Logs Publisher Run
Logstash Pipeline Parsing Incoming Data From RabbitMQ
curl -XGET "127.0.0.1:9200/logstash_rabbit_mq_hello/_search?pretty"

Conclusion

Certified AWS Solutions Architect, Full-Stack Software Engineer & DevOps Engineer. I like solving challenging software engineering problems and building amazing solutions.