Centralised Logging and Monitoring of a Distributed System and Application using ELK

Written by IshwarChandra | Published 2018/07/14
Tech Story Tags: elasticsearch | kibana | monitoring | logging | aws


Introduction

Nowadays, centralized logging and monitoring have become an integral part of our technology stack. They empower organizations to gain important insights into operations, security, and infrastructure. We have different types of service-specific metrics and logs: metrics and log files from distributed web servers, applications, and operating systems can be collected in one place and analyzed together for a centralized overview and useful insights.

There are some very popular SaaS-based logging and monitoring tools, e.g. Datadog, Splunk, and Loggly, among many others.

Implementation

The ELK Stack is the most commonly used open-source tool for this purpose. Users can run their own ELK deployment or use a managed Elasticsearch service, e.g. Amazon Elasticsearch Service or Elastic Cloud.

ELK is used because it helps in aggregating log and audit data from different sources, processing and enriching this data (Logstash, Beats), storing it in one central data store (Elasticsearch), and providing analysis and visualization tools (Kibana).

Let us consider a simple LAMP-based application running on servers behind an ELB on AWS, where we want to collect OS metrics and Apache logs.

For the ELK stack, we will use the AWS-managed Amazon Elasticsearch Service, which comes with Kibana pre-installed, while Logstash and Beats (Filebeat, Metricbeat, Packetbeat, Winlogbeat, Auditbeat, Heartbeat) act as data shippers.
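As a minimal sketch of the Apache log side, a Filebeat configuration using the `apache2` module (the module name in the Filebeat 6.x releases current at the time of writing) could look like the following; the domain endpoint and log paths are placeholders, not real values:

```yaml
# filebeat.yml — minimal sketch; endpoint and log paths are placeholders
filebeat.modules:
  - module: apache2
    access:
      enabled: true
      var.paths: ["/var/log/httpd/access_log*"]
    error:
      enabled: true
      var.paths: ["/var/log/httpd/error_log*"]

output.elasticsearch:
  # Amazon Elasticsearch Service domain endpoint (hypothetical)
  hosts: ["https://search-my-domain.us-east-1.es.amazonaws.com:443"]
```

With the module enabled, Filebeat tails the Apache access and error logs and ships parsed events straight to the Elasticsearch endpoint over HTTPS.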

“Beats is the platform for single-purpose data shippers. They install as lightweight agents and send data from hundreds or thousands of machines to Logstash or Elasticsearch.” Detailed information can be found in the official Beats documentation.
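For the OS metrics side of our scenario, a Metricbeat configuration using its `system` module might be sketched as follows; again, the endpoint is a placeholder:

```yaml
# metricbeat.yml — minimal sketch; endpoint is a placeholder
metricbeat.modules:
  - module: system
    metricsets: ["cpu", "load", "memory", "network", "filesystem"]
    period: 10s

output.elasticsearch:
  hosts: ["https://search-my-domain.us-east-1.es.amazonaws.com:443"]
```

Every 10 seconds, Metricbeat samples CPU, load, memory, network, and filesystem statistics on each web server and indexes them into Elasticsearch.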

The architecture above contains two web servers running behind an ELB, each with Beats agents installed. All the required configuration is baked into an AMI (Amazon Machine Image). One can also configure instances dynamically using AWS CloudFormation.
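To illustrate the baked-AMI approach, a CloudFormation fragment along these lines could launch the web servers from a pre-built image and start the agents at boot; the AMI ID and resource names here are hypothetical:

```yaml
# CloudFormation fragment — illustrative sketch; AMI ID and names are hypothetical
Resources:
  WebServerLaunchConfig:
    Type: AWS::AutoScaling::LaunchConfiguration
    Properties:
      ImageId: ami-0123456789abcdef0   # AMI with Filebeat/Metricbeat already installed
      InstanceType: t2.micro
      UserData:
        Fn::Base64: |
          #!/bin/bash
          # Enable and start the pre-installed Beats agents on boot
          systemctl enable filebeat metricbeat
          systemctl start filebeat metricbeat
```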

Beats collect logs and metrics, parse them, and send them to Elasticsearch. Amazon Elasticsearch Service is configured in high-availability mode, and there is an NGINX proxy server in front of Kibana.
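The NGINX proxy in front of Kibana can be sketched roughly as below; the server name, certificate paths, and Elasticsearch domain are placeholders for your own values:

```nginx
# /etc/nginx/conf.d/kibana.conf — illustrative sketch; hostnames and
# certificate paths are placeholders
server {
    listen 443 ssl;
    server_name kibana.example.internal;

    ssl_certificate     /etc/nginx/ssl/kibana.crt;
    ssl_certificate_key /etc/nginx/ssl/kibana.key;

    location / {
        # Forward requests to the Kibana plugin on the Amazon Elasticsearch domain
        proxy_pass https://search-my-domain.us-east-1.es.amazonaws.com/_plugin/kibana/;
        proxy_set_header Host $proxy_host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

This lets users inside the VPN reach Kibana through a single internal hostname while the Elasticsearch domain itself stays unexposed.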

The complete Elasticsearch solution sits inside a private subnet, since it is intended for use within the organization, so Kibana is accessible only through the VPN. This further increases the security of the logs and metrics.

Conclusion

Log management and analytics enable enterprises to collect, manage, and analyse log data in order to improve their application and infrastructure management and monitoring. ELK collects log data in real time and enables analytics such as application troubleshooting and root cause analysis, application monitoring, IT infrastructure monitoring and troubleshooting, and application analytics.

Originally published at https://www.linkedin.com on January 8, 2018.
