AWS Events Analysis with ELK by@mlabouardy


Recording your AWS environment activity is a must-have: it lets you monitor your environment’s security continuously and detect suspicious or undesirable activity in near real time, potentially saving you thousands of dollars. Luckily, AWS offers a service called CloudTrail that allows you to achieve exactly that. It records events across all AWS regions and logs every API call to a single S3 bucket.


From there, you can set up an analysis pipeline using the popular ELK logging stack (Elasticsearch, Logstash & Kibana) to read those logs, parse and index them, visualise them in a single dynamic dashboard, and even take action accordingly:


To get started, create an AMI with the ELK components installed and preconfigured. The AMI will be based on an Ubuntu image:
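A minimal Packer template for this step could look like the sketch below. The region, instance type, AMI name, and script filename are assumptions, not the author’s exact configuration; it looks up the most recent official Ubuntu 16.04 AMI and runs a provisioning script against it:

```json
{
  "builders": [
    {
      "type": "amazon-ebs",
      "region": "us-east-1",
      "instance_type": "t2.large",
      "source_ami_filter": {
        "filters": {
          "name": "ubuntu/images/hvm-ssd/ubuntu-xenial-16.04-amd64-server-*",
          "virtualization-type": "hvm",
          "root-device-type": "ebs"
        },
        "owners": ["099720109477"],
        "most_recent": true
      },
      "ssh_username": "ubuntu",
      "ami_name": "elk-stack-{{timestamp}}"
    }
  ],
  "provisioners": [
    {
      "type": "shell",
      "script": "setup.sh",
      "execute_command": "sudo -E bash '{{.Path}}'"
    }
  ]
}
```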

To provision the AMI, we will use the following shell script:
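A provisioning script along these lines would install the three ELK components from the Elastic APT repository (the 6.x repository and Kibana binding are assumptions; pick the versions you need):

```bash
#!/bin/bash
set -e

# Java is required by Elasticsearch and Logstash
apt-get update
apt-get install -y openjdk-8-jre apt-transport-https wget

# Add the Elastic APT repository
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | apt-key add -
echo "deb https://artifacts.elastic.co/packages/6.x/apt stable main" \
  > /etc/apt/sources.list.d/elastic-6.x.list
apt-get update

# Install the ELK components
apt-get install -y elasticsearch logstash kibana

# Bind Kibana to all interfaces so the dashboard is reachable remotely
sed -i 's/#server.host: "localhost"/server.host: "0.0.0.0"/' /etc/kibana/kibana.yml

# Start the services on boot
systemctl enable elasticsearch logstash kibana
```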

Now the template is defined, bake a new AMI with Packer:
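Assuming the template above is saved as `template.json` (a hypothetical filename), the build is a single command:

```bash
packer build template.json
```

Packer spins up a temporary EC2 instance, runs the provisioning script, snapshots the result as a new AMI, and prints the AMI ID at the end.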

Once the AMI is created, create a new EC2 instance based on it with Terraform. Make sure to grant the instance S3 read permissions so it can fetch the CloudTrail logs from the bucket:
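A Terraform sketch of this might look as follows. The bucket name, AMI ID, and resource names are placeholders: the instance profile attaches an IAM role that allows read-only access to the CloudTrail bucket:

```hcl
resource "aws_iam_role" "elk_role" {
  name = "elk-role"

  # Allow EC2 to assume this role
  assume_role_policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": { "Service": "ec2.amazonaws.com" },
    "Action": "sts:AssumeRole"
  }]
}
EOF
}

resource "aws_iam_role_policy" "s3_read" {
  name = "cloudtrail-s3-read"
  role = aws_iam_role.elk_role.id

  # Read-only access to the CloudTrail bucket (name is a placeholder)
  policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Action": ["s3:GetObject", "s3:ListBucket"],
    "Resource": [
      "arn:aws:s3:::my-cloudtrail-logs",
      "arn:aws:s3:::my-cloudtrail-logs/*"
    ]
  }]
}
EOF
}

resource "aws_iam_instance_profile" "elk_profile" {
  name = "elk-profile"
  role = aws_iam_role.elk_role.name
}

resource "aws_instance" "elk" {
  ami                  = "ami-xxxxxxxx" # the AMI baked with Packer
  instance_type        = "t2.large"
  iam_instance_profile = aws_iam_instance_profile.elk_profile.name

  tags = {
    Name = "elk-stack"
  }
}
```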

Issue the following command to provision the infrastructure:
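With Terraform, that is the usual init/plan/apply sequence:

```bash
terraform init
terraform plan
terraform apply
```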

Head back to the AWS Management Console, navigate to CloudTrail, and click the “Create Trail” button:


Give it a name and apply the trail to all AWS regions:


Click “Create”, and the trail should be created as follows:


Next, configure Logstash to read the CloudTrail logs from S3 at a regular interval. The geoip filter enriches each event with the geographical location of the caller, based on the sourceIPAddress field. The events are then shipped to Elasticsearch automatically:
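A pipeline configuration along these lines would do it (the bucket name, region, polling interval, and index pattern are assumptions; CloudTrail writes each S3 object as JSON with a `Records` array, so the `split` filter emits one event per API call):

```
input {
  s3 {
    bucket   => "my-cloudtrail-logs"   # your CloudTrail bucket
    prefix   => "AWSLogs/"
    region   => "us-east-1"
    interval => 60                     # poll the bucket every 60 seconds
    codec    => "json"
  }
}

filter {
  # CloudTrail files contain a "Records" array; emit one event per record
  split {
    field => "Records"
  }

  # Add geographical information based on the caller's IP address
  geoip {
    source => "[Records][sourceIPAddress]"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "cloudtrail-%{+YYYY.MM.dd}"
  }
}
```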

In order for the changes to take effect, restart Logstash with the command below:
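On a systemd-based Ubuntu instance like the one provisioned above, that is:

```bash
sudo systemctl restart logstash
```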

A new index should be created in Elasticsearch (http://IP:9200/_cat/indices?v):
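You can verify this from the command line (replace `IP` with your instance’s address):

```bash
curl "http://IP:9200/_cat/indices?v"
```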


In Kibana, create a new index pattern that matches the index format used to store the logs:


After creating the index pattern, we can start exploring our CloudTrail events:


Now that the processed data is in Elasticsearch, let’s build some graphs. We will use the Map visualization in Kibana to monitor geo access to our AWS environment:


You can now see where the environment is being accessed from:


Next, create more widgets to display information about the identity of the user, the user agent, and the actions taken by the user. The result will look something like this:


You can take this further and set up alerts on specific events (for example, someone accessing your environment from an unexpected location) so you are notified in near real time.
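One way to do this (not covered in the original setup) is a tool such as ElastAlert, which queries Elasticsearch on a schedule and fires alerts on matching documents. A hypothetical rule for the unexpected-location case, with the index pattern, country, and email address as placeholders, might look like:

```yaml
# Hypothetical ElastAlert rule: alert on API calls from outside the expected country
name: unexpected-location-access
type: any
index: cloudtrail-*

filter:
  - query:
      query_string:
        query: "NOT geoip.country_name: Ireland"

alert:
  - email
email:
  - "ops@example.com"
```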

The full code can be found on my GitHub. Make sure to drop your comments, feedback, or suggestions below — or connect with me directly on Twitter @mlabouardy.
