
Let's Export Cloudwatch Logs to ELK

by Callum McClean, July 13th, 2020

Cloudwatch is an AWS service that allows storage and monitoring of your application logs from an array of AWS services. This can be really useful for creating alerts to notify developers when a certain threshold of errors has been hit, but sometimes we might need to deeply analyse our logs, not only to spot errors but to find insights into our application and improve performance. This is where an ELK (Elasticsearch, Logstash, Kibana) stack can really outperform Cloudwatch. ELK allows us to collate data from any source, in any format, and to analyse, search and visualise the data in real time.

In this tutorial, we will export our logs from Cloudwatch into our ELK stack, step by step. For ease, I will perform all the steps through the AWS console; however, everything can also be completed using the command line interface.

Step 1 - Creating Logs

In order to export our logs, we will first need to create some. If you don’t already have logs in Cloudwatch from your existing infrastructure, you can create a basic Lambda function for the purposes of this tutorial. Using the console, I have created a Lambda function named ‘Lambda-CloudwatchToElastic’; on execution, it simply writes a line to Cloudwatch using the JavaScript function ‘console.log’.
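The function body itself is only a few lines. Here is a minimal sketch of such a handler; the log message is purely illustrative:

// index.js — a minimal handler for ‘Lambda-CloudwatchToElastic’.
// Anything passed to console.log ends up in the function's
// Cloudwatch log group automatically.
exports.handler = async (event) => {
  console.log('Generating a log entry for the ELK export tutorial');
  return { statusCode: 200 };
};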

Let's check everything is working. Navigate to the AWS Cloudwatch service and select ‘Log groups’ from the left navigation pane.

Once you are on the log groups page, you should see a log group for your AWS service, in this case our Lambda.

Open the log group by clicking on its name. You will be presented with a page listing the function’s log streams and recent log events.

Great! Now that we have a function that is creating logs, we can move onto the next step. Here we will set up a subscription to export the logs to our ELK stack.

Step 2 - Create a subscription filter

Starting from the log group page in the last step, we need to create a subscription filter that will determine which logs should be sent to our ELK stack. To do this, select ‘Create Elasticsearch Subscription Filter’ from the ‘Actions’ dropdown.

The filter will allow us to specify the destination ELK domain, along with a filter pattern that determines which logs to stream. First of all, select the ELK stack you wish to use.

In this case, the ELK stack is named ‘logs-to-elk’. You need to choose an ‘IAM Execution Role’ for the subscription filter to assume when it executes. This is important because if the role does not have the correct permissions, your subscription filter will not work.

The IAM role you select must meet the following criteria:

It must have lambda.amazonaws.com in the trust relationship.

It must include the following policy:
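For reference, a trust relationship satisfying the first requirement uses the standard IAM document format:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "lambda.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}

And a policy matching the second requirement looks like the following; the domain ARN is a placeholder, so substitute your own region, account ID and domain name:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "es:*",
      "Resource": "arn:aws:es:eu-west-1:123456789012:domain/logs-to-elk/*"
    }
  ]
}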

This policy grants the subscription filter permission to perform any Elasticsearch action on any path in your selected ELK domain.

Next, we need to apply a filter to the logs. AWS provides some default filter patterns; I will use ‘Amazon Lambda’ because that is where the logs are coming from. You can choose your filter based on which AWS service you are using, or create a custom filter.
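For reference, the preset Lambda pattern looks something like this (treat it as illustrative; the exact pattern shown in the console may differ):

[timestamp=*Z, request_id="*-*", event]

This matches the standard Lambda log line and names its three fields, so they arrive in Elasticsearch as timestamp, request_id and event.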

The UI allows us to test the filter we have created against our existing logs, so we can ensure that the subscription filter will behave as expected before deployment. Use the ‘Test Pattern’ section and check that you see the expected results.

Once you are happy with your filter, you can deploy it using the ‘Start Streaming’ button at the bottom of the page.

Step 3 - Verify the deployment

This step isn't strictly necessary; however, some might be interested in how this filter works. AWS has created a new Lambda function on our behalf. The function is triggered by the creation of new logs in our specified Cloudwatch log group.

You can see this function and inspect its code in more detail by navigating to the AWS Lambda home page. The function will appear in the list alongside any other functions you may have.

If you wish to view the code for a full understanding of the subscription filter, open up the Lambda function and scroll down to the ‘Function Code’ section of the page. Here you will find ‘index.js’, a single JavaScript file that handles the streaming of our logs to ELK.
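If you’d rather not read the generated source in full, its core behaviour can be sketched in a few lines. The following is a simplified illustration, not the actual AWS-generated code: Cloudwatch invokes the function with a gzipped, base64-encoded batch of log events, which it decodes before forwarding each event to the Elasticsearch endpoint.

// Simplified sketch of the generated subscription-filter function.
// The real index.js also builds and signs Elasticsearch bulk requests;
// here we just log each decoded event to keep the sketch self-contained.
const zlib = require('zlib');

exports.handler = (event, context, callback) => {
  // Cloudwatch delivers each batch as base64-encoded, gzipped JSON.
  const compressed = Buffer.from(event.awslogs.data, 'base64');
  zlib.gunzip(compressed, (err, decompressed) => {
    if (err) return callback(err);
    const batch = JSON.parse(decompressed.toString('utf8'));
    // In the real function, each logEvent becomes a bulk-index document
    // POSTed to the ELK domain's endpoint.
    batch.logEvents.forEach((logEvent) =>
      console.log(logEvent.id, logEvent.timestamp, logEvent.message)
    );
    callback(null, 'Processed ' + batch.logEvents.length + ' events');
  });
};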

Step 4 - View the logs in Kibana

Once you have completed these steps, AWS should be streaming logs to your ELK stack in real time. Let's head over to Kibana to verify we are receiving the log data from our subscription filter.

Using Kibana we can see that our Lambda logs have indeed been imported to our ELK stack.

At this point you can choose to analyse, visualise and search your logs in any way you wish, all in real time. Clearly, the ELK stack is much more feature-rich than Cloudwatch alone, and analysing logs can lead to invaluable insights into your application’s performance and usage. However, the ELK stack does have its own competitors; you can read our comparison of ELK and Graylog here.

Debugging

If you can’t see your logs in Kibana, make sure you have created an index in Elasticsearch, and that you have followed the IAM role instructions detailed in Step 2. The most common issue faced when creating subscription filters is that they do not have the correct permissions to write to Elasticsearch.
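A further tip, based on the code the console generates: the streaming Lambda typically writes to daily indices prefixed with cwl- (for example, cwl-2020.07.13), so an index pattern such as cwl-* in Kibana should match the streamed documents. The prefix comes from the generated index.js, so check your own copy if documents appear to be missing.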

Conclusion

ELK can be a huge step up from using Cloudwatch for your application logs. It provides many more features and a richer user interface, allowing you to perform complex analysis and create slick visualisations on data that is streamed in real time. If you have logs coming from other services, such as Kubernetes, you may want to consider a different approach.

Callum writes about AWS and log analytics for Coralogix.