With the rising hype around all things Serverless, I keep getting asked one simple question:
But, how can I guarantee my serverless application is secure?
My response is always the same. How can you guarantee anything? I’m just as scared about my EC2 instance getting breached as I am about my Lambdas.
Most server vulnerabilities are due to programmer error. That one line of code that does a tiny bit more than it should. That one app secret you misplaced. Those files you forgot to encrypt.
There are numerous things we, the developers, can do to write better software.
That said, the distributed nature of Serverless Architectures gives a malicious attacker more room to maneuver. The greatest asset of serverless is also its most dangerous foe; it gives attackers significantly more points of entry.
This had me genuinely worried, so I started digging for answers. That’s when I came across Puresec’s study, The Top Ten Most Critical Security Risks in Serverless Architectures.
I read it without lifting my eyes from my screen. So many things became crystal clear.
Sadly, there's still a common bad practice among developers: we focus on security only once the software we're building is already up and running.
“I’m just going to deploy this app and hope I don’t get hacked…”
— The average developer
In a nutshell, the “getting around to security…eventually” mentality is what kills us. The top serverless vulnerabilities are remarkably similar to the top vulnerabilities, period.
Read on for the takeaways from the Puresec security study, and for measures you can take right now to strengthen the security of your application.
Note: I strongly suggest you read the whole Puresec study. It's freaking awesome. If you want a quick recap of the risks, take a look at the TL;DR below, or just jump to the section you're interested in.
This is probably obvious to most of you — several steps in improving security lie in the quality of our application structure as a whole. The way we architect our software, and our level of attention to detail, will ultimately lead to a robust and secure software product.
That’s enough of my yapping. Let’s get started with the risks!
It’s common sense to always validate input. So why am I even talking about this? Because we often forget about the edge cases.
Do you regularly make sure the input is of the data type you are expecting? I tend to forget from time to time, so I imagine most people do. How about the several different event types that can trigger a serverless function? The events don’t necessarily need to be HTTP requests. You may only have checked to make sure the input from an HTTP event is validated. Don’t forget to check for the case when the event is not what you expect it to be.
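To make this concrete, here's a minimal sketch of the kind of checks I'm talking about, written as a Node.js handler. The payload shape (an email and an age) is made up purely for illustration:

```js
// Hypothetical Node.js handler: validate the event before touching any business logic.
module.exports.createUser = async (event) => {
  // API Gateway wraps the payload in a JSON string; other triggers (S3, SNS, ...) won't.
  if (!event || typeof event.body !== 'string') {
    return { statusCode: 400, body: JSON.stringify({ message: 'Invalid request' }) };
  }

  let payload;
  try {
    payload = JSON.parse(event.body);
  } catch (err) {
    return { statusCode: 400, body: JSON.stringify({ message: 'Invalid request' }) };
  }

  // Check the data types you actually expect, not just that the fields exist.
  if (typeof payload.email !== 'string' || typeof payload.age !== 'number') {
    return { statusCode: 400, body: JSON.stringify({ message: 'Invalid request' }) };
  }

  // Safe to continue with the validated payload here.
  return { statusCode: 200, body: JSON.stringify({ message: 'OK' }) };
};
```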
And please, use a firewall. It’s easy to set up and makes a huge difference.
Moving on from the actual events, try to use predefined logic for database interaction. This reduces the risk of injection attacks, especially if you also make sure to run all the code with the minimum OS privileges required to get the job done.
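As a rough sketch of what predefined database logic can look like with the DynamoDB DocumentClient, something along these lines; the table name and key are assumptions made for the example:

```js
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

// The query itself is predefined; user input is only ever passed as a parameter value,
// never concatenated into an expression, which keeps injection attempts inert.
const getUserById = (id) =>
  docClient
    .get({
      TableName: process.env.USERS_TABLE, // hypothetical table name from an env var
      Key: { id },
    })
    .promise();

module.exports = { getUserById };
```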
Use built-in solutions for authenticating users and authorizing their access to resources. This is pretty straightforward with authorizers or AWS Cognito. Using them is a no-brainer.
Related: Here's a post on setting up robust IAM permissions; check it out.
You can rest assured using stateless authentication with Auth0 or JWT is perfectly fine. The real issue is not the actual authentication method, but instead insecure deployment settings containing components with public read access. We’ll talk more about this in the next section.
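If you want to see roughly what that looks like in practice, here's a hedged sketch of a custom authorizer that verifies a JWT and hands API Gateway an allow/deny policy. The secret's environment variable name is just a placeholder:

```js
const jwt = require('jsonwebtoken');

// Hypothetical custom authorizer: verify the incoming JWT and return an IAM policy
// that allows or denies the request to the method being invoked.
module.exports.auth = async (event) => {
  try {
    const token = (event.authorizationToken || '').replace('Bearer ', '');
    const claims = jwt.verify(token, process.env.JWT_SECRET); // placeholder secret

    return {
      principalId: claims.sub,
      policyDocument: {
        Version: '2012-10-17',
        Statement: [
          { Action: 'execute-api:Invoke', Effect: 'Allow', Resource: event.methodArn },
        ],
      },
    };
  } catch (err) {
    // Invalid or expired token: API Gateway turns this error into a 401.
    throw new Error('Unauthorized');
  }
};
```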
Some of you may not like stateless authentication, and that's okay. You can use sessions just fine, only not on the Lambdas themselves. Every serverless function is stateless by nature, so we can't store persistent data on it. Hence, for sessions, we can use a dedicated Redis server.
AWS has Elasticache, which is a great Redis and Memcached service. You only have to make sure to have the Lambda and Elasticache running in the same VPC. (Here's a quick tutorial for getting that set up.) Once you've done that, you can add the AWSLambdaVPCAccessExecutionRole to your Lambda's IAM statements and be good to go.
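Here's a rough sketch of what the session layer could look like from inside a Lambda, assuming an ioredis client and the Elasticache endpoint exposed through an environment variable:

```js
const Redis = require('ioredis');

// The Elasticache endpoint comes from an environment variable; the Lambda has to
// run inside the same VPC as the cluster for this connection to work.
const redis = new Redis({ host: process.env.REDIS_HOST, port: 6379 });

// Store a session with a one-hour expiry and read it back later.
const saveSession = (sessionId, data) =>
  redis.set(`session:${sessionId}`, JSON.stringify(data), 'EX', 3600);

const loadSession = async (sessionId) => {
  const raw = await redis.get(`session:${sessionId}`);
  return raw ? JSON.parse(raw) : null;
};

module.exports = { saveSession, loadSession };
```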
For all the talk about authentication principles, they're not the real issue here. Your application layer authentication may work flawlessly, but that doesn't stop a malicious attacker from accessing S3 buckets with public read access. Please, never enable public read access unless you're using a bucket to store images or host a static website. In that case, keep only those files in the bucket, nothing else!
If you’re even the slightest bit worried about the privacy of your files, enable all the encryption methods you possibly can.
Luckily, AWS has both client-side encryption, for encrypting a file before it's sent over the wire, and server-side encryption, for protecting it once it lands in an S3 bucket. But none of this makes any sense if your buckets have public read access enabled.
Keep track of your S3 ACLs and make sure the access levels are not littered with unnecessary permissions. You can enable server-side encryption (SSE) to protect the data in your buckets with either SSE-S3 or SSE-KMS. Pick whichever works best for your use case.
I’d also encourage you to use client-side encryption with the AWS SDK. Here’s a nice explanation for you to check out.
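For instance, enabling SSE when writing an object from a Lambda is a one-line addition in the Node SDK. A minimal sketch, where the bucket name is a placeholder; swapping 'AES256' for 'aws:kms' (plus a key id) would switch it to SSE-KMS:

```js
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

// Upload an object with server-side encryption. 'AES256' requests SSE-S3;
// use ServerSideEncryption: 'aws:kms' and SSEKMSKeyId for SSE-KMS instead.
const uploadReport = (body) =>
  s3
    .putObject({
      Bucket: process.env.REPORTS_BUCKET, // hypothetical, private bucket
      Key: `reports/${Date.now()}.json`,
      Body: JSON.stringify(body),
      ServerSideEncryption: 'AES256',
    })
    .promise();

module.exports = { uploadReport };
```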
Developers are lazy most of the time (and I’m no exception) — we set a single permission level for a whole service containing tons of functions.
Even though this may make sense at first, it can be very dangerous. Functions should only have the permissions they need to fulfill their purpose.
A function that fetches some data from DynamoDB should not have permission to add data to DynamoDB, for instance. The same logic applies to adding images and retrieving them from S3. It doesn't make sense for a function doing the GetObject operation to have permission to do a PutObject operation, now does it?
If you’re used to working with the Serverless Framework, you can easily configure the IAM Role Statements on a per-function basis (or just use this plugin).
Make sure to always follow the least privilege principle. One function, one use case, one permission model.
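As a rough illustration, a least-privilege statement for a read-only function could look something like this; the table ARN is a placeholder, and with the plugin mentioned above you can attach it to just that one function:

```js
// A least-privilege IAM statement for a read-only function: it may read from one
// specific DynamoDB table and nothing else. The ARN is a made-up placeholder.
const readOnlyStatement = {
  Effect: 'Allow',
  Action: ['dynamodb:GetItem', 'dynamodb:Query'],
  Resource: 'arn:aws:dynamodb:us-east-1:123456789012:table/users',
};

module.exports = { readOnlyStatement };
```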
Here comes the difficult bit! Crappy logs equal missed error reports. If you miss critical errors, your users will suffer through more downtime simply because you weren't notified in time to fix them.
But it's just as dangerous the other way around. Make sure never to log information containing sensitive data. To get a grip on this, you either need to become a master at parsing CloudWatch logs, or use a 3rd party tool such as Dashbird.
Dashbird - Free Serverless Visibility & Debugging Tool (dashbird.io)
From my experience with Dashbird, I’ve enjoyed that they have live monitoring and error reporting, timeout monitoring, live tailing, price calculations and many other features I’ve still not had the need to use. It gives you a bird’s eye perspective on your serverless app, pretty much simulating what a regular old-school server application would look like. It can also send error reports to a Slack channel. (We all know how much developers love Slack.)
Dashbird has a free tier, so you can go ahead and try it out if you think it’ll be useful for you.
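Coming back to keeping sensitive data out of the logs in the first place, here's a minimal sketch of scrubbing fields before they ever hit CloudWatch; which fields count as sensitive depends on your app, so the list here is just an example:

```js
// Strip sensitive fields before anything gets written to CloudWatch.
const SENSITIVE_FIELDS = ['password', 'token', 'creditCard'];

const sanitize = (data) => {
  const copy = Object.assign({}, data);
  SENSITIVE_FIELDS.forEach((field) => {
    if (field in copy) copy[field] = '[REDACTED]';
  });
  return copy;
};

// Usage inside a handler:
const payload = { email: 'jane@example.com', password: 'hunter2' };
console.log('Incoming payload:', JSON.stringify(sanitize(payload)));
```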
Even though you don’t push environment variables to GitHub, malicious attackers can still access the values if they gain access to the system where your code is running.
Hence, the need to use KMS to encrypt environment variables. There’s a plugin for the Serverless Framework that makes it easy.
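At runtime, reading such a variable means decrypting it with KMS before use. A minimal sketch, assuming the ciphertext is stored base64-encoded in a hypothetical DB_PASSWORD variable:

```js
const AWS = require('aws-sdk');
const kms = new AWS.KMS();

// Cache the decrypted value in module scope so warm invocations don't call KMS again.
let dbPassword;

const getDbPassword = async () => {
  if (dbPassword) return dbPassword;
  const { Plaintext } = await kms
    .decrypt({ CiphertextBlob: Buffer.from(process.env.DB_PASSWORD, 'base64') })
    .promise();
  dbPassword = Plaintext.toString('utf-8');
  return dbPassword;
};

module.exports = { getDbPassword };
```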
I love the principle of Lambda, where you pay for the amount of time your code is running. This pushes the developer to write efficient code. Efficient code is less error prone and you can anticipate it will not run for ridiculously long periods.
This makes it much easier to add timeouts.
The default timeout for a function when using the Serverless Framework is 6 seconds, which is more than enough for pretty much any production-level HTTP request. The default memory allocation is 1024 MB, which is also usually plenty.
If you ever worry about DoS attacks or some hacker invoking your Lambdas a ridiculous number of times, you can always throttle incoming API calls. This goes through API Gateway, and it's as simple as setting a few fields to limit the number of requests per second.
Using the Serverless Framework is also helpful here, because it enables setting a monthly cap on the number of invocations a particular API can have. Incredibly convenient.
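If you're curious what those limits boil down to under the hood, here's a rough sketch of creating a usage plan by hand with the Node SDK; the API id and stage name are placeholders:

```js
const AWS = require('aws-sdk');
const apigateway = new AWS.APIGateway();

// Cap the API at 100 requests per second with bursts of 200, and a monthly quota.
const createThrottledPlan = () =>
  apigateway
    .createUsagePlan({
      name: 'basic-plan',
      apiStages: [{ apiId: 'a1b2c3d4e5', stage: 'dev' }], // placeholders
      throttle: { rateLimit: 100, burstLimit: 200 },
      quota: { limit: 1000000, period: 'MONTH' },
    })
    .promise();

module.exports = { createThrottledPlan };
```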
Debugging serverless architectures is still an issue.
Handling this is best done with sound programming practices. Write unit tests. Write readable code. Emulate the AWS environment locally so you can exercise the code before deploying it to the cloud.
Stack traces should only ever be logged to the console or log files; never send stack traces back to the client. Make sure to only send vague messages in error responses.
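In practice, that boils down to a try/catch in the handler, along these lines (the business logic here is a made-up stub):

```js
// Hypothetical business logic that might throw.
const processOrder = async (event) => {
  // ...
  return { ok: true };
};

module.exports.handler = async (event) => {
  try {
    const result = await processOrder(event);
    return { statusCode: 200, body: JSON.stringify(result) };
  } catch (err) {
    // The full stack trace stays in the CloudWatch logs for us to debug.
    console.error(err.stack);
    // The client only ever gets a vague message back.
    return { statusCode: 500, body: JSON.stringify({ message: 'Something went wrong' }) };
  }
};
```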
Many of the issues I mentioned here apply to general coding practices, regardless of whether you're using traditional servers or Serverless Architectures. Writing clean code, keeping secrets safe, and doing input validation and error handling are universal concepts we as developers swear an oath to uphold.
The real issues come with deployment settings, per-function permissions, bad logging, insufficient error reporting, and financial exhaustion. These issues are still manageable; you're just not used to solving them yet. Serverless is still a young paradigm that we need time to get used to.
The decentralized nature of serverless pushes us to look for ways of grouping resources into logical units. Its biggest advantage is also its biggest drawback.
This article has shown you the basics of Serverless security, what to watch out for, and how to patch as many vulnerabilities as possible. Hopefully, it has helped you gain more insight into the inner workings of Serverless Architectures.
If you want to take a look at Puresec’s guide, check it out here. Or if you want to read my latest articles, head over here.
Latest stories written by Adnan Rahić on Medium (medium.com)
If I’ve intrigued you to learn more about serverless, feel free to take a peek at a course I authored on the subject.
Serverless JavaScript by Example [Video] | PACKT Books (www.packtpub.com)
Or, maybe you just want to be notified about other cool things related to serverless. Then just subscribe below.
Hope you guys and girls enjoyed reading this as much as I enjoyed writing it. Do you think this tutorial will be of help to someone? Do not hesitate to share. If you liked it, smash the clap below so other people will see this here on Medium.
Originally published at serverless.com.