Severe Truth About Serverless Security and Ways to Mitigate Major Risks

by Roman Sachenko, August 12th, 2019

Going serverless is like outsourcing routine tasks to remote pro teams. You get a shorter time to market, lower cost, and better scalability. You focus on features that make your users happy instead of managing a complex infrastructure. You can be more creative and innovative, as a third-party vendor will save you from all the boring tasks.

Sounds cool, and not just to me. Serverless architecture, also known as FaaS (Function-as-a-Service), shows an annual growth rate that exceeds 700 percent, which is clear evidence that companies highly value its advantages. OK, but what about the disadvantages? Are there any pitfalls that people tend to overlook? Yes, there are, and they mostly relate to security.

Who is responsible for what in the serverless world?

When you build your application with AWS Lambda, Azure Functions, Google Cloud Functions, or IBM Cloud Functions, the cloud provider takes responsibility for securing your project — but only partially. Vendors protect databases, operating systems, virtual machines, the network, and other cloud components. But they are not in charge of the application layer, which includes the code, business logic, data, and cloud service configurations. It's up to the app's owner to defend these parts against possible cyber attacks.

Unfortunately, many businesses don't realize these risks and, as a result, have no strategy for avoiding them. As PureSec, a serverless security platform, put it in its 2018 survey, 'there's a huge gap in security knowledge around serverless when compared to traditional applications'. To close, or at least narrow, this gap, let's get down to the major challenges that come alongside the benefits of FaaS.

#1 More permissions to manage 

In the serverless ecosystem, we have lots of independent functions, each with its own set of services and responsibilities and its own storage and state management. This creates hundreds of interactions and may lead to situations where certain functions get more permissions than they are supposed to have. For example, a function that was developed only to run calculations or send emails can end up with access to database resources.

Precautions:

  • review each function and determine what it really needs to do.
  • follow the rule of least privilege: minimize roles and permissions for each function so that it can do no more than perform its particular task. Every extra permission only widens the potential for attacks (see the sketch after this list).
  • scan functions continuously for suspicious activity as they are deployed.
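
To make the least-privilege rule concrete, here is a minimal TypeScript sketch using the AWS CDK (one possible tool; the article itself doesn't prescribe it). The table and function names are hypothetical. The point is that a function built only to email reports is granted read-only access to a single table and nothing else.

```typescript
// Least privilege with the AWS CDK: grant each function only the access it needs.
// Resource names ('OrdersTable', 'ReportMailer') are illustrative assumptions.
import { Stack, StackProps } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as lambda from 'aws-cdk-lib/aws-lambda';
import * as dynamodb from 'aws-cdk-lib/aws-dynamodb';

export class LeastPrivilegeStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    const orders = new dynamodb.Table(this, 'OrdersTable', {
      partitionKey: { name: 'id', type: dynamodb.AttributeType.STRING },
    });

    // This function only reads orders to build and send a report...
    const reportMailer = new lambda.Function(this, 'ReportMailer', {
      runtime: lambda.Runtime.NODEJS_18_X,
      handler: 'index.handler',
      code: lambda.Code.fromAsset('functions/report-mailer'),
    });

    // ...so it receives read-only access to this one table and nothing more:
    // no write permissions, no access to other tables or buckets.
    orders.grantReadData(reportMailer);
  }
}
```

The same idea applies whatever deployment tool you use: define a dedicated, narrow role per function instead of sharing one broad role across the whole application.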

#2 More points of vulnerability

Your serverless application is by nature more porous than a monolith. It is split into small parts, and each of them can be triggered in different ways — not only by API gateway commands, but also by cloud storage events, database changes, data streams, IoT telemetry signals, and emails, to name a few. This vast pool of event sources expands the attack surface and makes it more difficult to eliminate malicious event-data injections.

Keep in mind that traditional web application firewalls (WAFs) protect only those functions that are called through an API gateway, leaving all other parts open to attack. For example, serverless apps can be plagued by SQL (Structured Query Language) injections, which embed malicious code into a request.

How would a successful injection attack impact your system? It depends entirely on the role of the function infected with malicious code. If that function has access to cloud storage, the payload could destroy data or upload corrupted files. If it reaches a database table, it is capable of deleting records. The rule of least privilege mentioned in the previous section reduces the potential points of attack and prevents malware from spreading across the entire app. But of course, this precaution alone is not sufficient to secure your app.
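
The first line of defense against SQL injection sits inside the function code itself. Below is a hedged sketch of a handler that queries a relational database through bound parameters rather than string concatenation; it assumes a PostgreSQL database accessed with the 'pg' client (the article doesn't name a specific database), and the table and column names are hypothetical.

```typescript
// Parameterized queries keep user input as data, never as SQL structure.
import { Client } from 'pg';

export async function getUserOrders(userId: string) {
  // Connection settings (host, user, password) are read from environment variables.
  const client = new Client();
  await client.connect();
  try {
    // The $1 placeholder binds userId as a value, so a payload such as
    // "1; DROP TABLE orders" cannot alter the shape of the query.
    const result = await client.query(
      'SELECT id, total FROM orders WHERE user_id = $1',
      [userId],
    );
    return result.rows;
  } finally {
    await client.end();
  }
}
```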

Precautions:

  • alongside WAFs, apply perimeter security to each function to protect it against data breaches.
  • identify trusted sources and add them to a whitelist. Use whitelist validation when possible (see the sketch after this list).
  • remember Doctor House's credo, 'Everybody lies', and continuously monitor updates to your functions.
  • apply runtime defense solutions to protect your functions during execution. You may consider a serverless security library like FunctionShield, or Twistlock, a cloud security platform which, among other things, provides threat-based protection for running containers.
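
Here is a minimal sketch of what whitelist (allowlist) validation can look like at a function's boundary. The event shape, field names, and accepted actions are all hypothetical; the point is that only explicitly permitted values pass through to business logic.

```typescript
// Allowlist validation at the function boundary: reject anything that is not
// explicitly expected before it reaches downstream services.
const ALLOWED_ACTIONS = new Set(['create', 'cancel', 'refund']);

interface OrderEvent {
  action: string;
  orderId: string;
}

export async function handler(event: OrderEvent) {
  // Validate format against an explicit pattern, not against a list of "bad" inputs.
  if (typeof event.orderId !== 'string' || !/^[A-Za-z0-9-]{1,36}$/.test(event.orderId)) {
    throw new Error('Rejected: orderId has an unexpected format');
  }
  if (!ALLOWED_ACTIONS.has(event.action)) {
    throw new Error(`Rejected: action "${event.action}" is not allowed`);
  }

  // Only validated data gets past this point.
  return { status: 'accepted', action: event.action, orderId: event.orderId };
}
```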

#3 More third-party dependencies

The dependencies of functions that rely on third-party software (open-source libraries, packages, etc.) are hard to monitor regardless of the app architecture. With serverless, however, controlling them manually becomes extremely challenging.

Precautions:

  • avoid third-party packages with lots of dependencies.
  • obtain components from reliable official sources via secure links.
  • if you run a Node.js application, use package locks or npm shrinkwrap to ensure that no updates penetrate your code until you review them.
  • continuously run automated dependency scanners such as snyk.io or OWASP Dependency-Check to identify and fix vulnerabilities in third-party components.

#4 More data in storage and transit

Regardless of the architecture, sensitive data exposure is considered one of the major security problems, so most of the practices we use to secure traditional apps also work well for serverless. What we must keep in mind is that attackers can target other data sources, extracting sensitive information from cloud storage and database tables instead of servers.

There's one more issue. In the serverless world, functions interact with each other as well as with third-party services, and they exchange data more actively than under the classic approach. The more information is shared, the greater the risk of data leakage or destruction.

Precautions:

  • identify at-risk data and reduce its storage to the necessary minimum.
  • make sure all credentials within your functions that invoke third-party services or cross-account integrations are temporary or encrypted.
  • encrypt sensitive data in transit automatically.
  • use the key management solutions offered by the cloud infrastructure to control cryptographic keys, e.g. AWS Key Management Service or Azure Key Vault (see the sketch after this list).
  • set stricter constraints on the input and output messages allowed through an API gateway.
  • for additional security, send information over HTTPS (HyperText Transfer Protocol Secure) endpoints only.
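
As an illustration of the key-management point, here is a hedged sketch of encrypting and decrypting a secret with AWS KMS through the AWS SDK for JavaScript v3. The key alias is a hypothetical placeholder; in practice it would reference a key you manage in KMS (or the equivalent in Key Vault on Azure).

```typescript
// Encrypt sensitive values with a managed key before storing or forwarding them.
import { KMSClient, EncryptCommand, DecryptCommand } from '@aws-sdk/client-kms';

const kms = new KMSClient({});

export async function encryptSecret(plaintext: string): Promise<Uint8Array> {
  const { CiphertextBlob } = await kms.send(
    new EncryptCommand({
      KeyId: 'alias/app-data-key', // hypothetical key alias
      Plaintext: Buffer.from(plaintext, 'utf8'),
    }),
  );
  return CiphertextBlob!;
}

export async function decryptSecret(ciphertext: Uint8Array): Promise<string> {
  const { Plaintext } = await kms.send(
    new DecryptCommand({ CiphertextBlob: ciphertext }),
  );
  return Buffer.from(Plaintext!).toString('utf8');
}
```

Keeping the cryptographic keys in the provider's key management service means the function itself never holds long-lived secrets in plain text.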

#5 More hustle with authentication

After your app is deployed to the cloud, anyone connected to the Internet can try to enter your system. You have to cut off people with ill intentions while providing a smooth experience to your actual users. Unlike a traditional server-based application, serverless software is decentralized and has multiple access points, including web browsers, mobile apps, and more. To ensure security, you need to authenticate all end users and control which resources they can access.

Precautions:

  • rather than building a complex authentication system from scratch, use one of the available access management services, such as Microsoft's Azure AD or Auth0 (see the sketch after this list).
  • keep access privileges within the serverless infrastructure to a minimum by default and increase them manually when needed.
  • if you allow users to edit data, perform additional validation for actions that can destroy or modify data.
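
To show what delegating authentication looks like in practice, here is a minimal sketch of a function verifying a token issued by an external identity provider (Auth0, Azure AD, and similar services publish their signing keys this way). It uses the 'jose' package, which the article itself doesn't mention, and the issuer and audience values are hypothetical placeholders.

```typescript
// Verify a provider-issued JWT before any business logic runs.
import { createRemoteJWKSet, jwtVerify } from 'jose';

// Hypothetical tenant and API identifiers, taken from your identity provider's settings.
const ISSUER = 'https://example-tenant.auth0.com/';
const JWKS = createRemoteJWKSet(new URL(`${ISSUER}.well-known/jwks.json`));

export async function authenticate(authorizationHeader?: string) {
  const token = authorizationHeader?.replace(/^Bearer\s+/i, '');
  if (!token) {
    throw new Error('Unauthorized: missing token');
  }

  // Signature, issuer, audience, and expiry are all checked here.
  const { payload } = await jwtVerify(token, JWKS, {
    issuer: ISSUER,
    audience: 'https://api.example.com',
  });
  return payload; // verified identity and claims for downstream authorization checks
}
```

On AWS, the same check is often attached as an API Gateway authorizer so that unauthenticated calls never reach your functions at all.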


#6 More wallet-busting attacks 

Autoscaling is one of the killer features of serverless. It enables app owners to pay only for what they use and saves them effort, leaving scaling to the cloud provider. Alas, this technology has not only created new opportunities for businesses but also spawned a new generation of hacker attacks called Denial of Wallet (DoW).

You can consider DoW an example of cyber attacks adapting to new conditions. When a traditional app falls victim to the long-familiar denial-of-service (DoS) attack, a flood of fake requests creates a kind of traffic jam and makes your services unavailable to regular customers. Serverless architecture dictates a different scenario, though. While under attack, the app doesn't get blocked. Instead, it responds by scaling up in an attempt to deal with the avalanche of calls.

What happens next? The cost of the serverless infrastructure grows dramatically until your budget is exhausted. And it is the enterprise, not the cloud service vendor, that pays the bill for the overrun.

Precautions:

  • set budget limits and alarms based on your current spending (though this kind of protection can itself turn into a denial of service once the attacker pushes you to the predefined limits).
  • put limits on the number of API requests in a given time window: for example, allow a client to make one call per second and block additional calls (see the sketch after this list).
  • use DDoS protection tools (a good example is Cloudflare, which offers a suite of security features including a WAF and rate limiting).
  • if API gateways are internal and used only by other components, make them private and thus unreachable by attackers.
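
As one way to enforce such limits, here is a hedged AWS CDK sketch in TypeScript that throttles an API Gateway stage and caps how far the backing function can scale. The numbers and resource names are illustrative assumptions, not recommendations.

```typescript
// Cap request rates and concurrency so a flood of calls gets throttled
// instead of scaling (and billing) without bound.
import { Stack, StackProps } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as apigateway from 'aws-cdk-lib/aws-apigateway';
import * as lambda from 'aws-cdk-lib/aws-lambda';

export class RateLimitedApiStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    const handler = new lambda.Function(this, 'OrdersHandler', {
      runtime: lambda.Runtime.NODEJS_18_X,
      handler: 'index.handler',
      code: lambda.Code.fromAsset('functions/orders'),
      reservedConcurrentExecutions: 20, // hard ceiling on how far this function can scale
    });

    new apigateway.LambdaRestApi(this, 'OrdersApi', {
      handler,
      deployOptions: {
        throttlingRateLimit: 10,  // steady-state requests per second
        throttlingBurstLimit: 20, // short burst allowance
      },
    });
  }
}
```

Combined with billing alarms, limits like these turn a potential Denial of Wallet into ordinary throttled traffic that you notice before the invoice does.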


#7 More APIs in the shadow

Gartner warns that by 2022, APIs will become the major source of data breaches. The problem is particularly acute for serverless applications built on top of microservices, where independent pieces of software interact through numerous APIs. Published to the public cloud, these APIs become potentially available to hackers or malicious code. Worse yet, some of them stay invisible to traditional security tools. These hidden entry points, called shadow APIs, pose a big challenge for keeping your serverless app secure, especially when it comes to complex enterprise software.

Precautions:

  • integrate automated monitoring tools to discover APIs and, overall, bring more visibility to your serverless tech stack (for instance, Epsagon provides autodiscovery of the APIs and cloud resources that exist in the enterprise environment).

The reward is worth the risk

New opportunities often create new challenges, and serverless is no exception. Cloud computing services offer your business incredible advantages in terms of cost-efficiency, scalability, and zero administration, yet to reap the financial benefits you need to invest in a security policy.


Considering the increased attack surface and its complexity, it makes sense to use automated security tools for continuous monitoring, access management, and discovery of vulnerabilities. At the same time, you can't do without human brains either.

Serverless shifts more responsibility onto the developers, QA engineers, and DevOps teams, who are expected to follow security best practices. By combining a variety of precautions and tactics throughout the life of the application, from idea through deployment to maintenance, you'll win this game and hit the jackpot.