Imagine a day as an engineer where you can focus on the application and not the infrastructure. Sounds too good to be true? A lot of productive effort could be channeled in the right direction, the way it should have been all along. The good news is that this is already a reality thanks to Serverless Computing.
Serverless Computing is an execution model that lets engineers stay focused on application goals and outsource the burden of infrastructure to a cloud vendor (Amazon, Microsoft, Google). You write a server-side function that runs inside stateless containers hosted by the cloud vendor. Functions are triggered by a variety of events (e.g., HTTP requests, database changes) managed by the vendor. The vendor allocates the resources needed to execute the code, and you are charged only for the resources actually consumed while it runs. AWS Lambda, Microsoft Azure Functions, and Google Cloud Functions are the popular implementations of Serverless Computing.
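To make this concrete, here is a minimal sketch of such a function in Python, written against the AWS Lambda handler convention with an HTTP trigger routed through API Gateway; the greeting logic is a hypothetical stand-in for real business logic:

```python
# A minimal sketch of a serverless function (AWS Lambda handler convention).
# It runs in a vendor-managed, stateless container and is invoked per event,
# e.g. an HTTP request routed through API Gateway.
import json

def handler(event, context):
    # For an HTTP trigger, query parameters arrive inside the event payload.
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

You deploy just this handler; the vendor decides when and where a container runs it, and bills you only for the invocations.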
The first pay-as-you-execute code platform was Zimki, offered in 2006. It was not commercially successful and was shut down in 2007. PiCloud followed in 2010 with pay-as-you-go support for Python. Amazon launched AWS Lambda, an abstract serverless computing offering, in 2014. This was followed by Microsoft Azure Functions, Google Cloud Functions, and IBM Cloud Functions.
Let’s think about a traditional client-server model for an application like a ‘Movie ticketing’ system. Traditionally, the architecture will look something like the diagram below. Let’s say the server side implements the application logic and the client side is written in JavaScript:
Traditional Model
With this architecture, the client can be relatively unintelligent, and much of the logic (authentication, searching, booking) is implemented in the server.
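For a sense of what that central server looks like, here is a minimal sketch, assuming a Python/Flask monolith with hypothetical /search and /book routes and a placeholder authentication check:

```python
# A minimal sketch of the traditional central server: auth, search, and booking
# all live in one application process. Routes and data are hypothetical.
from flask import Flask, request, jsonify, abort

app = Flask(__name__)

MOVIES = ["Inception", "Interstellar", "Dunkirk"]  # stand-in for a real database

def authenticate(req):
    # Placeholder: in the monolith, session/token validation lives right here.
    return req.headers.get("Authorization") is not None

@app.route("/search")
def search():
    if not authenticate(request):
        abort(401)
    title = request.args.get("title", "").lower()
    return jsonify(results=[m for m in MOVIES if title in m.lower()])

@app.route("/book", methods=["POST"])
def book():
    if not authenticate(request):
        abort(401)
    booking = request.get_json()
    # Booking logic and persistence also live in this same server process.
    return jsonify(status="booked", seat=booking.get("seat"))
```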
In the traditional architecture, features, control, and security are managed by the central server application. In a Serverless Computing architecture, there is no central server. Instead, we have different components orchestrated by an API gateway.
Serverless Computing Model
This introduces a number of significant changes:

1. Authentication logic in the original application is replaced with a third-party authentication service (e.g., Auth0).
2. The central server is broken into separately deployed components (e.g., “Search” and “Booking” functions) behind the API gateway; a sketch of one such component follows below.
3. Splitting logical requirements into separately deployed components like this is a common approach in a Serverless architecture.
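As an illustration, here is a minimal sketch of what the standalone “Booking” component might look like as an AWS Lambda function behind API Gateway, persisting to a hypothetical DynamoDB table named "Bookings":

```python
# booking_function.py - a minimal sketch of the standalone "Booking" component.
# It is deployed on its own behind the API gateway and owns only booking logic;
# authentication is handled upstream and search lives in a separate function.
# The "Bookings" table name is a hypothetical placeholder.
import json
import uuid
import boto3

table = boto3.resource("dynamodb").Table("Bookings")

def handler(event, context):
    booking = json.loads(event.get("body") or "{}")
    item = {
        "bookingId": str(uuid.uuid4()),
        "movie": booking.get("movie"),
        "seat": booking.get("seat"),
    }
    table.put_item(Item=item)
    return {
        "statusCode": 201,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(item),
    }
```

The “Search” function would be a similar, separately deployed handler with its own data access, which is exactly the decomposition the diagram above shows.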
Faster Time to market
Execution (idea to delivery) is fast, supported by a rich vendor ecosystem. For example, you can collect, analyze, and get insights from streaming data using AWS Kinesis, or build conversational experiences using Azure Bot Services.
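To give a flavor of how little glue code this takes, here is a minimal sketch of a Lambda function consuming an AWS Kinesis stream; the records arrive base64-encoded in the event, the payload is assumed to be JSON, and the processing step is a hypothetical placeholder:

```python
# A minimal sketch of a Lambda function triggered by an AWS Kinesis stream.
import base64
import json

def handler(event, context):
    for record in event["Records"]:
        # Kinesis delivers each record's payload base64-encoded.
        payload = base64.b64decode(record["kinesis"]["data"])
        data = json.loads(payload)  # assumes JSON-encoded records
        # Hypothetical processing step: aggregate, filter, forward, etc.
        print(f"Received event: {data}")
    return {"processed": len(event["Records"])}
```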
Development cost
You focus on core application functionality and rely on external services for the rest, without having to develop them yourself. This significantly reduces development cost. For example, integrate with Auth0 for authentication instead of building it yourself.
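As a rough sketch of what that integration looks like on the function side, the snippet below validates an Auth0-issued JWT using the PyJWT library; the tenant domain and audience are hypothetical placeholders:

```python
# A minimal sketch: validating a JWT issued by an external auth service (e.g., Auth0)
# inside a function, instead of implementing authentication ourselves.
# Requires the PyJWT package with crypto support (pip install "pyjwt[crypto]").
import jwt
from jwt import PyJWKClient

JWKS_URL = "https://example-tenant.auth0.com/.well-known/jwks.json"  # hypothetical tenant
AUDIENCE = "https://api.example-movies.com"                          # hypothetical API identifier

def verify_token(token: str) -> dict:
    # Fetch the signing key matching the token's key id from the JWKS endpoint.
    jwks_client = PyJWKClient(JWKS_URL)
    signing_key = jwks_client.get_signing_key_from_jwt(token)
    # Verify signature, expiry, and audience; raises jwt.PyJWTError on failure.
    return jwt.decode(token, signing_key.key, algorithms=["RS256"], audience=AUDIENCE)
```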
Infrastructure cost
Serverless is about running code without managing your own server systems. You pay the vendor to manage the servers, the application framework, and the databases, and you pay only a small amount for the time your function actually runs.
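As a back-of-the-envelope illustration (the rates below are illustrative, roughly in line with published AWS Lambda pricing, and change over time), a small function invoked a million times a month comes out to well under a dollar:

```python
# Back-of-the-envelope Lambda cost estimate. The rates are illustrative only;
# check the vendor's current pricing page before relying on them.
PRICE_PER_REQUEST = 0.20 / 1_000_000   # ~$0.20 per million requests
PRICE_PER_GB_SECOND = 0.0000166667     # ~$0.0000166667 per GB-second

invocations = 1_000_000                # requests per month
memory_gb = 0.128                      # 128 MB function
avg_duration_s = 0.2                   # 200 ms average run time

compute_cost = invocations * memory_gb * avg_duration_s * PRICE_PER_GB_SECOND
request_cost = invocations * PRICE_PER_REQUEST
print(f"Estimated monthly cost: ${compute_cost + request_cost:.2f}")  # roughly $0.63
```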
Simplicity
Serverless architecture doesn’t require a specific framework, so there is no new framework to learn. You can implement functions in Java, JavaScript, Python, Go, etc.
Auto Scaling
Scaling is automatic and managed by the provider. You no longer have to think about handling concurrent requests, performance degradation under load, etc.
Services Sprawl
There are many more moving pieces in the Serverless model than in the traditional application model. It’s easy to create new services and quickly end up with a sprawl of them. You need a good monitoring strategy to keep the situation under control, and monitoring is tricky because of the ephemeral nature of the functions; the offerings provided by the vendors are basic.
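One common piece of such a strategy is emitting custom metrics from each function so behavior can be tracked centrally. Here is a minimal sketch using boto3 and CloudWatch; the namespace and metric name are hypothetical:

```python
# A minimal sketch: publishing a custom metric from inside a function so that
# short-lived invocations still leave a trail in a central monitoring system.
import time
import boto3

cloudwatch = boto3.client("cloudwatch")

def handler(event, context):
    start = time.time()
    # ... actual function work goes here ...
    elapsed_ms = (time.time() - start) * 1000
    cloudwatch.put_metric_data(
        Namespace="MovieTicketing",          # hypothetical namespace
        MetricData=[{
            "MetricName": "BookingLatency",  # hypothetical metric
            "Value": elapsed_ms,
            "Unit": "Milliseconds",
        }],
    )
    return {"statusCode": 200, "body": "ok"}
```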
Security
Embracing Serverless architecture exposes you to a large number of services, each with its own security implementation. This increases the attack surface and the likelihood of a breach.
Vendor lock-in
Serverless features you’re using from one vendor will be implemented differently by another. You’ll need to update your tooling (and likely parts of your code) if you want to switch vendors.
Serverless Computing offers significant benefits, including reduced time to market and lower engineering and infrastructure costs. However, it brings challenges of service sprawl, monitoring, and security. Evaluate the trade-offs before you take the plunge. Serverless Computing is in its adolescent years, and it will be fascinating to watch it grow from a teenager into a young adult.