
Managing Your Lambda Empire with Serverless

by Elliot Forbes, February 11th, 2018

If you’ve been following me for a while, you will know that I am a huge fan of the concept of the new Serverless paradigm. However, when it comes to managing a large number of functions all behind an API gateway, it starts to become a bit of a burden.

Thankfully, new tools such as Serverless have proved to be incredibly useful when it comes to managing a large number of functions together.

In this article we are going to have a look at some of the amazing advantages that Serverless provides when dealing with a vast empire of different lambda functions across multiple environments!

Key Advantages

Serverless allows you to develop and deploy changes to multiple AWS Lambda functions with ease. This is a huge win. Because everything about our Lambda functions is defined within a single yml file, deploying changes takes less than a minute.

When it comes to developing anything, shortening the feedback loop is vital. You want to minimize the number of times a simple change takes 5–10 minutes to deploy and check.

The quick feedback loop we get with a tool like Serverless makes this architecture style a very viable option for building highly scalable production applications with minimal fuss.

Cross Platform

One of the key concerns of many Product Owners is becoming too reliant on the services of a single cloud service provider. This is otherwise known as “vendor lock-in” and is a very real issue.

You don’t want to develop a multi-million dollar solution that relies purely on a single cloud provider, only to find out that, in 6 months’ time, the particular service you are relying upon will be deprecated.

Serverless addresses this concern by remaining agnostic about which underlying cloud service provider you use. You aren’t locked into AWS, and you can design your serverless applications to be deployable across multiple cloud service providers.


Installation

Installing the Serverless tool requires npm. It can be installed by calling:

npm install -g serverless

If you’ve already set up your AWS credentials file on your machine and have a user with the correct privileges, you should be good to go!
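If you haven’t set that file up yet, it lives at ~/.aws/credentials and follows the standard AWS format (the key values below are placeholders, not real credentials):

```ini
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
```

Serverless will pick up the default profile automatically when deploying to AWS.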

Structuring Your Functions

As I’m currently writing an imgur-like website built purely on AWS, I’ll be basing this article on what I’ve done so far for that particular project!

Say I wanted to develop a series of different endpoints for my new project. I could define each of these individual functions within a distinct Python file inside an imgur/ directory.

.serverless/
imgur/
    ...
serverless.yml
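As a rough sketch (the function name and response shape here are my own illustration, not taken from the original project), one of the files in imgur/ might contain a Lambda handler like this:

```python
import json


def get_images(event, context):
    """Hypothetical handler for a GET endpoint behind API Gateway.

    With the Lambda proxy integration, API Gateway expects the handler
    to return a dict containing statusCode, headers, and a JSON-encoded
    body string.
    """
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"images": []}),
    }
```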

Within my serverless.yml file within the root directory I could then do something like so:
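A minimal sketch of what that config might look like (the service name, function names, handler paths, and runtime version are illustrative assumptions, not the originals):

```yaml
# Illustrative serverless.yml — names and paths are assumptions
service: imgur-clone

provider:
  name: aws
  runtime: python3.9
  region: us-east-1

functions:
  getImages:
    handler: imgur/get_images.get_images
    events:
      - http:
          path: images
          method: get
  uploadImage:
    handler: imgur/upload_image.upload_image
    events:
      - http:
          path: images
          method: post
```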

So within the above config, I’ve essentially defined 2 distinct API endpoints that are powered using functions that I’ve defined within my imgur/ directory.

Once I’ve made my changes to my function, in order to deploy these changes, I simply have to call:

serverless deploy

And the rest is handled for me.

This. is. amazing!

An example of the serverless deploy command

As you can see from the above screenshot, the CLI handles creating a CloudFormation template, packaging the artifacts, uploading them to S3, and deploying the stack.

Finally, once everything is done, it prints out the “Service Information”. Within this information, you should see a list of all the endpoints deployed and what HTTP verbs are needed to hit them. This entire process took about 20 seconds for the 2 endpoints I have defined.

Managing Resource Access

More often than not, you’ll want your Lambda functions to be able to talk to other AWS resources, and Serverless handles this exceptionally well. Within the confines of your serverless.yml file, you can specify the access rights of your functions at a very granular level.
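For instance, granting all of a service’s functions access to an S3 bucket is done through iamRoleStatements under the provider block (the bucket name below is a placeholder of my own):

```yaml
# Illustrative snippet — the bucket name is a placeholder
provider:
  name: aws
  runtime: python3.9
  iamRoleStatements:
    - Effect: Allow
      Action:
        - s3:GetObject
        - s3:PutObject
      Resource: arn:aws:s3:::my-imgur-bucket/*
```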

Dev, Test & Production

When it comes to segregating your development environment from your test and production environments, Serverless lets you specify which stage you want to deploy to, either through the yml file or through command-line arguments.
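For example, a default stage can be set in the config (the stage names here are just the conventional ones):

```yaml
provider:
  name: aws
  stage: dev   # default stage used when none is passed on the command line
```

That default can then be overridden at deploy time with serverless deploy --stage prod, which is what makes pipeline-driven promotion between environments straightforward.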

This means that I can build a full deployment pipeline around this within the likes of Jenkins that will automatically deploy to a test stage, run integration tests, and then push straight to production, with no added fuss. This is ideal.


Hopefully, you found this article useful! Having just recently started working with Serverless, I can already see massive potential with the framework and I very much look forward to doing more development with it in the future.

I’m currently trying to build my YouTube channel and hit the new 1k subscriber mark to remain a partner, if you want to support me then please feel free to head over there and subscribe:

I’d be really eager to hear the thoughts of anyone who has been using Serverless in production for a while now! Let me know either in the comments section below or through my Twitter: Elliot Forbes