Writing a self-sufficient AWS Lambda Function

Chathura Widanage (@cwidanage)

If you have already gone through SLAppForge’s news articles and blogs, you might already know that a bunch of engineers from Sri Lanka are working hard to make the lives of serverless developers a lot easier.

While we are experimenting with various AWS services, we try a lot of crazy things to make Sigma IDE the best friend of fellow serverless developers.

If you have used the AWS console to write a Lambda function, you might have noticed that it has a built-in IDE where you can code interpreted languages in the browser and deploy instantly. In the case of NodeJS, this functionality becomes useless when you want to use third-party libraries. (We have already solved this problem in Sigma though 😎). So the only option left, before we had Sigma, was writing the code on your local machine and uploading it as a ZIP file with the required dependencies (node_modules).

However, the following piece of code can solve this problem to a certain extent.

In this code, the first step is reading the code itself as a string and looking for the required dependencies!

Snake eating itself 🤢
If you are not already aware, each Lambda function gets a non-persistent 512 MB space at the /tmp location of the Lambda container.

The next step is executing npm install on all identified dependencies to generate the required node_modules folder.

Since /tmp is the only writable folder in the Lambda container, we have to use the --prefix flag as follows.

At the same time, you have to set the HOME environment variable of your Lambda container to /tmp.

Setting environment variables in Lambda

After downloading all dependencies, we change the reference of the require function to one of our own, which prepends /tmp/node_modules/ to all dependencies.
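A simplified sketch of that swapped-in require (the original gist rebinds require itself; this variant keeps a separate name and tries normal resolution first so built-ins keep working):

```javascript
// Resolve a module from the standard paths first; if that fails,
// fall back to the copy installed at runtime under /tmp/node_modules.
const tmpRequire = (name) => {
  try {
    return require(name); // built-ins and anything bundled with the function
  } catch (e) {
    return require('/tmp/node_modules/' + name);
  }
};
```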

Now, when a container for this particular Lambda is spawned for the first time, it is going to download the required node modules to the /tmp/node_modules directory. This takes a noticeable amount of time on the first run (cold start time is already a problem in Lambda, and this approach takes even more time), but from the second invocation onward (for this particular container), the execution time will be back to normal 😊.

First Run (Cold Start) — Billed Duration: 1100 ms
A warm run 🔥 — Billed Duration: 300 ms

Obviously this is not a production-ready solution, but it might pave the way to solving a burning problem in Sigma (hopefully). We will let you know what we solved once we add that feature to Sigma 🧐. You will definitely love it. Stay tuned & hope you enjoyed!

Update (2018/02/24)

Thank you very much Eric for catching this😊

Based on this suggestion, we can get rid of the initial code-parsing cycle as follows. This code loads dependencies dynamically at require time, while still keeping our Lambda function self-sufficient!!

However, for this particular example, you will hardly notice a difference in cold start and run times.

But that will be noticeable in the following example.

Difference between the two approaches

These two approaches behave differently at both startup time & run time.

In the first approach, since the function preloads (downloads) all required dependencies, the run time experiences a similar environment as if you had uploaded your Lambda function as a ZIP file (with the node_modules directory).

In the second approach, startup will be faster, but the run time will take longer (only the first time it encounters a unique dependency), since it has to download the dependencies.

Update (2018/04/26)

The Test Framework of Sigma IDE is implemented based on the above concepts.


