This paper is available on arxiv under CC BY-SA 4.0 DEED license.
Authors:
(1) Juan Mera Menéndez;
(2) Martin Bartlett.
Despite its already widespread popularity, the FaaS (Function as a Service) model continues to gain adoption: more and more developers and architects apply it in cloud solutions. The most extensively used FaaS service is AWS Lambda, provided by Amazon Web Services. Moreover, despite new trends in programming languages, Java still maintains a significant share of usage. The main problem that arises when using these two technologies together is widely known: significant latencies and the dreaded cold start. However, it is possible to greatly mitigate this problem without dedicating too much effort. In this article, various techniques, strategies and approaches are studied with the aim of reducing the cold start and significantly improving the performance of Lambda functions written in Java, starting from a system that involves AWS Lambda, Java, DynamoDB and API Gateway. Each approach is tested independently, analyzing its impact through load tests; subsequently, the approaches are tested in combination in an effort to achieve the greatest possible performance improvement.
Index Terms—AWS Lambda, AWS, Amazon Web Services, Java, performance, cold start, FaaS, Function as a Service, Serverless, GraalVM, SnapStart, JAVA_TOOL_OPTIONS
FOLLOWING the trend of recent years, the serverless approach [9] continues to gain popularity due to its scalability, flexibility, agility, and other well-known benefits. According to the survey conducted by Datadog in August 2023 [13], one of the most widely used services within this architectural model is AWS Lambda (provided by Amazon Web Services), and over 10% of Lambda functions are written in Java. Furthermore, Java is the primary language for 30% of professional developers [11].
With performance being one of the main criteria when comparing solutions, the survey shows that cold starts of Java Lambda functions are roughly twice as long as those of other languages such as Node.js and Python [3], [4] (the languages most used with AWS Lambda). It thus becomes evident why performance is the first aspect brought to the table when discussing the combination of Java with AWS Lambda. And rightfully so: if it is not taken into account when building the system, it is common to encounter time-out issues in API Gateway due to excessively high latency of the functions. Furthermore, another noteworthy benefit of improving the performance of Lambda functions is the reduction in their cost. The cost model for this service is based on computing time; by reducing this time, the cost is also reduced.
Thus, the primary goal of the study is to optimize the way Java code is executed on AWS Lambda and to provide a set of best practices that help this combination be competitive in terms of performance. To achieve that goal, a series of techniques, approaches and strategies are proposed with the aim of improving function performance and mitigating cold starts:
• Profile the function to determine which memory configuration is best suited in terms of performance.
• Use SnapStart to reduce cold-start latency.
• Choose the arm64 Lambda architecture over the default x86_64 architecture.
• Use the AWS SDK v2 for Java.
• Pre-compile the source code with GraalVM to avoid initializing all classes at runtime.
• Tune the Java Lambda function through customization settings: the JAVA_TOOL_OPTIONS environment variable.
• Other good practices.
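A pattern that underlies several of the techniques above is moving expensive initialization (SDK clients, connections) out of the handler body so it runs once per execution environment, and is captured by a SnapStart snapshot rather than repeated on every invocation. The sketch below illustrates the idea with plain Java; `OrderHandler` and `ExpensiveClient` are hypothetical stand-ins (a real function would initialize something like an AWS SDK v2 `DynamoDbClient` in the same position), so the example stays self-contained.

```java
// Hypothetical handler sketching the initialize-once pattern. In a real
// Lambda, the static field would hold e.g. an AWS SDK v2 DynamoDbClient.
class OrderHandler {
    // Created once at class-load time, when the execution environment (or
    // SnapStart snapshot) is initialized -- not on every invocation.
    private static final ExpensiveClient CLIENT = new ExpensiveClient();

    // Stands in for the Lambda handler method; warm invocations reuse CLIENT.
    public String handleRequest(String orderId) {
        return CLIENT.lookup(orderId);
    }

    // Stand-in for a dependency that is costly to construct.
    static class ExpensiveClient {
        static int constructions = 0;

        ExpensiveClient() {
            constructions++; // counts how many times initialization ran
        }

        String lookup(String id) {
            return "order-" + id;
        }
    }
}
```

Because the client lives in a static field, repeated calls to `handleRequest` (simulating warm invocations) never re-run the expensive construction.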
The improvement brought by each technique will be measured independently, since some of them are not compatible with each other; the advantages and drawbacks of each will be discussed as well. The implementation of every technique starts from an initial, unoptimized system, which is described in the following section. The performance of this baseline system has also been measured and is used to calculate the improvements achieved relative to it. Finally, the compatible techniques will be combined, and the result will be measured.
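Several of the techniques listed above are applied purely through function configuration. As a rough sketch (the function name `my-java-fn` is hypothetical; the flags are standard AWS CLI v2 options for Lambda, and the `JAVA_TOOL_OPTIONS` value shown is the commonly recommended tiered-compilation setting for short-lived functions), they could be set as follows:

```shell
# Switch the function to the arm64 architecture and pass JVM flags
# via the JAVA_TOOL_OPTIONS environment variable.
aws lambda update-function-configuration \
  --function-name my-java-fn \
  --architectures arm64 \
  --environment "Variables={JAVA_TOOL_OPTIONS=-XX:+TieredCompilation -XX:TieredStopAtLevel=1}"

# Enable SnapStart; note it applies only to published function versions.
aws lambda update-function-configuration \
  --function-name my-java-fn \
  --snap-start ApplyOn=PublishedVersions
```

Memory size (the first technique in the list) is likewise a configuration value, so profiling different settings only requires redeploying with a different `--memory-size` and re-running the load tests.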