Using AWS Lambda functions for micro jobs with no startup delay

Written by kavehmz | Published 2018/02/11

Background

  • Past: dealing with rigid bare-metal servers. Minimum granularity: a whole physical server.
  • Recently: setting up an auto-scaling environment using Kubernetes or similar tools, scaling up and down based on incoming requests. Minimum granularity: one VM.
  • Today: using AWS Lambda or Google Cloud Functions we can scale up to 10,000 CPUs in a few seconds and then scale back down to nothing. Minimum granularity: one core and 128 MB of RAM.

For both AWS Lambda and Google Cloud Functions there are two catches: they have a startup delay, near 10ms, and they bill with a minimum time granularity of 100ms.

It means that if I want to serve HTTP requests that take 12ms each, first I will face a delay of about 10ms just to start the function, and then both platforms will charge me for 100ms of time even though I only needed 12ms, roughly eight times the compute I actually used. This makes cloud functions unattractive for normal usage.

Here we will eliminate both issues.

Solution

The idea is to invoke a Lambda function, but instead of asking it to handle one request we keep it around to serve many more with minimal added delay. And because each function serves many requests, we might not care about the 100ms time granularity either.

The method is easy, but you need to know a bit about gRPC and bidirectional connections. Both are very simple concepts.

We create an HTTP server and also a gRPC server. When we have traffic we invoke one or more Lambda functions.

Those Lambda functions open a bidirectional connection back to our gRPC server, which runs alongside the HTTP server.

Now the HTTP server relays the load to the Lambda functions through gRPC and gets the results back.

We just need a mechanism to invoke enough Lambda functions to handle our traffic.
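
To make that concrete, here is a minimal sketch of what such an invocation mechanism could look like with the AWS SDK for Go. The function name, payload shape, and worker count are assumptions for illustration, not names used by jobber:

    package main

    import (
        "log"

        "github.com/aws/aws-sdk-go/aws"
        "github.com/aws/aws-sdk-go/aws/session"
        "github.com/aws/aws-sdk-go/service/lambda"
    )

    // invokeWorkers asynchronously starts n copies of a hypothetical worker function.
    func invokeWorkers(n int) {
        sess := session.Must(session.NewSession())
        svc := lambda.New(sess)
        for i := 0; i < n; i++ {
            _, err := svc.Invoke(&lambda.InvokeInput{
                FunctionName:   aws.String("jobber-worker"),              // hypothetical function name
                InvocationType: aws.String("Event"),                      // async: don't wait for the function to finish
                Payload:        []byte(`{"grpc_host":"10.0.0.5:50051"}`), // tell the worker where to connect back
            })
            if err != nil {
                log.Println("invoke failed:", err)
            }
        }
    }

    func main() {
        invokeWorkers(3)
    }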

I wrote a framework to help in this case: https://github.com/kavehmz/jobber

The solution does not depend on these particular tools, but I picked gRPC as the RPC framework, protobuf as the data interchange format, and Go to implement it.

Data interchange format

Protobuf is a simple format.

Payload definition is at payload/payload.proto.

If you just want to encode/decode your data, what is there is enough; otherwise edit payload.proto and regenerate the Go file.
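
For instance, if your application data is richer than a plain string, one option is to serialize it yourself into the existing Data field. A minimal sketch, assuming the generated package lives at github.com/kavehmz/jobber/payload and Task keeps its Data string field:

    package main

    import (
        "encoding/json"
        "log"

        "github.com/kavehmz/jobber/payload" // assumed import path of the generated package
    )

    // Order is a hypothetical application-level message.
    type Order struct {
        ID   int    `json:"id"`
        Item string `json:"item"`
    }

    func main() {
        // Encode the application data as JSON and carry it in the Task's Data field.
        raw, err := json.Marshal(Order{ID: 42, Item: "book"})
        if err != nil {
            log.Fatal(err)
        }
        task := &payload.Task{Data: string(raw)}

        // On the Lambda side, decode it back the same way.
        var got Order
        if err := json.Unmarshal([]byte(task.Data), &got); err != nil {
            log.Fatal(err)
        }
        log.Printf("decoded: %+v", got)
    }

If you need protobuf messages of your own instead, that is when you would edit payload.proto and regenerate.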

Lambda scheduler

To invoke Lambda functions you need to pass a scheduler to NewJobber.

I implemented two schedulers.

  • goroutine.Goroutine: a dummy scheduler which is there only for test purposes.
  • awslambda.LambdaScheduler: a simple scheduler which can invoke a Lambda function and send jobs to it.

    s := grpc.NewServer()
    taskMachine = jobber.NewJobber(jobber.Scheduler(&goroutine.Goroutine{GrpcHost: "localhost:50051"}))
    taskMachine.RegisterGRPC(s)

The scheduler needs to implement the following interface:

    interface {
        // Inbound is called before a new task is added.
        Inbound()
        // Done is called when a task is done
        Done()
        // Timedout is called when no response was received on time for a task
        Timedout()
    }
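
As a sketch of what a custom scheduler could look like, here is a minimal, hypothetical implementation of that interface which only tracks how many tasks are in flight. It is not the awslambda.LambdaScheduler that ships with jobber, which also takes care of invoking the function:

    package myscheduler

    import (
        "log"
        "sync/atomic"
    )

    // Counting is a hypothetical scheduler that only keeps an in-flight counter.
    // A real scheduler would use Inbound to decide when to invoke more Lambda functions.
    type Counting struct {
        inflight int64
    }

    // Inbound is called before a new task is added.
    func (c *Counting) Inbound() {
        n := atomic.AddInt64(&c.inflight, 1)
        log.Println("tasks in flight:", n)
    }

    // Done is called when a task is done.
    func (c *Counting) Done() {
        atomic.AddInt64(&c.inflight, -1)
    }

    // Timedout is called when no response was received on time for a task.
    func (c *Counting) Timedout() {
        atomic.AddInt64(&c.inflight, -1)
    }

You would then pass it in the same way as above: jobber.NewJobber(jobber.Scheduler(&myscheduler.Counting{})).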

Test run

The example directory includes a dummy test case and also an example of using the Lambda scheduler.

Just to see how it all works, simply do the following.

In one terminal run the example:

    $ go run example/goroutine/main.go
    2018/02/11 14:48:57 Start listening gRPC at 50051
    2018/02/11 14:49:17 minion: job inbound 0 0
    2018/02/11 14:49:17 worker[1]: Hi, I was invoked and I am trying to connect to accept jobs
    2018/02/11 14:49:17 worker[1]: I Joined the workforce
    2018/02/11 14:49:17 server: A new minion joined to help
    2018/02/11 14:49:17 server: got a job
    2018/02/11 14:49:17 worker[1]: received a task from server data:"This is the payload I will send to Lambda."
    2018/02/11 14:49:18 worker[1]: task is done
    2018/02/11 14:49:18 server: received the response
    2018/02/11 14:49:18 server: send the response
    2018/02/11 14:49:18 server: send the response back to client
    2018/02/11 14:49:18 minion: job done
    2018/02/11 14:49:18 Example: received data:"2018-02-11 14:49:18.623911211 +0100 CET m=+21.589500545" <nil>

In another terminal send a request:

    $ curl 'http://localhost:8000/'
    2018-02-11 14:49:18.623911211 +0100 CET m=+21.589500545

In the code, your request goes to the HTTP server. Your handler calls Do and waits for the response. Managing the Lambda functions and sending and receiving messages is done by jobber:

    resp, err := myJobber.Do(&payload.Task{Data: "This is the payload I will send to Lambda."})
    if err != nil {
        resp = &payload.Result{Data: "Because of error result was returned as nil"}
    }
    log.Println("Example: received", resp, err)
    fmt.Fprint(w, resp.Data)
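
Putting the pieces together, here is a rough sketch of a single process that serves both gRPC (for the workers) and HTTP (for clients). The ports, the *jobber.Jobber type name, and the scheduler import path are assumptions of this sketch; the example directory in the repo is the authoritative version:

    package main

    import (
        "fmt"
        "log"
        "net"
        "net/http"

        "github.com/kavehmz/jobber"                     // assumed import path
        "github.com/kavehmz/jobber/payload"             // assumed import path
        "github.com/kavehmz/jobber/scheduler/goroutine" // assumed import path of the goroutine scheduler
        "google.golang.org/grpc"
    )

    var myJobber *jobber.Jobber // assumed type name

    func handle(w http.ResponseWriter, r *http.Request) {
        // Relay the HTTP request to a Lambda worker through jobber and wait for the result.
        resp, err := myJobber.Do(&payload.Task{Data: "This is the payload I will send to Lambda."})
        if err != nil {
            resp = &payload.Result{Data: "Because of error result was returned as nil"}
        }
        fmt.Fprint(w, resp.Data)
    }

    func main() {
        // gRPC server that the Lambda workers connect back to.
        s := grpc.NewServer()
        myJobber = jobber.NewJobber(jobber.Scheduler(&goroutine.Goroutine{GrpcHost: "localhost:50051"}))
        myJobber.RegisterGRPC(s)

        lis, err := net.Listen("tcp", ":50051")
        if err != nil {
            log.Fatal(err)
        }
        go func() { log.Fatal(s.Serve(lis)) }()

        // HTTP server that receives client traffic on port 8000, as in the curl example above.
        http.HandleFunc("/", handle)
        log.Fatal(http.ListenAndServe(":8000", nil))
    }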

Test Lambda

If you are familiar with Lambda functions, setting one up is easy.

But notice that the Lambda functions need to connect back to the gRPC server that jobber depends on. So they must be in the same network (VPC), or they otherwise need access to your gRPC port.

You can see an example at example/lambda and a sample Lambda function which does nothing at example/aws_func.
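
For orientation, here is a minimal sketch of what the Lambda side's entry point might look like. This is not the code in example/aws_func; reading the gRPC address from an environment variable and the omitted worker loop are assumptions of this sketch:

    package main

    import (
        "context"
        "log"
        "os"

        "github.com/aws/aws-lambda-go/lambda"
        "google.golang.org/grpc"
    )

    // handler dials back to the jobber gRPC server and would then serve tasks
    // over jobber's bidirectional stream until the server lets it go.
    func handler(ctx context.Context) error {
        addr := os.Getenv("GRPC_HOST") // assumed way of passing the server address
        conn, err := grpc.DialContext(ctx, addr, grpc.WithInsecure(), grpc.WithBlock())
        if err != nil {
            return err
        }
        defer conn.Close()
        log.Println("connected back to", addr)
        // The task-processing loop depends on jobber's generated gRPC client
        // and is omitted here; see example/aws_func in the repo.
        return nil
    }

    func main() {
        lambda.Start(handler)
    }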

