How to Add a Cache Layer to Serverless GraphQL AppSync API

by Yi Ai, April 7th, 2019

In this post, I show an example of a simple application based on the Serverless Framework. The application uses AWS AppSync, Lambda, DynamoDB, and Amazon DynamoDB Accelerator (DAX) to demonstrate how to use DAX as a caching layer in front of AWS AppSync resolvers.

Amazon DynamoDB Accelerator (DAX) is a fully managed, highly available, in-memory cache for DynamoDB that delivers up to a 10x performance improvement — from milliseconds to microseconds — even at millions of requests per second.

A simple serverless CMS

The example application in this post is a simple CMS. I use Serverless Framework templates to simplify the setup for AppSync. The sections in the template create the AppSync API, DynamoDB table, Cognito identity pool and user pool, DAX cluster, IAM roles, security groups, and subnet groups, and you can use them for automated, repeatable deployments.

The following diagram illustrates the application architecture:

Getting started with Serverless Framework

To start, you need to have the Serverless Framework installed:

$ npm install -g serverless

Now, let’s create a new serverless project:

$ sls create --template aws-nodejs --path simpleCMS
$ cd simpleCMS
$ npm init

The directory that is created includes two files: handler.js contains the Lambda function, and serverless.yml contains the configuration of the backend.

Add Plugins

You need to add two Serverless Framework plugins: serverless-appsync-plugin and serverless-pseudo-parameters:

$ npm install serverless-pseudo-parameters
$ npm install serverless-appsync-plugin

Edit the serverless.yml file and add both to the plugins section:

plugins:
  - serverless-appsync-plugin
  - serverless-pseudo-parameters

Define the AWS AppSync Schema

Let’s look at the schema. Schema files are text files, usually named schema.graphql. The sample schema looks like the following:

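This is a minimal sketch inferred from the mapping templates shown later in this post; the full schema in the GitHub repo may define more fields and operations.

# Sketch only: operation and field names are inferred from the VTL templates below.
type Post {
  id: ID!
  title: String!
  content: String!
  createdAt: String
}

input PostInput {
  title: String!
  content: String!
}

type Query {
  getPost(id: ID!): Post
}

type Mutation {
  addPost(post: PostInput!): Post
}

schema {
  query: Query
  mutation: Mutation
}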

Create Mapping Templates

Now that we have our schema defined, we need to add resolvers for it. The mapping template files should be located in a directory called mapping-templates, relative to the serverless.yml file.

Pipeline resolvers

The AppSync team released pipeline resolvers last year. Pipeline resolvers offer the ability to execute operations against multiple data sources in sequence.

In this example, two pipeline functions are attached to the Query and Mutation APIs. The pipeline resolver runs the following logic:

  • Connect to the Lambda data source and get/put the item from/to DAX.
  • If the Lambda function raises an error, connect to the DynamoDB data source and get/put the item from/to DynamoDB directly.

Let’s create the request template for the addPost mutation in a file called mapping-templates/Mutation-addPost-request.vtl. Note that it is a pipeline resolver: it saves the arguments to the context stash and invokes the attached pipeline functions.

$util.qr($context.stash.put("id", $util.autoId()))
$util.qr($context.stash.put("title", $ctx.args.post.title))
$util.qr($context.stash.put("createdAt", $util.time.nowISO8601()))
$util.qr($context.stash.put("content", $ctx.args.post.content))
{}

Then create the template for the first pipeline function, mapping-templates/Function-addPostViaDax-request.vtl, which invokes a Lambda function to insert the post through DAX.

{
    "version": "2017-02-28",
    "operation": "Invoke",
    "payload": {
        "field": "addPostViaDax",
        "arguments": $utils.toJson($context.stash)
    }
}

Next, let’s create the second pipeline function template, mapping-templates/Function-addPostToDB-request.vtl, which inserts the post into DynamoDB directly if the previous Lambda function failed.

#if($ctx.prev.result.errorMessage)
{
    "version" : "2017-02-28",
    "operation" : "PutItem",
    "key" : {
        "id" : $util.dynamodb.toDynamoDBJson($context.stash.id)
    },
    "attributeValues" : {
        "title" : $util.dynamodb.toDynamoDBJson($context.stash.title),
        "content" : $util.dynamodb.toDynamoDBJson($context.stash.content),
        "createdAt" : $util.dynamodb.toDynamoDBJson($context.stash.createdAt)
    }
}
#else
    #return($ctx.prev.result)
#end
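
Each resolver and pipeline function also needs a response mapping template. A minimal pass-through template that simply serializes the result is usually enough here; I refer to it as mapping-templates/common-response.vtl for illustration (the repo may use per-field response files):

## Sketch only: a shared pass-through response template.
$util.toJson($ctx.result)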

Define Dependent Services

Now I need a DynamoDB table, an AppSync API, an Amazon Cognito user pool and identity pool, a DAX cluster in a VPC, a security group, and some IAM roles to hook them all together. These are placed in the resources section of serverless.yml.

Define the AppSync GraphQL API

Let’s define the AWS AppSync resource. Add the following example config to the custom section of serverless.yml:

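This is a trimmed sketch of a serverless-appsync-plugin configuration. The data source names, template file names, and the CognitoUserPool logical ID are illustrative; the complete config is in the GitHub repo.

custom:
  # Sketch only: names and file paths are illustrative, not the repo's exact values.
  postsTable: simple-cms-posts
  appSync:
    name: simpleCMS
    authenticationType: AMAZON_COGNITO_USER_POOLS
    userPoolConfig:
      awsRegion: "#{AWS::Region}"
      defaultAction: ALLOW
      userPoolId: { Ref: CognitoUserPool }
    schema: schema.graphql
    dataSources:
      - type: AWS_LAMBDA
        name: lambdaDatasource
        config:
          functionName: graphql   # the Lambda defined in the functions section
      - type: AMAZON_DYNAMODB
        name: postsTableDatasource
        config:
          tableName: ${self:custom.postsTable}
    mappingTemplates:
      - type: Mutation
        field: addPost
        kind: PIPELINE
        request: Mutation-addPost-request.vtl
        response: common-response.vtl
        functions:
          - addPostViaDax
          - addPostToDB
    functionConfigurations:
      - name: addPostViaDax
        dataSource: lambdaDatasource
        request: Function-addPostViaDax-request.vtl
        response: common-response.vtl
      - name: addPostToDB
        dataSource: postsTableDatasource
        request: Function-addPostToDB-request.vtl
        response: common-response.vtl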

As you can see, each query and mutation has an entry in the mapping templates.

Define the DynamoDB table

The next section of this code example creates a DynamoDB table.

  PostsTable:
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: ${self:custom.postsTable}
      AttributeDefinitions:
        - AttributeName: id
          AttributeType: S
      KeySchema:
        - AttributeName: id
          KeyType: HASH
      ProvisionedThroughput:
        ReadCapacityUnits: 5
        WriteCapacityUnits: 1
      StreamSpecification:
        StreamViewType: NEW_IMAGE

This table has only a single hash key. The ProvisionedThroughput ReadCapacityUnits are kept low because DAX serves most of the read traffic. DynamoDB is called only if DAX has not cached the item.

Define the DAX cluster

Now let’s create the DAX cluster. The template sets up the following resources:

  • A DAX cluster with one t2.small node.
  • A security group with a rule that allows Lambda to send traffic to DAX on TCP port 8111.
  • An IAM role that allows DAX to access DynamoDB.
  • A VPC that this security group belongs to. (Security groups control how network traffic is allowed to flow in a VPC.)

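The sketch below shows what those resources might look like in the resources section. The daxVpc reference, CIDR range, and IAM policy scope are illustrative; DaxVpcSecurityGroup, daxSubnet1, and daxSubnet2 match the names referenced in the provider section below, and the complete template is in the GitHub repo.

resources:
  Resources:
    # Sketch only: daxVpc, the CIDR range, and the policy scope are illustrative.
    DaxVpcSecurityGroup:
      Type: AWS::EC2::SecurityGroup
      Properties:
        GroupDescription: Allow Lambda to reach DAX on TCP 8111
        VpcId: { Ref: daxVpc }
        SecurityGroupIngress:
          - IpProtocol: tcp
            FromPort: 8111
            ToPort: 8111
            CidrIp: 10.0.0.0/16
    DaxServiceRole:
      Type: AWS::IAM::Role
      Properties:
        AssumeRolePolicyDocument:
          Version: "2012-10-17"
          Statement:
            - Effect: Allow
              Principal: { Service: dax.amazonaws.com }
              Action: sts:AssumeRole
        Policies:
          - PolicyName: DaxDynamoDBAccess
            PolicyDocument:
              Version: "2012-10-17"
              Statement:
                - Effect: Allow
                  Action: "dynamodb:*"
                  Resource:
                    Fn::GetAtt: [PostsTable, Arn]
    DaxSubnetGroup:
      Type: AWS::DAX::SubnetGroup
      Properties:
        SubnetGroupName: dax-subnet-group
        SubnetIds:
          - Ref: daxSubnet1
          - Ref: daxSubnet2
    DaxCluster:
      Type: AWS::DAX::Cluster
      Properties:
        ClusterName: posts-cache
        NodeType: dax.t2.small
        ReplicationFactor: 1
        IAMRoleARN:
          Fn::GetAtt: [DaxServiceRole, Arn]
        SubnetGroupName: { Ref: DaxSubnetGroup }
        SecurityGroupIds:
          - Fn::GetAtt: [DaxVpcSecurityGroup, GroupId]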

Put Lambda into the DAX VPC

To properly connect AWS Lambda to DAX, you need to deploy your Lambda functions into the same VPC as DAX. Configure the provider section of the serverless.yml file:

provider:
  vpc:
    securityGroupIds:
      - Fn::GetAtt: [DaxVpcSecurityGroup, GroupId]
    subnetIds:
      - Ref: daxSubnet1
      - Ref: daxSubnet2

You can find the other parts of the AWS configuration, such as authentication, the schema, and the data sources, in my GitHub repo.

The code

The lambda function code is in handler.js. The Lambda handler uses environment variables for configuration: POSTS_TABLE_NAME is the name of the table containing post data, and DAX_ENDPOINT is the DAX cluster endpoint. These variables are configured automatically in the serverless.yml.
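
In serverless.yml, that wiring could look something like this (the function name graphql and the DaxCluster logical ID are illustrative; check the repo for the exact names):

functions:
  graphql:
    handler: handler.graphql   # exported handler in handler.js (name is illustrative)
    environment:
      POSTS_TABLE_NAME: ${self:custom.postsTable}
      DAX_ENDPOINT:
        Fn::GetAtt: [DaxCluster, ClusterDiscoveryEndpoint]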

When there is a mutation or query request, AppSync invokes the Lambda data source to insert or get data through DAX. If the Lambda function raises an error, the pipeline resolver moves on to the next pipeline function, which connects to the DynamoDB data source and gets/puts the item from/to DynamoDB directly.

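The following is a condensed sketch of that handler. It assumes the event shape produced by the Lambda request templates above (a field name plus the stashed arguments) and uses the amazon-dax-client package, which lets the regular DocumentClient API talk to DAX. The getPost field name is an assumption, and the full handler is in the GitHub repo.

// handler.js (sketch): routes AppSync pipeline-function invocations through DAX.
// POSTS_TABLE_NAME and DAX_ENDPOINT come from the Lambda environment variables.
const AWS = require('aws-sdk');
const AmazonDaxClient = require('amazon-dax-client');

const dax = new AmazonDaxClient({
  endpoints: [process.env.DAX_ENDPOINT],
  region: process.env.AWS_REGION,
});
// The DocumentClient API is unchanged; only the underlying service is DAX.
const docClient = new AWS.DynamoDB.DocumentClient({ service: dax });

module.exports.graphql = async (event) => {
  const table = process.env.POSTS_TABLE_NAME;

  switch (event.field) {
    case 'addPostViaDax': {
      const item = {
        id: event.arguments.id,
        title: event.arguments.title,
        content: event.arguments.content,
        createdAt: event.arguments.createdAt,
      };
      // Write-through: DAX forwards the put to DynamoDB and caches the item.
      await docClient.put({ TableName: table, Item: item }).promise();
      return item;
    }
    case 'getPost': {
      // Reads are served from the DAX item cache when possible.
      const result = await docClient
        .get({ TableName: table, Key: { id: event.arguments.id } })
        .promise();
      return result.Item;
    }
    default:
      throw new Error(`Unknown field: ${event.field}`);
  }
};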

Deploy your API

To deploy your API, run the following:

$ serverless deploy -v
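
Once the stack is up, you can exercise the API from the AppSync console or any GraphQL client. A mutation along these lines, matching the sketched schema above, should create a post (the exact operation shape depends on your schema):

# Sketch only: field names follow the schema sketch earlier in this post.
mutation AddPost {
  addPost(post: { title: "Hello DAX", content: "Reads served from cache" }) {
    id
    title
    createdAt
  }
}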

Conclusion

In this post, I showed how to use the Serverless Framework to create an AppSync Lambda data source that uses DAX and DynamoDB to implement a simple query and mutation. I hope you have found this article useful. The full code example is available on my GitHub repo.