
A Guide to Building an AWS Serverless API and CICD Pipeline

by Puneet Chandel, October 27th, 2023




Overview

This article demonstrates the development of a Serverless API utilizing AWS services and establishes a Continuous Integration/Continuous Deployment (CICD) pipeline within the AWS environment.


Part 1: Covers creating a Lambda function that handles requests from API Gateway and persists the data in DynamoDB, using the AWS Serverless Application Model (SAM).


Part 2: Details the steps to configure a CodeCommit repository in AWS and set up a CICD pipeline that automatically starts a build whenever new changes are pushed to the repository.

Prerequisites

For this project, you'll need an AWS account (Free Tier is sufficient). The following AWS components will be utilized:


  • AWS API Gateway
  • AWS Lambda
  • AWS DynamoDB
  • AWS CodeCommit
  • AWS CodeBuild
  • AWS CodePipeline
  • AWS S3
  • Miscellaneous: CloudWatch, IAM, etc.


Ensure you have your local development environment set up as follows:


  1. Install AWS CLI: Follow the official AWS guide to install the AWS Command Line Interface.


  2. Install AWS SAM (Serverless Application Model): Install the SAM CLI by following the official installation instructions.


  3. Choose an IDE: Use IntelliJ or a similar IDE for development; I prefer IntelliJ.


  4. Maven for Packaging: Make sure you have Maven installed for packaging your application.


  5. Docker (optional): Install Docker if you plan to test Lambda functions locally with the SAM CLI.


These tools and components will form the foundation of the project.
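
Once everything is installed, you can quickly sanity-check the toolchain from a terminal (version numbers will vary, and Docker is only required if you plan to test Lambda functions locally):

$ aws --version
$ sam --version
$ mvn -v
$ docker --version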

Part 1: Development

In this section, we will create a starter project using AWS SAM, complete the handler class, build and deploy it to AWS, and test the API using Postman.


Environment Setup

  1. AWS Setup:

    Go to the AWS console at https://aws.amazon.com/console/ and log in with your admin user credentials.

    1. Create a User in IAM:
      • In IAM, create a user dedicated to local development and CLI/SAM usage.
    2. Create a DynamoDB Table:
      • Name: "users", Partition key: "id" (Type: String)
  2. Configure AWS CLI on Local Machine:

    • Open a terminal and run $ aws configure
    • Provide the access key ID and secret access key for the IAM user created earlier, plus your default region and output format.
  3. Initialize a Project using AWS Serverless Application Model (SAM):

    • Open your terminal and run $ sam init
    • Choose the AWS quick start template.
    • Opt for a "Hello World" example.
    • Select Java 11 or 17, package type as zip, and use Maven as the dependency manager.
    • Enable logging and monitoring with CloudWatch and X-Ray. (A non-interactive equivalent of these choices is shown after this list.)
  4. Rename the Project: Rename the project to your preferred name.

  5. Open the Project in IntelliJ: Launch IntelliJ, and open the project.

  6. Add Dependencies to pom.xml:

    • Add the necessary dependencies to the pom.xml file. You only need to add the DynamoDB SDK; the other dependencies are already present in the SAM starter project.

        <dependencies>
            <dependency>
                <groupId>com.amazonaws</groupId>
                <artifactId>aws-lambda-java-core</artifactId>
                <version>1.2.2</version>
            </dependency>
            <dependency>
                <groupId>com.amazonaws</groupId>
                <artifactId>aws-lambda-java-events</artifactId>
                <version>3.11.0</version>
            </dependency>
            <dependency>
                <groupId>junit</groupId>
                <artifactId>junit</artifactId>
                <version>4.13.2</version>
                <scope>test</scope>
            </dependency>
            <dependency>
                <groupId>com.amazonaws</groupId>
                <artifactId>aws-java-sdk-dynamodb</artifactId>
                <version>1.12.573</version>
            </dependency>
        </dependencies>
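
For reference, most of the interactive choices from step 3 can also be passed to sam init directly. A non-interactive sketch that mirrors those choices (the project name is a placeholder, flag values may vary slightly between SAM CLI versions, and the X-Ray/CloudWatch options are easiest to enable interactively):

$ sam init --runtime java11 --dependency-manager maven \
      --app-template hello-world --package-type Zip \
      --name pc-aws-user-api --no-interactive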
      



Write the Handler Class

For the Lambda function, edit the handler class that SAM generated and add the following code. It is intentionally simple; for real projects, you may want to structure the code more modularly.

public class UserRequestHandler implements RequestHandler<Map<String,String>, Map<String,String>> {

    private AmazonDynamoDB amazonDynamoDB;
    private String DYNAMODB_TABLE_NAME = "users";
    private Regions REGION = Regions.US_EAST_1;

    @Override
    public Map<String,String> handleRequest(Map<String,String> input, Context context) {
        this.initDynamoDbClient();
        LambdaLogger logger = context.getLogger();
        logger.log("Input payload:" + input.toString());
        String userId = UUID.randomUUID().toString();

        String firstName= input.get("firstName");
        String lastName= input.get("lastName");

        Map<String, AttributeValue> attributesMap = new HashMap<>();

        attributesMap.put("id", new AttributeValue(userId));
        attributesMap.put("firstName", new AttributeValue(firstName));
        attributesMap.put("lastName", new AttributeValue(lastName));

        logger.log(attributesMap.toString());

        amazonDynamoDB.putItem(DYNAMODB_TABLE_NAME, attributesMap);


        Map<String, String> response = new HashMap<>();
        response.put("id", userId);
        response.put("firstName", firstName);
        response.put("lastName", lastName);

        return response;

    }

    private void initDynamoDbClient() {
        this.amazonDynamoDB = AmazonDynamoDBClientBuilder.standard()
                .withRegion(REGION)
                .build();
    }
}
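
One optional refinement (my suggestion, not part of the article's walkthrough): handleRequest calls initDynamoDbClient() on every invocation, so the DynamoDB client is rebuilt for each request. Initializing the client once as a field lets warm Lambda invocations reuse it. A minimal sketch of that variation, keeping the same table name, region, and handler signature:

import java.util.HashMap;
import java.util.Map;
import java.util.UUID;

import com.amazonaws.regions.Regions;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import com.amazonaws.services.dynamodbv2.model.AttributeValue;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

public class UserRequestHandler implements RequestHandler<Map<String, String>, Map<String, String>> {

    private static final String DYNAMODB_TABLE_NAME = "users";

    // Built once when the Lambda container initializes; reused across warm invocations.
    private final AmazonDynamoDB amazonDynamoDB = AmazonDynamoDBClientBuilder.standard()
            .withRegion(Regions.US_EAST_1)
            .build();

    @Override
    public Map<String, String> handleRequest(Map<String, String> input, Context context) {
        context.getLogger().log("Input payload:" + input.toString());

        // Generate an id and map the incoming fields to DynamoDB attributes.
        String userId = UUID.randomUUID().toString();
        Map<String, AttributeValue> item = new HashMap<>();
        item.put("id", new AttributeValue(userId));
        item.put("firstName", new AttributeValue(input.get("firstName")));
        item.put("lastName", new AttributeValue(input.get("lastName")));

        amazonDynamoDB.putItem(DYNAMODB_TABLE_NAME, item);

        // Echo the stored values back to the caller.
        Map<String, String> response = new HashMap<>();
        response.put("id", userId);
        response.put("firstName", input.get("firstName"));
        response.put("lastName", input.get("lastName"));
        return response;
    }
}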


Update the SAM Template File

The SAM template drives how the application is built and deployed to AWS. Update it for this project; the key elements to focus on are the Lambda function definition (name, handler, runtime) and the API Gateway event that exposes the endpoint.


AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: >
  pc-aws-user-api

  Sample SAM Template for pc-aws-user-api

# More info about Globals: https://github.com/awslabs/serverless-application-model/blob/master/docs/globals.rst
Globals:
  Function:
    Timeout: 20
    MemorySize: 128

    Tracing: Active
  Api:
    TracingEnabled: true
Resources:
  UserRequestHandlerLambdaFunction:
    Type: AWS::Serverless::Function # More info about Function Resource: https://github.com/awslabs/serverless-application-model/blob/master/versions/2016-10-31.md#awsserverlessfunction
    Properties:
      CodeUri: PcAwsUsersApi
      Handler: com.pc.aws.users.UserRequestHandler::handleRequest
      Runtime: java11
      Architectures:
      - x86_64
      MemorySize: 512
      Environment: # More info about Env Vars: https://github.com/awslabs/serverless-application-model/blob/master/versions/2016-10-31.md#environment-object
        Variables:
          PARAM1: VALUE
          JAVA_TOOL_OPTIONS: -XX:+TieredCompilation -XX:TieredStopAtLevel=1 # More info about tiered compilation https://aws.amazon.com/blogs/compute/optimizing-aws-lambda-function-performance-for-java/
      Events:
        PcUsers:
          Type: Api # More info about API Event Source: https://github.com/awslabs/serverless-application-model/blob/master/versions/2016-10-31.md#api
          Properties:
            Path: /users
            Method: post

  ApplicationResourceGroup:
    Type: AWS::ResourceGroups::Group
    Properties:
      Name:
        Fn::Sub: ApplicationInsights-SAM-${AWS::StackName}
      ResourceQuery:
        Type: CLOUDFORMATION_STACK_1_0
  ApplicationInsightsMonitoring:
    Type: AWS::ApplicationInsights::Application
    Properties:
      ResourceGroupName:
        Ref: ApplicationResourceGroup
      AutoConfigurationEnabled: 'true'
Outputs:
  # ServerlessRestApi is an implicit API created out of Events key under Serverless::Function
  # Find out more about other implicit resources you can reference within SAM
  # https://github.com/awslabs/serverless-application-model/blob/master/docs/internals/generated_resources.rst#api
  PcAwsUsersApi:
    Description: API Gateway endpoint URL for Prod stage
    Value: !Sub "https://${ServerlessRestApi}.execute-api.${AWS::Region}.amazonaws.com/Prod/users/"
  UserRequestHandlerLambdaFunction:
    Description: Lambda Function ARN
    Value: !GetAtt UserRequestHandlerLambdaFunction.Arn
  UserRequestHandlerLambdaFunctionIamRole:
    Description: Implicit IAM Role created
    Value: !GetAtt UserRequestHandlerLambdaFunctionRole.Arn


Build and Deploy Code using SAM

In IntelliJ, open the terminal, and execute the following commands:

$ sam build

$ sam deploy --guided


When prompted, provide the Stack name as "PcAwsUsersApi", and choose the default options.
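
The guided deploy walks through a short series of prompts; the exact wording varies by SAM CLI version, but they look roughly like this:

Stack Name [sam-app]: PcAwsUsersApi
AWS Region [us-east-1]:
Confirm changes before deploy [y/N]:
Allow SAM CLI IAM role creation [Y/n]:
UserRequestHandlerLambdaFunction may not have authorization defined, Is this okay? [y/N]: y
Save arguments to configuration file [Y/n]: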


The output will show the CloudFormation stack that was created, along with the API Gateway endpoint URL.

Test the API

Before testing the API, grant DynamoDB access to the Lambda role created by SAM.

  • Open the Lambda function in the AWS Console.


  • Navigate to the "Permissions" configuration and note the execution role name.


  • Attach the "AmazonDynamoDBFullAccess" managed policy to this role.


This step ensures that the Lambda function has the necessary permissions to interact with DynamoDB.
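
Alternatively, instead of attaching a policy by hand in the console, SAM can grant the access at deploy time through a policy template on the function. A sketch of that optional addition to the Properties section of template.yaml (DynamoDBCrudPolicy is one of SAM's built-in policy templates; it scopes access to the named table):

      Policies:
        - DynamoDBCrudPolicy:
            TableName: users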




Navigate to API Gateway, then follow these steps to obtain the URL for your service:

  1. Go to API Gateway in the AWS Console.

  2. Select your API.

  3. In the left sidebar, click on "Stages."

  4. Under "Stages", select the "Prod" stage.

  5. In the Stage Editor section, you'll find the "Invoke URL". Copy this URL.



  • Launch the Postman application.
  • Create a POST Request: Set up a new request with the following details:
    • Method: POST
    • URL: Paste the API Gateway endpoint URL obtained earlier.
  • Set Headers: Add any necessary headers (e.g., Content-Type: application/json).


  • Add Request Body: Create a JSON object with the user data you want to send, i.e., the firstName and lastName fields the handler reads; this is the data that will be stored in DynamoDB. An example request is shown below.
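
For example, a body with the two fields the handler reads might be {"firstName": "Jane", "lastName": "Doe"} (placeholder values). The same call can be made from a terminal with curl, substituting the Invoke URL you copied earlier:

$ curl -X POST \
    -H "Content-Type: application/json" \
    -d '{"firstName": "Jane", "lastName": "Doe"}' \
    https://<your-api-id>.execute-api.us-east-1.amazonaws.com/Prod/users

A successful call returns the generated id along with the first and last name, and the item appears in the users table in DynamoDB.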


Part 2: Implement the CICD Pipeline

In this section, we'll demonstrate how to build a CICD pipeline using AWS CodeCommit, CodeBuild, and CodePipeline. The pipeline will initiate a build process that fetches code from the repository, builds it using the build file, and triggers CloudFormation to create the stack.


Create CodeCommit Repository

Log in to AWS and search for the CodeCommit service.


Create a new repository in AWS CodeCommit to store your project's source code.

Commit Code to Repository

In IntelliJ, open the terminal and enter the following commands to commit the code created in Part 1:

$ git init → This initializes a local Git repository
$ git add . → This stages all files
$ git commit -m "commit to CodeCommit"


Push Changes to the remote Repo

Copy the CodeCommit repository URL from the AWS console.

$ git remote add origin <repo URL>

$ git push --set-upstream origin master → This will prompt for the HTTPS Git credentials generated for your IAM user
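
If you would rather not manage separate Git credentials, the git-remote-codecommit helper (installable with pip) authenticates using the AWS CLI profile configured earlier; a sketch, with the region and repository name as placeholders:

$ pip install git-remote-codecommit
$ git remote set-url origin codecommit::us-east-1://<your-repo-name>
$ git push --set-upstream origin master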



Create AWS CodeBuild project

Set up a CodeBuild project that specifies how to build your application. This includes defining build environments, build commands, and dependencies.


  • Create an S3 bucket to store the build artifacts.


  • Search for AWS CodeBuild, create a new project, and select CodeCommit as the code source and S3 for artifact storage.


To set up CodeBuild, you'll need to create a build specification file named buildspec.yml in your project directory. This file will contain the build commands and instructions for CodeBuild.


You can refer to the official AWS buildspec documentation for detailed instructions; the buildspec.yml used for this project is shown below.


version: 0.2

phases:
  install:
    runtime-versions:
      java: corretto11
  pre_build:
    commands:
      - echo Nothing to do in the pre_build phase...
  build:
    commands:
      - echo Build started on `date`
      - sam build
      - sam package --output-template-file pcoutputtemplate.yaml  --s3-bucket com-appsdev-pc-001
  post_build:
    commands:
      - echo Build completed on `date`
artifacts:
  files:
    - pcoutputtemplate.yaml


Commit and push the new buildspec.yml to the repository. Note that sam package uploads the built artifacts to the S3 bucket and writes a packaged template (pcoutputtemplate.yaml) that references them, so replace the --s3-bucket value with the name of the bucket you created earlier.


After this, you can run the CodeBuild project to verify that the code is pulled from the repository and the artifacts are built.


Create a CodePipeline

  • Create a new pipeline in AWS CodePipeline.


  • Connect the pipeline to your CodeCommit repository as the source.


  • Configure the pipeline to use CodeBuild as the build stage.


  • Add a deploy stage that triggers CloudFormation with the packaged template (pcoutputtemplate.yaml) produced by the build. Since this is a SAM template, the stack requires the CAPABILITY_IAM and CAPABILITY_AUTO_EXPAND capabilities.


The pipeline will now automatically build and deploy the code on every commit pushed to the repository.



Test the Pipeline


  • Open your project in IntelliJ.


  • Make a small change in one of your files, for example, add another logger line.


  • Commit the change to your local Git repository.


  • Push the commit to the CodeCommit repository.


$ git branch CR01
$ git checkout CR01
$ git add .
$ git commit -m "CR01"
$ git push --set-upstream origin CR01

You can also create a pull request in AWS CodeCommit; for simplicity, I am merging from local.

$ git checkout master
$ git merge --ff-only CR01
$ git push
  • Go to the AWS CodePipeline console.


  • You should see the pipeline start automatically. It will pull the latest code from the repository, build the project using CodeBuild, and deploy it using CloudFormation.


  • Monitor the progress in the CodePipeline dashboard.


This will simulate the process of making changes to your code, pushing them to the repository, and having the CICD pipeline automatically trigger and deploy the updated code.

By harnessing AWS API Gateway, Lambda, DynamoDB, CodeCommit, CodeBuild, and CodePipeline, this article demonstrated how to build a resilient, automated serverless API and deployment process.


Thank you for reading. May your serverless endeavors be met with success and innovation!