The AWS CDK is a software development framework for defining cloud infrastructure as code and provisioning it through CloudFormation. The CDK integrates fully with AWS services and lets developers use high-level constructs to define cloud infrastructure in code. In this article, we will build a CDK version of the AWS EC2 Instance Scheduler solution, which enables us to easily configure custom start and stop schedules for our Amazon EC2 and Amazon RDS instances.

The Architecture

At the end of this article, we will have a pipeline that deploys a serverless solution to start and stop Amazon EC2 instances in an Auto Scaling group and Amazon RDS instances based on a schedule. Here's the architecture diagram:

Throughout the article we will create the following resources:

- Deploy an AWS Step Function with two parallel tasks.
- Create an SNS topic to send notifications.
- Create a CloudWatch Events rule which will trigger the Step Function on a schedule.
- Create a CI/CD pipeline with CodeBuild and CodePipeline.

Prerequisites

To deploy the CDK application, there are a few prerequisites that need to be met:

- Set up an AWS account.
- Install the latest aws-cli.
- Install the AWS CDK CLI.
- Deploy a multi-AZ WordPress website with RDS (optional).

Before you begin

First, create an AWS CDK project by entering the following commands at the command line.

```shell
$ mkdir cdk-sample
$ cd cdk-sample
$ cdk init --language=javascript
```

Next, install the CDK modules we will use in our project.

```shell
$ npm install @aws-cdk/core @aws-cdk/aws-codebuild @aws-cdk/aws-codepipeline \
    @aws-cdk/aws-codepipeline-actions @aws-cdk/aws-events @aws-cdk/aws-events-targets \
    @aws-cdk/aws-iam @aws-cdk/aws-lambda @aws-cdk/aws-sns @aws-cdk/aws-sns-subscriptions \
    @aws-cdk/aws-ssm @aws-cdk/aws-stepfunctions @aws-cdk/aws-stepfunctions-tasks
```

We need to add a stage parameter because we want to deploy our stack to multiple stages (dev and production). Let's add DEPLOY_ENV in bin/cdk-sample.js.
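Stage selection in the entry point is just an environment-variable lookup with a 'dev' fallback. Here's a minimal standalone illustration of that pattern (the `resolveStage` helper is hypothetical, written only for this example):

```javascript
// Minimal illustration of the stage-selection pattern used in bin/cdk-sample.js:
// read DEPLOY_ENV from the environment, falling back to 'dev'.
function resolveStage(env) {
  return env.DEPLOY_ENV || 'dev';
}

console.log(resolveStage({}));                           // prints "dev"
console.log(resolveStage({ DEPLOY_ENV: 'production' })); // prints "production"
```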
```javascript
#!/usr/bin/env node
const cdk = require('@aws-cdk/core');
const BaseStackConstruct = require('../lib/base-stack');

const deployEnv = process.env.DEPLOY_ENV || 'dev';
const app = new cdk.App();

new BaseStackConstruct(app, `CDKSample`, {
  env: { region: 'ap-southeast-2' },
  stage: deployEnv,
});
```

Define the base stack class

Now let's define our base stack class. In the base stack class, we'll add the code to instantiate three separate stacks: the SNS stack, the Step Function stack and the CodePipeline stack. Make the code look like the following.

```javascript
const { Construct } = require('@aws-cdk/core');
const SNSAlertStac = require('./sns-stack');
const StepfunctionStack = require('./lambda-stack');
const CodePipelineStack = require('./codepipeline-stack');
const lambda = require('@aws-cdk/aws-lambda');

module.exports = class BaseStackConstruct extends Construct {
  constructor(scope, id, props) {
    super(scope, id, props);

    // stack for SNS
    const sNSAlertStac = new SNSAlertStac(scope, `SNSAlert-${props.stage}`);

    // Deploy step functions from CodePipeline
    if (!process.env.MANUAL_DEPLOY) {
      const cfnParametersCode = lambda.Code.fromCfnParameters();
      new StepfunctionStack(scope, `Stepfunction-${props.stage}`, {
        snsTopic: sNSAlertStac.topic,
        lambdaCode: cfnParametersCode,
      });
      new CodePipelineStack(scope, `CodepipelienStack-${props.stage}`, {
        lambdaCode: cfnParametersCode,
        stage: props.stage,
      });
    } else {
      // deploy step function from local env
      new StepfunctionStack(scope, `Stepfunction-${props.stage}`, {
        snsTopic: sNSAlertStac.topic,
      });
    }
  }
};
```

We can set the optional MANUAL_DEPLOY environment variable to true if we want to deploy only the step function locally.

```shell
$ export MANUAL_DEPLOY=true && cdk deploy Stepfunction-dev
```

Define SNS stack

We'll add the code (lib/sns-stack.js) to create an SNS topic and subscribe an email address to the created topic.
```javascript
const { Stack } = require('@aws-cdk/core');
const sns = require('@aws-cdk/aws-sns');
const subs = require('@aws-cdk/aws-sns-subscriptions');

const EMAIL_SUBSCRIPTION = 'test@test.com';

module.exports = class SNSAlertStac extends Stack {
  constructor(scope, id, props) {
    super(scope, id, props);
    this.topic = new sns.Topic(this, 'AlertTopic', {
      displayName: 'Step function execution failed',
    });
    this.topic.addSubscription(new subs.EmailSubscription(EMAIL_SUBSCRIPTION));
  }
};
```

Define Step function and Lambdas

Now we'll expand our lib/lambda-stack.js file and add the Lambda functions and the Step Function. In this example we will create two Lambda functions: updateScalingGroupFn to update the Auto Scaling group, and updateDBClusterFn to start/stop RDS instances.

```javascript
const { Stack, ScopedAws, Duration } = require('@aws-cdk/core');
const { Function, Runtime, Code } = require('@aws-cdk/aws-lambda');
const { PolicyStatement } = require('@aws-cdk/aws-iam');
const { SfnStateMachine } = require('@aws-cdk/aws-events-targets');
const { Rule, Schedule } = require('@aws-cdk/aws-events');
const ssm = require('@aws-cdk/aws-ssm');
const sfn = require('@aws-cdk/aws-stepfunctions');
const tasks = require('@aws-cdk/aws-stepfunctions-tasks');
const { join } = require('path');

module.exports = class StepfunctionStack extends Stack {
  constructor(scope, id, props) {
    super(scope, id, props);

    // retrieving autoscaling group name and rds db cluster name from SSM
    const autoScalingGroupName = ssm.StringParameter.valueForStringParameter(
      this,
      '/cdksample/autoscalling/name'
    );
    const dBInstanceName = ssm.StringParameter.valueForStringParameter(
      this,
      '/cdksample/dbcluster/name'
    );
    const { accountId, region } = new ScopedAws(this);

    // creating lambda functions
    const updateScalingGroupFn = new Function(this, 'updateScalingGroupFn', {
      runtime: Runtime.NODEJS_12_X,
      handler: 'lambda_update_asg/index.handler',
      code: props.lambdaCode || Code.fromAsset(join(__dirname, '../src')),
      environment: {
        autoScalingGroupName,
      },
    });

    const updateDBClusterFn = new Function(this, 'updateDBCluster', {
      runtime: Runtime.NODEJS_12_X,
      handler: 'lambda_update_db_cluster/index.handler',
      code: props.lambdaCode || Code.fromAsset(join(__dirname, '../src')),
      environment: {
        dBInstanceName,
      },
    });

    // IAM policies for updateScalingGroupFn lambda
    const statementUpdateASLGroup = new PolicyStatement();
    statementUpdateASLGroup.addActions('autoscaling:UpdateAutoScalingGroup');
    statementUpdateASLGroup.addResources(
      `arn:aws:autoscaling:${region}:${accountId}:autoScalingGroup:*:autoScalingGroupName/${autoScalingGroupName}`
    );
    updateScalingGroupFn.addToRolePolicy(statementUpdateASLGroup);

    const statementDescribeASLGroup = new PolicyStatement();
    statementDescribeASLGroup.addActions('autoscaling:DescribeAutoScalingGroups');
    statementDescribeASLGroup.addResources('*');
    updateScalingGroupFn.addToRolePolicy(statementDescribeASLGroup);

    // IAM policies for updateDBCluster lambda
    const statementDescribeDBCluster = new PolicyStatement();
    statementDescribeDBCluster.addActions('rds:DescribeDBInstances');
    statementDescribeDBCluster.addResources('*');
    updateDBClusterFn.addToRolePolicy(statementDescribeDBCluster);

    const statementToggleDBCluster = new PolicyStatement();
    statementToggleDBCluster.addActions('rds:StartDBInstance', 'rds:StopDBInstance');
    statementToggleDBCluster.addResources('*');
    updateDBClusterFn.addToRolePolicy(statementToggleDBCluster);
  }
};
```

Next, we will continue by adding our Step Function definitions. Add the following code to lib/lambda-stack.js, inside the constructor after the IAM statements.
```javascript
    const updateScalingGroupTask = new tasks.LambdaInvoke(this, 'Update asg task', {
      lambdaFunction: updateScalingGroupFn,
    });

    const updateDBClusterTask = new tasks.LambdaInvoke(this, 'StopStart db cluster task', {
      lambdaFunction: updateDBClusterFn,
    });

    const sendFailureNotification = new tasks.SnsPublish(this, 'Publish alert notification', {
      topic: props.snsTopic,
      message: sfn.TaskInput.fromDataAt('$.error'),
    });

    const stepChain = new sfn.Parallel(this, 'Stop and Start EC2 Instances and RDS in parallel')
      .branch(updateScalingGroupTask)
      .branch(updateDBClusterTask)
      .addCatch(sendFailureNotification);

    const toggleAWSServices = new sfn.StateMachine(this, 'StateMachine', {
      definition: stepChain,
      timeout: Duration.minutes(5),
    });
```

As we can see, for each Lambda there is a corresponding task. We only want to parallelize the Lambda workload (one task for the EC2 Auto Scaling group and another task for the RDS instances), so we add each task to the workflow by calling the branch() function.

To send a notification to the Amazon SNS topic if a Lambda fails, we add an error handling chain to the workflow by calling the addCatch() function.

Finally, let's define a CloudWatch Events rule that triggers execution of the state machine workflow every day at 7am and 6pm (UTC). Add the following code to lib/lambda-stack.js.

```javascript
    new Rule(this, 'Rule', {
      schedule: Schedule.expression('cron(0 7,18 * * ? *)'),
      targets: [new SfnStateMachine(toggleAWSServices)],
    });
```

Define pipeline stack

We define our final stack, codepipeline-stack. It has a source action targeting the GitHub repository, a build action that builds the previously defined stacks, and finally a deploy action that uses AWS CloudFormation. It takes the CloudFormation templates (cdk.out/*.template.json) generated by the AWS CDK build action and passes them to AWS CloudFormation for deployment.

Create lib/codepipeline-stack.js and put the following code in it.
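One detail worth flagging before the listing: the buildspec's version field is computed from the version string in package.json rather than hard-coded. A standalone illustration of that expression (assuming package.json's version is '0.2', a hypothetical value):

```javascript
// Standalone illustration of the buildspec `version` expression used in the
// pipeline stack below. Assumes packageJson.version is '0.2'.
const version = '0.2';

// split('.')[0]        -> '0'
// split('.').slice(-2) -> ['0', '2']; join('') -> '02'; unary + -> 2
const buildSpecVersion = `${version.split('.')[0]}.${+version.split('.').slice(-2).join('')}`;

console.log(buildSpecVersion); // prints "0.2"
```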
```javascript
const cdk = require('@aws-cdk/core');
const codebuild = require('@aws-cdk/aws-codebuild');
const codepipeline = require('@aws-cdk/aws-codepipeline');
const codepipeline_actions = require('@aws-cdk/aws-codepipeline-actions');
const ssm = require('@aws-cdk/aws-ssm');
const packageJson = require('../package.json');

class CodePipelineStack extends cdk.Stack {
  constructor(scope, id, props) {
    super(scope, id, props);

    const sourceOutput = new codepipeline.Artifact();
    const cdkBuildOutput = new codepipeline.Artifact('CdkBuildOutput');
    const lambdaBuildOutput = new codepipeline.Artifact('LambdaBuildOutput');
    const branch = props.stage === 'dev' ? 'dev' : 'master';

    new codepipeline.Pipeline(this, 'Pipeline', {
      pipelineName: `DeployScheduleService-${props.stage}`,
      stages: [
        {
          stageName: 'Source',
          actions: [
            new codepipeline_actions.GitHubSourceAction({
              actionName: 'Code',
              output: sourceOutput,
              oauthToken: ssm.StringParameter.valueForStringParameter(this, '/github/token'),
              owner: 'yai333',
              branch,
              repo: 'cdkexample',
            }),
          ],
        },
        {
          stageName: 'Build',
          actions: [
            new codepipeline_actions.CodeBuildAction({
              actionName: 'Build_CDK_LAMBDA',
              project: new codebuild.PipelineProject(this, 'Build', {
                buildSpec: codebuild.BuildSpec.fromObject({
                  version: `${packageJson.version.split('.')[0]}.${+packageJson.version
                    .split('.')
                    .slice(-2)
                    .join('')}`,
                  phases: {
                    install: {
                      commands: 'npm install',
                    },
                    build: {
                      commands: `export DEPLOY_ENV=${props.stage} && npm run cdk synth`,
                    },
                    post_build: {
                      commands: ['cd src && npm install', 'ls src -la'],
                    },
                  },
                  // save the generated files in the correct artifacts
                  artifacts: {
                    'secondary-artifacts': {
                      CdkBuildOutput: {
                        'base-directory': 'cdk.out',
                        files: ['**/*'],
                      },
                      LambdaBuildOutput: {
                        'base-directory': 'src',
                        files: ['**/*'],
                      },
                    },
                  },
                }),
              }),
              input: sourceOutput,
              outputs: [cdkBuildOutput, lambdaBuildOutput],
            }),
          ],
        },
        {
          stageName: 'Deploy',
          actions: [
            new codepipeline_actions.CloudFormationCreateUpdateStackAction({
              actionName: 'Deploy_SNS_Stack',
              templatePath: cdkBuildOutput.atPath(`SNSAlert-${props.stage}.template.json`),
              stackName: `SNSStack-${props.stage}`,
              adminPermissions: true,
            }),
            new codepipeline_actions.CloudFormationCreateUpdateStackAction({
              actionName: 'Deploy_Lambda_Stack',
              templatePath: cdkBuildOutput.atPath(`Stepfunction-${props.stage}.template.json`),
              stackName: `StepfunctionStack-${props.stage}`,
              adminPermissions: true,
              parameterOverrides: {
                ...props.lambdaCode.assign(lambdaBuildOutput.s3Location),
              },
              extraInputs: [lambdaBuildOutput],
              runOrder: 2,
            }),
          ],
        },
      ],
    });
  }
}

module.exports = CodePipelineStack;
```

Next, create a dev branch, check the code into Git, then push it to the GitHub repo.

```shell
$ git branch dev
$ git checkout dev
$ git add .
$ git commit -m "xxxx"
$ git push
```

Deploying the pipeline

Now we can deploy the pipeline to multiple stages.

Deploy the pipeline to the dev stage; the source action targets the GitHub repository's dev branch.

```shell
$ export DEPLOY_ENV=dev && cdk deploy CodepipelienStack-dev
```

Deploy the pipeline to the production stage; the source action targets the GitHub repository's master branch.

```shell
$ export DEPLOY_ENV=production && cdk deploy CodepipelienStack-production
```

After the deployment finishes, we should have a three-stage pipeline that looks like the following.

Once all stacks have deployed, we can explore them in the AWS Console and give the solution a try. Navigate to Step Functions in the console and click "Start execution". We should see it pass. Let's check the EC2 Auto Scaling group's DesiredCapacity and the RDS instance's status.

```shell
$ aws rds describe-db-instances
{
    "DBInstances": [
        {
            "DBInstanceIdentifier": "cxxxxxx",
            "DBInstanceClass": "db.t2.small",
            "Engine": "mysql",
            "DBInstanceStatus": "stopped",
            ...

$ aws autoscaling describe-auto-scaling-groups
{
    "AutoScalingGroups": [
        {
            "AutoScalingGroupName": "cdk-sample-WebServerGroup-xxxxxx",
            "MinSize": 0,
            "MaxSize": 0,
            "DesiredCapacity": 0,
            ...
```

Finally, let's check the CloudWatch Events rule. We should see that it looks like the following.

```shell
$ aws events list-rules
{
    "Rules": [
        {
            "Name": "StepfunctionStack-dev-Rulexxxxxxx",
            "Arn": "arn:aws:events:ap-southeast-2:xxxxxxx:rule/StepfunctionStack-dev-Rulexxxxxxx-xxxxxxx",
            "State": "ENABLED",
            "ScheduleExpression": "cron(0 7,18 * * ? *)",
            "EventBusName": "default"
        }
    ]
}
```

That's about it. Thanks for reading! I hope you have found this article useful. You can find the complete project in my GitHub repo.