Photo by Tim Easley on Unsplash

The world is being eaten by CRUD APIs, so why not learn to build one? Projects similar to this one are a very common interview take-home assignment. In this tutorial we'll build an API using Express.js, backed by a MongoDB database, all deployed with Docker Compose and tested with Mocha and Travis CI. The design of this API and its various components is loosely based on the 12 Factor App Methodology.

Setup

You'll need Node.js, Docker, and Docker Compose installed. Additionally you'll require a free GitHub account and a free Travis CI account. This tutorial assumes a basic familiarity with JavaScript (some ES6 features like arrow functions will also be used) and Bash (or the shell for your OS). All of the code for this tutorial can also be found on GitHub.

First let's create a new directory (mkdir crud-api) and cd into it. Then create a new Node.js package inside with npm init, and initialize a new Git repository with git init. Next we'll install npm dependencies using npm i --save express mongodb body-parser and npm i --save-dev mocha tape supertest. After that, create a directory structure that looks like this:

crud-api
├── docker/
├── models/
├── routes/
└── tests/

The next couple of steps are auxiliary environment and ignore files. First let's create a .dockerignore and a .gitignore. The Docker ignore file tells your Docker builds which files not to include, while the Git ignore does the same for your Git commits. Mine look like the ones in the example repository; typically you'd ignore things like node_modules and local environment files. After those you'll need a .env file that looks like my example.env. This provides information to Docker Compose that is essential for our application but should never be stored in version control. Optionally, but highly recommended, you could also create a README.md and a LICENSE. Let's write some code!

Write the Server

We'll be building this Express server across three files: index.js, routes/routes.js, and models/Document.js. Express.js is a minimalist Node.js web framework built around the concept of middleware: as a request comes into the server it flows through each middleware in order until it either hits the end and errors out, or a function does some computation based on it and returns. Let's start off with index.js.

This should look familiar if you've worked with Express.js before, but we'll go through it section by section. First come the imports; body-parser may not be an obvious package, but it's a very important piece of middleware that parses the body of incoming requests to make them easier to work with inside our routes in the next section. Following that we set our database name based on the development or production environment, assign the MongoDB URL from environment variables, and set options for our MongoDB client. The first option uses the newer URL parser (otherwise you'll get a method deprecation error) and the other two govern behavior when our database client loses its connection. Next we import our router, set our server port from the environment (defaulting to 80), and the next two lines are boilerplate Express setup.

Now we get into the meat of our server, the middleware stack. You generally want parsers like body-parser first in an API so every request gets parsed. Then the request hits our router; if it matches any of the routes described inside, the corresponding function triggers and all is well. If no routes are matched, our server returns a 404 Not Found.

Our final section connects a MongoClient to our MongoDB instance.
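Putting those pieces together, here is a minimal sketch of what an index.js along these lines might look like. It is not the exact file from the article's repository: the environment variable names (PORT, MONGO_URL), the 'ready' event name, and the reconnect values are assumptions on my part, and it targets the 3.x MongoDB driver that was current when this was written.

```javascript
const express = require('express');
const bodyParser = require('body-parser');
const { MongoClient } = require('mongodb');

const routes = require('./routes/routes');

// Database name depends on the environment.
const dbName = process.env.NODE_ENV === 'dev' ? 'dev' : 'prod';
const url = process.env.MONGO_URL; // e.g. mongodb://database:27017
const options = {
  useNewUrlParser: true,   // silences the URL parser deprecation error
  reconnectTries: 60,      // retry a lost connection this many times...
  reconnectInterval: 1000, // ...waiting this many ms between attempts
};

const port = process.env.PORT || 80;
const app = express();

// Parsers first, so every incoming request body gets parsed.
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));

// The router handles everything under /api.
app.use('/api', routes);

// Nothing matched: 404 Not Found.
app.use((req, res) => res.status(404).json({ error: 'Not Found' }));

// Connect to MongoDB, then start the server.
MongoClient.connect(url, options, (err, client) => {
  if (err) {
    // Log the error and exit rather than run without a database.
    console.error(err);
    process.exit(1);
  }
  app.locals.db = client.db(dbName); // server-global database handle
  app.listen(port, () => {
    app.emit('ready'); // tells the test suite the server has started
  });
});

module.exports = app;
```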
Always handle your errors in some manner; in this case we want to log the error and exit. After the error handling we assign our database connection to a server-global variable and start the server. The app.emit call alerts our test suite when the server has properly started, and we export the server so we can import it in our tests.

Once you've written all of that, let's build out our routes!

routes/routes.js

In this file we initialize and export an Express Router that contains all of our API routes. Each route takes the form of the router object followed by the HTTP method. Inside each route we then define the path and an arrow function to handle the request (req) and response (res). The third parameter is conventionally called next and is used when a handler does not return and instead wants to pass the request further down the middleware stack. Inside each handler we start by getting the MongoDB connection from our server-level variable, req.app.locals.db, followed by the collection we want; in this case we only have one collection, documents. Then in each case we call a method to return some data or an error from the database. In order, here are the methods and their corresponding routes:

find() gets all documents in the collection — /documents/all
findOne() gets a specific document, in this case based on a document id provided by the client — /documents/:id
insertOne() uploads a new document into the database — /documents/new
deleteOne() removes a document based on a document id provided by the client — /documents/delete/:id
updateOne() changes a document based on a JSON request body sent by the client — /documents/edit/:id

A colon in front of a word in a route denotes a parameter that is accessed inside the handlers using req.params. An _id is automatically assigned to each document by MongoDB, which is why we don't have a unique identifier or other primary key in our data model. After we've made a database query we either get back data (result) or an error (err). Our error-handling behavior is to use the res object to send back an HTTP 400 and JSON that contains the error; otherwise we send back an HTTP 200 and the result of the query.

After that we export our router object so we can use it in index.js as routing middleware for the /api route. This means that the full path for every API route starts with /api/documents/.
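Here is a hedged sketch of a routes/routes.js matching that description. The HTTP verb for each route is my guess, as is the "No documents in database" check, which I inferred from the curl example near the end of this article:

```javascript
const express = require('express');
const { ObjectId } = require('mongodb');

const router = express.Router();

// Shared response handler: HTTP 400 plus the error as JSON on failure,
// HTTP 200 plus the query result on success.
const respond = (res) => (err, result) => {
  if (err) return res.status(400).json({ error: err.message });
  return res.status(200).json(result);
};

// find(): all documents in the collection.
router.get('/documents/all', (req, res) => {
  req.app.locals.db.collection('documents').find({}).toArray((err, docs) => {
    if (err) return res.status(400).json({ error: err.message });
    if (docs.length === 0) {
      return res.status(400).json({ error: 'No documents in database' });
    }
    return res.status(200).json(docs);
  });
});

// findOne(): a single document by its MongoDB-assigned _id.
router.get('/documents/:id', (req, res) => {
  req.app.locals.db
    .collection('documents')
    .findOne({ _id: new ObjectId(req.params.id) }, respond(res));
});

// insertOne(): upload a new document from the JSON request body.
router.post('/documents/new', (req, res) => {
  req.app.locals.db.collection('documents').insertOne(req.body, respond(res));
});

// deleteOne(): remove a document by id.
router.delete('/documents/delete/:id', (req, res) => {
  req.app.locals.db
    .collection('documents')
    .deleteOne({ _id: new ObjectId(req.params.id) }, respond(res));
});

// updateOne(): change a document by id using the JSON request body.
router.put('/documents/edit/:id', (req, res) => {
  req.app.locals.db
    .collection('documents')
    .updateOne({ _id: new ObjectId(req.params.id) }, { $set: req.body }, respond(res));
});

module.exports = router;
```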
Finally, we move on to defining our data model.

models/Document.js

This file should be fairly legible: it's a JavaScript class with a constructor that takes three strings and stores them. This acts as the schema for the data in MongoDB.

With the server taken care of, let's move on to deploying it with Docker!

Dockerize Your API

First we need to write the Dockerfile.production for our production build. We start by basing our container on the node:10.12.0-alpine image. The first layer after that downloads a script that waits for arbitrary services to start, so that services don't come online without their dependencies, and we make it executable with chmod +x. After that we set our working directory. Next we let our application know that we are in production by assigning the NODE_ENV environment variable (in our server we only check whether NODE_ENV === 'dev', but we might want to explicitly check for 'prod' later on). Before copying other files we copy our package.json and install our dependencies so that Docker can cache them for subsequent builds. Following that we take our port as an ARG, fed from docker-compose.yml and .env, and expose it to the internal Docker network. After that we copy in the rest of our files. Finally we execute our script to wait for MongoDB to come online and then run our server.

Our Dockerfile.test is similar to the first; in fact it's based on the first. After pulling the production container we install our dev-dependencies, including our test runner Mocha. Finally, we run our test suite (--exit specifies behavior from an older Mocha version, so Mocha exits once it's finished). The rest of this file should be almost the same as the production version.

Now that we have both of our Dockerfiles in place, let's build the docker-compose.yml that brings all of our containers online. We're using version 3.0 of the Compose file format, which is somewhat different from version 2.x, so be careful when viewing Stack Overflow or other tutorials. Our main section is services and consists of our Node.js/Express.js backend and our MongoDB database. In backend we first encounter build; since our Compose file and Dockerfiles live in docker/, we need to set the context to the project root, specify a Dockerfile relative to that root, and inject our port environment variable. Next we load our .env containing vital application secrets not meant to be committed to Git. We want our server to fail fast and gracefully, so we have Docker Compose always restart it. Next we bind our container's exposed port. Finally, WAIT_HOSTS is necessary for the script that ensures our server doesn't come online before our database.

Coming up we have our database service, which is based on an image published on Docker Hub instead of a local Docker image that needs to be built. Once again we import our .env for this section. Then we mount a place to store data so it persists between containers. Next we expose MongoDB's port to the internal Docker network, but not to the host OS. Finally, we issue a command so Docker Compose can start our MongoDB instance.

The test Compose file has only a couple of differences, but they're fairly important. First we change the Dockerfile to Dockerfile.test. Second we use a different data storage location for testing, so as not to contaminate our production data.

At last we come to our npm scripts. These are just aliases to docker-compose commands so we don't have to type long commands and have a single place to change a command if necessary. The -f flag indicates the location of a docker-compose.yml, since we aren't storing ours in the project root. The -d flag after up backgrounds the process once all containers come online or fail.

Once you've completed all of that, it's time to move on to our final section and test our code and deployment.

Test Your Code

We're going to write one integration test; for an actual production application you probably want a series of unit tests in addition to at least one integration test. For unit tests you'd need a unique it() function for each test, with a beforeEach() function to insert a test document and an afterEach() function to remove it after each test completes.

Let's start by writing a before() function that waits for the app.emit in index.js indicating that our server has successfully started. Once it has, we call the done() callback so Mocha knows to move on to the tests.

We start with a describe() block which contains a single it() function, since we only have one test. Each sequential step is described as a test() function. In order, we insert a new document, get all documents and store a document id to use for the other steps, get a specific document, update a specific document, and finally delete a specific document.
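As a rough sketch of the shape of this integration test, simplified to plain supertest calls chained inside a single it() rather than the article's test() helpers (the 'ready' event name, the document fields, and the response shapes are assumptions carried over from the earlier sketches):

```javascript
const assert = require('assert');
const request = require('supertest');

const server = require('../index');

describe('documents API', function () {
  this.timeout(10000);

  // Wait for index.js to emit its "server started" event before testing.
  before((done) => {
    server.on('ready', () => done());
  });

  it('can insert, list, fetch, update, and delete a document', async () => {
    // Insert a new document.
    await request(server)
      .post('/api/documents/new')
      .send({ title: 'test', body: 'hello', author: 'me' })
      .expect(200);

    // Get all documents and keep an id for the remaining steps.
    const all = await request(server).get('/api/documents/all').expect(200);
    const id = all.body[0]._id;

    // Get that specific document.
    const one = await request(server).get(`/api/documents/${id}`).expect(200);
    assert.strictEqual(one.body._id, id);

    // Update it.
    await request(server)
      .put(`/api/documents/edit/${id}`)
      .send({ title: 'updated' })
      .expect(200);

    // Delete it.
    await request(server).delete(`/api/documents/delete/${id}`).expect(200);
  });
});
```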
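The last piece is the Travis CI configuration, which the next paragraph walks through. A hedged sketch of a .travis.yml along these lines (the npm script names and exact keys are assumptions; adjust them to your own package.json):

```yaml
sudo: required
language: node_js
node_js:
  - "10"

# Travis doesn't start most services by default, so start Docker explicitly.
services:
  - docker

before_install:
  # Not strictly necessary, but handy when debugging version mismatches.
  - docker --version
  - docker-compose --version
  # Provide the secrets file the build expects but Git doesn't track.
  - cp example.env .env
  # The article also stops some unneeded services here to speed up the build.

script:
  - npm run build
  - npm run test
```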
Once we've written all of our tests we can move on to our Travis CI configuration. Travis doesn't start most services by default, so we explicitly start Docker. Then we check our Docker version (not strictly necessary, but handy for debugging a version mismatch or a Docker daemon that isn't running), copy our example.env to .env so our build runs correctly, and stop some unnecessary services so our build and tests run faster. After that we use our npm scripts to build our production containers and use those as a source to run our tests.

Once you've written all of that you should be able to push to GitHub and check your repository's page on Travis CI for a passing build! To run your services locally, type npm run build and, once that's finished, npm run production. Once those have completed you can run docker ps to show all running containers, which should include your MongoDB container and a container named docker_backend. You can now run curl localhost:80/api/documents/all, which should return {"error": "No documents in database"}. I recommend embedding the current build state in your README.md using the following Markdown snippet:

[![Build Status](https://travis-ci.org/<your GitHub username>/<your repository name>.svg?branch=master)](https://travis-ci.org/<your GitHub username>/<your repository name>)

Thanks for reading, and please leave a clap or several if this tutorial was helpful to you!

Joe Cieslik is the CEO of Whiteboard Dynamics, a full stack development team specializing in functional programming and Android. You can hire me or my team to build your next killer app at whiteboarddynamics.co.