One of my favorite parts of my job as a Developer Evangelist at Stream is building sample applications. It's an enthralling way to engage and interact with potential and existing customers, as well as show off the fun technology we use and build with every single day. The applications I build range from small code snippets outlining how to perform basic operations, such as marking an item as "read" in Stream, to large microservice-based applications that generally require robust backend architectures, like Winds.

Last year, I created a post on how to build a RESTful API with Restify. Now that Restify and Express are nearly neck and neck in terms of requests per second, I thought it might be interesting to show you how I go about structuring my APIs with Express (just to toss in a little friendly competition / play devil-oper's advocate 😉).

Structuring Your API

The way you choose to structure your API is one of the most important decisions you'll make. You must ensure that it's smart, flexible, and easy to use; this is a must. If it's not easy to use, other developers will not understand what you're building, nor will they be able to figure out how to build on top of it. Think before you build (I know, planning sucks, especially when you're excited to get going, but it *pays off*).

├── build.sh
├── dist
│   ├── …
├── package.json
├── src
│   ├── config
│   │   └── index.js
│   ├── controllers
│   │   ├── …
│   ├── models
│   │   ├── …
│   ├── package.json
│   ├── routes
│   │   ├── …
│   ├── server.js
│   ├── utils
│   │   ├── …

All source code is stored in /src. It's compiled down from ES6+ to ES5 into the /dist directory for execution on the server. You're probably asking yourself why you'd take the extra step to write in something that's just going to be compiled down. Good question! ES6+ standards provide some pretty killer additional functionality, such as arrow functions, modified scoping, destructuring, rest/spread parameter handling, and more!
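To make those ES6+ features concrete, here's a quick, self-contained illustration (the sample data is made up for this snippet):

```javascript
// Arrow function with an implicit return
const double = (n) => n * 2;

// Block-scoped const/let instead of function-scoped var,
// holding some made-up sample data
const users = [
  { name: 'Nick', role: 'evangelist' },
  { name: 'Ada', role: 'engineer' },
];

// Destructuring pulls fields out of the first array element in one step
const [{ name: firstName }] = users;

// Rest parameters collect arguments; spread expands arrays back out
const sum = (...nums) => nums.reduce((total, n) => total + n, 0);
const moreNums = [3, 4];

console.log(double(2));              // 4
console.log(firstName);              // Nick
console.log(sum(1, 2, ...moreNums)); // 10
```

All of this runs natively in modern Node.js, but compiling to ES5 keeps it working in older runtimes too.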
Let's have a look at the compilation that takes place in the build.sh file. That is ALL you need to be able to write in a super awesome language while having it still be supported in all the usual places! That said, the script *may* look like gibberish, so let's break it down 🤓:

1. `#!/bin/bash`: Denotes that this is an executable bash file.
2. `rm -rf dist && mkdir dist`: Removes the /dist directory if it exists (cleanup), then creates a new /dist directory.
3. `npx babel src --out-dir dist --ignore node_modules`: Compiles every file to ES5 and moves the files to the /dist directory, with the exception of node_modules (those are already compiled).
4. `cp src/package.json dist`: By design, Babel doesn't copy JSON files, so we need to copy package.json over ourselves using the cp command.
5. `cd dist && yarn install --production --modules-folder node_modules`: Moves into the /dist directory and installs the npm modules using yarn.

Running the build is as simple as executing build.sh from your terminal. Or, if you are like me and enjoy automating everything, you can create an npm script that calls it. Note: You will need to ensure that the build.sh file is executable first.

The Main File

The following file, server.js, contains the most important logic and sits at the top level of our codebase. The beginning portion imports all of the necessary npm modules, followed by our config and logger utility.

Next, we take advantage of the Express use method to invoke several of our imported middleware libraries (cors, compression, body-parser, and our API). **Please note** that there are several other middleware libraries that we include for additional functionality (e.g. email, logging, JWT authentication, etc.). Last but not least, after a bit of initialization, we dynamically include all routes and pass the context to each route for binding.
Note: The custom logger module can be used with most logging services (Papertrail, Loggly, etc.). For this demo, as well as other projects, I like to use Papertrail. You may need to adjust the settings and ENV variables if you use something other than Papertrail.

Routing

To keep things tidy and organized, all routing logic (e.g. GET /users) is kept in its own route file inside of a /routes directory.

The contents of a route file hold all references to the controllers for GET, POST, PUT, and DELETE operations. This works because we import and reference the User Controller, passing along the necessary parameters and/or data with every API call.

Controllers

Controllers include the database model associated with the data they will be handling, receive data from the routes, and then make an informed decision on how to handle that data. Finally, the controllers communicate through the models, which in turn talk to the database and return a status code with a payload. If you're a visual person: the client calls the API, a route hands the request to a controller, the controller talks to the database through a model, and a status code plus payload flows back out.

Mongoose Models (MongoDB)

Mongoose is a wonderful ODM (Object Data Modeling) library for Node.js and MongoDB. If you're familiar with the reference ORM (Object-Relational Mapping) libraries for Node.js, such as Sequelize and Bookshelf, Mongoose is pretty straightforward. The massive benefit of Mongoose is how easy it is to structure MongoDB schemas: there's no need to fuss around with custom business logic. What's even more exciting are the many goodies, like middleware, plugins, object population, and schema validation, that are either baked in or one yarn (I love yarn) or npm install away. It's truly remarkable how popular the project has become among developers who use MongoDB.
When it comes to Mongoose models, I tend to keep things somewhat flat (a maximum of 3 deeply nested objects) to avoid confusion; the user model in a project currently under development here at Stream follows exactly that rule.

Note: When it comes to hosting and running MongoDB, I like to use MongoDB Atlas. It's a database-as-a-service offering provided by the makers of MongoDB themselves. If you don't want to use a free MongoDB Atlas instance, you're welcome to use a local version. Additionally, if you want to monitor your data, MongoDB Compass is an excellent choice!

Utilities

Custom utilities can be used for a variety of things: basically, anything you want. I generally reserve them for separating concerns and keeping my code clean. Some examples include establishing database connections, sending emails, logging to an external service, and even communicating with HTTP-based services here at Stream. I'm often asked when to turn something into a utility, and my answer is always the same: 1) when you find yourself reusing code, or 2) when you're jamming third-party services into code where it just doesn't feel right.

One example is a utility I wrote to call the Stream Personalization REST API; the whole integration was completed in about a dozen lines of code and can then be called from any file in the project.

Final Thoughts

APIs are the building blocks of modern applications. They govern how one application talks to another, as well as to the database. While we have other flavors of API structures (GraphQL, etc.), RESTful APIs continue to pull their own weight and aren't going anywhere soon.

If you're interested in seeing a fully built-out skeleton for a REST API built with Node.js, Express, Mongoose, and MongoDB, head over to this GitHub repo.

As always, if you have any questions, please don't hesitate to reach out to me on Twitter or below in the comments. Thank you!