Most software developers encounter three main problems: naming things, caching, and off-by-one errors. 🤦🏻♂️ In this tutorial, we'll deal with caching. We'll walk through how to implement RESTful request caching with Redis. We'll also set up and deploy this system easily with Heroku.

For this demo, we'll build a Node.js application with the Fastify framework, and we'll integrate caching with Redis to reduce certain types of latency.

Ready to dive in? Let's go!

## Node.js + Fastify + long-running tasks

As I'm sure readers know, Node.js is a very popular platform for building web applications. With its support for JavaScript (or TypeScript, or both at the same time!), Node.js allows you to use the same language for both the frontend and the backend of your application. It also has a rich event loop that makes asynchronous request handling more intuitive. The concurrency model in Node.js is very performant, able to handle upwards of 15,000 requests per second. But even then, you might still run into situations where the request latency is unacceptably high. We'll show this with our application.

As you follow along, you can always browse the codebase for this mini demo at my GitHub repository.

## Initialize the basic application

By using Fastify, you can quickly get a Node.js application up and running to handle requests. Assuming you have Node.js installed, you'll start by initializing a new project. We'll use npm as our package manager.
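If you're starting from an empty directory, initializing the project might look like this (`-y` simply accepts npm's default prompts; adjust as you like):

```
~/project$ npm init -y
```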
After initializing a new project, we install our Fastify-related dependencies:

```
~/project$ npm i fastify fastify-cli fastify-plugin
```

Then, we update our package.json file to add two scripts and turn on the ES module syntax. We make sure to have the following lines:

```json
"type": "module",
"main": "app.js",
"scripts": {
  "start": "fastify start -a 0.0.0.0 -l info app.js",
  "dev": "fastify start -p 8000 -w -l info -P app.js"
},
```

From there, we create our first file (routes.js) with an initial route:

```js
// routes.js
export default async function (fastify, opts) {
  fastify.get("/api/health", async (_, reply) => {
    return reply.send({ status: "ok" });
  });
}
```

Then, we create our app.js file that prepares a Fastify instance and registers the routes:

```js
// app.js
import routes from "./routes.js";

export default async (fastify, opts) => {
  fastify.register(routes);
};
```

These two simple files—our application and our route definitions—are all we need to get up and running with a small Fastify service that exposes one endpoint: /api/health. Our dev script in package.json is set to run the fastify-cli to start our server on localhost port 8000, which is good enough for now.

We start up our server:

```
~/project$ npm run dev
```

Then, in another terminal window, we use curl to hit the endpoint:

```
~$ curl http://localhost:8000/api/health
{"status":"ok"}
```

## Add a simulated long-running process

We're off to a good start. Next, let's add another route to simulate a long-running process. This will help us gather some latency data.

In routes.js, we add another route handler within our exported default async function:

```js
fastify.get("/api/user-data", async (_, reply) => {
  await sleep(5000);
  const userData = readData();
  return reply.send({ data: userData });
});
```

This exposes another endpoint: /api/user-data. Here, we have a method that simulates reading a lot of data from a database (readData) and a long-running process (sleep). We define those methods in routes.js as well. They look like this:

```js
import fs from "fs";

function readData() {
  try {
    const data = fs.readFileSync("data.txt", "utf8");
    return data;
  } catch (err) {
    console.error(err);
  }
}

function sleep(ms) {
  return new Promise((resolve) => {
    setTimeout(resolve, ms);
  });
}
```

With our new route in place, we restart our server (npm run dev).

## Measure latency with curl

How do we measure latency? The simplest way is to use curl, which captures various time profiling metrics when it makes requests. We just need to format curl's output so that we can easily see the various latency values available. To do this, we define the output we want to see in a text file (curl-format.txt):

```
time_namelookup:    %{time_namelookup}
time_connect:       %{time_connect}
time_appconnect:    %{time_appconnect}
time_pretransfer:   %{time_pretransfer}
time_redirect:      %{time_redirect}
time_starttransfer: %{time_starttransfer}
------------------- ----------
time_total:         %{time_total}
```

With our output format defined, we can use it in our next curl call:

```
curl -w "@curl-format.txt" \
  -o /dev/null -s \
  "http://localhost:8000/api/user-data"
```

The response we receive looks like this:

```
time_namelookup:    0.000028s
time_connect:       0.000692s
time_appconnect:    0.000000s
time_pretransfer:   0.000772s
time_redirect:      0.000000s
time_starttransfer: 5.055683s
------------------- ----------
time_total:         5.058479s
```

Well, that's not good. Over five seconds is way too long for a transfer time (the time it takes the server to actually handle the request). Imagine if this endpoint were being hit hundreds or thousands of times per second! Your users would be frustrated, and your server might crash under the weight of continually redoing this work.

## Redis to the rescue!

Caching your responses is the first line of defense for reducing transfer time (assuming you've addressed any of the poor programming practices that might be causing the latency!). So, let's assume we've done everything we can to reduce latency, but our application still needs five seconds to put this complex data together and return it to the user.

In our scenario, because the data is the same every time for every request to /api/user-data, we have a perfect candidate for caching. With caching, we'll perform the necessary computation once, cache the result, and return the cached value for all subsequent requests.

Redis is a performant, in-memory key/value store, and it's a common tool for caching. To leverage it, we first install Redis on our local machine. Then, we add Fastify's Redis plugin to our project:

```
~/project$ npm i @fastify/redis
```

## Register the Redis plugin with Fastify

We create a file, redis.js, which configures our Redis plugin and registers it with Fastify. Our file looks like this:

```js
// redis.js
const REDIS_URL = process.env.REDIS_URL || "redis://127.0.0.1:6379";

import fp from "fastify-plugin";
import redis from "@fastify/redis";

const parseRedisUrl = (redisUrl) => {
  const url = new URL(redisUrl);
  const password = url.password;
  return {
    host: url.hostname,
    port: url.port,
    password,
  };
};

export default fp(async (fastify) => {
  fastify.register(redis, parseRedisUrl(REDIS_URL));
});
```

Most of the lines in this file are dedicated to parsing a REDIS_URL value into a host, port, and password. If we have REDIS_URL set properly at runtime as an environment variable, then registering Redis with Fastify is simple.
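To make the parsing concrete, here's a quick sketch of what parseRedisUrl returns for a hypothetical Heroku-style URL (the host, port, and password below are made up for illustration; note that url.port comes back as a string):

```js
// Hypothetical input, shaped like the REDIS_URL values Heroku provides
parseRedisUrl("redis://:s3cretpassword@ec2-1-2-3-4.compute-1.amazonaws.com:15939");
// => {
//      host: "ec2-1-2-3-4.compute-1.amazonaws.com",
//      port: "15939",
//      password: "s3cretpassword",
//    }
```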
After configuring our plugin, we just need to modify app.js to use it:

```js
// app.js
import redis from "./redis.js";
import routes from "./routes.js";

export default async (fastify, opts) => {
  fastify.register(redis);
  fastify.register(routes);
};
```

Now we have access to our Redis instance by referencing fastify.redis anywhere within our app.

## Modify our endpoint to use caching

With Redis in the mix, let's change our /api/user-data endpoint to use caching:

```js
fastify.get("/api/user-data", async (_, reply) => {
  const { redis } = fastify;

  // check if the data is in the cache
  const cached = await redis.get("user-data");
  if (cached) {
    return reply.send({ data: cached });
  }

  // simulate a long-running task
  await sleep(5000);
  const userData = readData();

  // add the data to the cache
  await redis.set("user-data", userData);

  return reply.send({ data: userData });
});
```

Here, you see that we've hardcoded a single key in Redis, user-data, and stored our data under that key. Of course, our key could be a user ID or some other value that identifies a particular type of request or state.
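For example, a per-user variant of this route might look something like the following sketch. The :userId parameter and the key naming scheme are hypothetical, not part of our demo app:

```js
// Hypothetical sketch: scope the cache key to a route parameter
fastify.get("/api/user-data/:userId", async (request, reply) => {
  const { redis } = fastify;
  const cacheKey = `user-data:${request.params.userId}`;

  const cached = await redis.get(cacheKey);
  if (cached) {
    return reply.send({ data: cached });
  }

  // ...do the expensive per-user work, then cache it under the scoped key
  const userData = readData();
  await redis.set(cacheKey, userData);
  return reply.send({ data: userData });
});
```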
Also, we could set a timeout value to expire our key, in case we expect the data to change after a certain window of time.

If there is data in the cache, then we return it and skip all the time-consuming work. Otherwise, we do the long-running computation, add the result to the cache, and then return it to the user.

What do our transfer times look like after hitting this endpoint two more times (the first request to add the data to the cache, and the second to retrieve it)?

```
time_namelookup:    0.000023s
time_connect:       0.000560s
time_appconnect:    0.000000s
time_pretransfer:   0.000729s
time_redirect:      0.000000s
time_starttransfer: 0.044512s
------------------- ----------
time_total:         0.047479s
```

Much better! We've reduced our request times from several seconds to milliseconds. That's a huge improvement in performance!

Redis has many more features that may be useful here, including having key/value pairs time out after a certain amount of time; that's a more common scenario in production environments.
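Since @fastify/redis is built on ioredis, adding an expiration is a one-line change to our set call. As a sketch, assuming a 60-second time-to-live is acceptable for this data:

```js
// Cache the result, but let Redis evict it automatically after 60 seconds
await redis.set("user-data", userData, "EX", 60);
```

After 60 seconds, the key disappears and the next request repopulates the cache with fresh data.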
## Using Redis in your Heroku deployment

Up to this point, we've only shown how this works in a local environment. Now, let's go one step further and deploy it all to the cloud. Fortunately, Heroku provides many options for deploying web applications and working with Redis. Let's walk through how to get set up there.

After signing up for a Heroku account and installing their CLI tool, we're ready to create a new app. In our case, we'll call our app fastify-with-caching. Here are our steps:

### Step 1: Log in to Heroku

```
~/projects$ heroku login
...
Logging in... done
```

### Step 2: Create the Heroku app

When we create our Heroku app, we get back our Heroku app URL. We take note of it because we'll use it in our subsequent curl requests.

```
~/project$ heroku create -a fastify-with-caching
Creating ⬢ fastify-with-caching... done
https://fastify-with-caching-3e247d11f4ad.herokuapp.com/ | https://git.heroku.com/fastify-with-caching.git
```

### Step 3: Add the Heroku Data for Redis add-on

We need to set up a Redis add-on that meets our application's needs. For our demo project, it's sufficient to create a Mini-tier Redis instance:

```
~/project$ heroku addons:create heroku-redis:mini -a fastify-with-caching
Creating heroku-redis:mini on ⬢ fastify-with-caching…
…
redis-transparent-98258 is being created in the background.
…
```

Spinning up the Redis instance may take two or three minutes. We can check the status of our instance periodically:

```
~/project$ heroku addons:info redis-transparent-98258
...
State: creating
```

Not too long after, we see this:

```
State: created
```

We're just about ready to go! When Heroku spins up our Redis add-on, it also adds our Redis credentials as config variables attached to our Heroku app. We can run the following command to see these config variables:

```
~/project$ heroku config -a fastify-with-caching
=== fastify-with-caching Config Vars
REDIS_TLS_URL: rediss://:p171d98f7696ab7eb2319f7b78083af749a0d0bb37622fc420e6c1205d8c4579c@ec2-18-213-142-76.compute-1.amazonaws.com:15940
REDIS_URL: redis://:p171d98f7696ab7eb2319f7b78083af749a0d0bb37622fc420e6c1205d8c4579c@ec2-18-213-142-76.compute-1.amazonaws.com:15939
```

(Your credentials, of course, will be unique and different from what you see above.)

Notice that we have a REDIS_URL variable all set up for us. It's a good thing our redis.js file is coded to properly parse an environment variable called REDIS_URL.

### Step 4: Create a Heroku remote

Finally, we need to create a Heroku remote in our git repo so that we can easily deploy with git.

```
~/project$ heroku git:remote -a fastify-with-caching
set git remote heroku to https://git.heroku.com/fastify-with-caching.git
```

### Step 5: Deploy!

Now, when we push our branch to our Heroku remote, Heroku will build and deploy our application.

```
~/project$ git push heroku main
...
remote: Building source:
remote:
remote: -----> Building on the Heroku-22 stack
remote: -----> Determining which buildpack to use for this app
remote: -----> Node.js app detected
remote:
remote: -----> Creating runtime environment
...
remote: -----> Compressing...
remote:        Done: 50.8M
remote: -----> Launching...
remote:        Released v4
remote:        https://fastify-with-caching-3e247d11f4ad.herokuapp.com/ deployed to Heroku
remote:
remote: Verifying deploy... done.
```

Our application is up and running. It's time to test it.

## Test our deployed application

We start with a basic curl request to our /api/health endpoint:

```
$ curl https://fastify-with-caching-3e247d11f4ad.herokuapp.com/api/health
{"status":"ok"}
```

Excellent. That looks promising.

Next, let's send our first request to the long-running process and capture the latency metrics:

```
$ curl \
  -w "@curl-format.txt" \
  -o /dev/null -s \
  https://fastify-with-caching-3e247d11f4ad.herokuapp.com/api/user-data

time_namelookup:    0.035958s
time_connect:       0.101336s
time_appconnect:    0.249308s
time_pretransfer:   0.249389s
time_redirect:      0.000000s
time_starttransfer: 5.384986s
------------------- ----------
time_total:         6.554382s
```

When we send the same request a second time, here's the result:

```
$ curl \
  -w "@curl-format.txt" \
  -o /dev/null -s \
  https://fastify-with-caching-3e247d11f4ad.herokuapp.com/api/user-data

time_namelookup:    0.025807s
time_connect:       0.091763s
time_appconnect:    0.236050s
time_pretransfer:   0.236119s
time_redirect:      0.000000s
time_starttransfer: 0.334859s
------------------- ----------
time_total:         1.276264s
```

Much better! Caching allows us to bypass the long-running process entirely.

From here, we can build out a much more robust caching mechanism for our application across all our routes and processes. And we can continue to lean on Heroku and its Redis add-on when we need to deploy our application to the cloud.

## Bonus Tip: Clearing the cache for future tests

By the way, if you want to test this more than once, you may occasionally need to delete the user-data key/value pair in Redis. You can use the Heroku CLI to access the Redis CLI for your Redis instance:

```
~$ heroku redis:cli -a fastify-with-caching
Connecting to redis-transparent-98258 (REDIS_TLS_URL, REDIS_URL):
ec2-18-213-142-76.compute-1.amazonaws.com:15940> DEL user-data
1
```

## Conclusion

In this tutorial, we explored how caching can greatly improve your web service's response time in cases where identical requests would produce identical responses. We looked at how to implement this with Redis, the industry-standard caching tool. We did all of this with ease within a Node.js application that leverages the Fastify framework. Lastly, we deployed our demo application to Heroku, using its built-in Heroku Data for Redis instance management to cache in the cloud.