It goes without saying that performance on the web is important. Our apps and sites should load fast to keep users’ attention, keep them engaged and deliver a positive experience.
Development, however, is usually done on above-average machines connected to a strong network. Not all users, though, access the web from a powerful device or on a strong signal.
What we want is to never let users feel that our app is loading. Even if we can’t serve them everything at once, we want to give them enough feedback to keep them engaged. To make sure that our web app is performant we need to program for the worst case, not the best.
There’s a TL;DR at the end of the article if you want only the main points and lessons learned.
There are two costs that we need to take into account when we talk about performance. The first is the cost of sending the code to the user’s browser: the smaller the files we send, the faster the browser receives them. The second is the cost of parsing and executing that code once it arrives.
To reason about performance, we first need to understand the two concepts of laziness.
First — if we don’t do something we won’t have to waste resources on it. This means that we needn’t waste time and memory on processes that won’t benefit the user’s experience of the product.
The second concept is that if we can do something later, we should never do it now. Translated, this means that if we can postpone sending particular resources to the browser, we always should.
The examples in this article will be based on my experience with React and Webpack but they can be applied to other technologies as well.
We keep sending more and more code to our users, and shipping one giant bundle containing the whole app is not ideal when we are chasing performance.
The idea behind this technique is to give users only what they need at any given time. If they open a particular page, they only need the code for that page, not the whole app.
By using smaller bundles we can act more lazily and send only the bare minimum needed to give users a positive experience, so they don’t wander off wondering whether they’ve turned the oven off. Once they’re engaged with the application we can preload other bundles in the background.
You can actually see the amount of unused code you send over from the Chrome Dev Tools. Once you open them, press Cmd + Shift + P, type coverage, pick the first option from the dropdown and press the reload icon.
The way we signal to Webpack what we want to split into a separate bundle is by using dynamic imports. The import keyword can be used as a function which takes the path to the module we want split into a separate bundle and returns a promise.
When the module is loaded and the promise resolves, we get access to what it exports. It’s important to note that if your module has a default export, you need to read the default property of the module object in order to access it.
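As a minimal sketch, assuming a hypothetical ./utils/formatDate.js module with both a default and a named export, a dynamic import looks like this:

```javascript
// Webpack sees this call and puts ./utils/formatDate.js into its own chunk.
import('./utils/formatDate.js').then((module) => {
  // Named exports live directly on the module object...
  const { parseDate } = module;

  // ...but a default export must be read from the `default` property.
  const formatDate = module.default;

  console.log(formatDate(parseDate('2019-01-01')));
});
```

The module path and export names here are illustrative; any module in your codebase works the same way.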
In the context of React, the modules we will be splitting are the different components. For that purpose we can use React Loadable, which gives us a higher order component to do the dynamic import.
It’s important to note here that libraries such as React Loadable or Loadable Components just provide us with a higher order component to make the dynamic import feel more graceful. The magic under the hood is all done by Webpack.
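A sketch of the React Loadable wrapper, assuming a hypothetical Editor component and a Spinner loading indicator:

```javascript
import Loadable from 'react-loadable';
import Spinner from './Spinner'; // hypothetical loading indicator component

// Webpack splits Editor into its own chunk; React Loadable renders
// <Spinner /> until that chunk has been fetched and evaluated.
const LoadableEditor = Loadable({
  loader: () => import('./Editor'),
  loading: Spinner,
});

// Used like any other component:
// <LoadableEditor value={text} />
```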
Also, bear in mind that React 17 will bring some changes to the ecosystem that may leave us without the need for such wrapper components.
One of the most common paradigms in code splitting is to split at the route level. This leaves us with a separate bundle for each top-level route.
The improvement we get here is that users only have to load the resources for the page they visit.
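Route-level splitting can be sketched as follows, assuming react-router (v4/v5-style API) and hypothetical Home and Profile page components:

```javascript
import React from 'react';
import { BrowserRouter, Route, Switch } from 'react-router-dom';
import Loadable from 'react-loadable';

const Loading = () => <div>Loading…</div>;

// One chunk per top-level route.
const Home = Loadable({ loader: () => import('./pages/Home'), loading: Loading });
const Profile = Loadable({ loader: () => import('./pages/Profile'), loading: Loading });

const App = () => (
  <BrowserRouter>
    <Switch>
      <Route exact path="/" component={Home} />
      <Route path="/profile" component={Profile} />
    </Switch>
  </BrowserRouter>
);
```

Visiting / downloads only the Home chunk; the Profile chunk is fetched when the user navigates to /profile.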
While this is an excellent way to start splitting the codebase into smaller chunks, often the biggest performance hit comes from a single library or component.
There are ways to handle the situation gracefully. We will look into how we can load such information in the background while the user’s browser is idle a bit later in the article.
Since import can act as a function, the natural train of thought is that we can pass variables to it and request different chunks on the fly.
While this is not completely wrong, it is not entirely right either. The chunks must be created at build time, so when Webpack sees that we’re using a variable in our import path it will do some homework first.
Whenever we do something like this, Webpack will go into the themes folder and create a separate chunk for each file there. So no matter what we request from it, there will already be a bundle with that name ready for us to use.
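A sketch of such a variable import, assuming a hypothetical themes folder with one file per theme:

```javascript
// Webpack can't know the value of `name` at build time, so it creates
// a chunk for every module matching ./themes/*.js ahead of time.
function loadTheme(name) {
  return import(`./themes/${name}.js`).then((module) => module.default);
}

// Requesting 'dark' fetches the pre-built chunk for ./themes/dark.js.
loadTheme('dark').then((theme) => {
  document.body.classList.add(theme.className); // hypothetical theme shape
});
```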
Webpack keeps track of chunks by giving each one an id, so when you fetch a dynamically loaded bundle you will most likely see a file with a name similar to 1.bundle.js in the Developer Tools.
However, by using Webpack’s magic comments and making a small change in the configuration we can give the different chunks more descriptive names.
To be fair, I’m not such a big fan of the magic comments, but I suppose it’s the clearest way to handle naming. You can use magic comments with React Loadable, but I’ve left only the import statement for clarity.
Adding this line in your Webpack configuration will instruct it to use the name you’ve specified and add a chunk hash for caching purposes.
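A sketch of both pieces, assuming a hypothetical Settings page; the configuration line in question is the chunkFilename option:

```javascript
// In application code — name the chunk with a magic comment:
import(/* webpackChunkName: "settings" */ './pages/Settings');

// webpack.config.js — use that name plus a content hash for caching:
module.exports = {
  output: {
    chunkFilename: '[name].[chunkhash].js',
  },
};
```

With this in place the Developer Tools show settings.<hash>.js instead of 1.bundle.js.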
Preloading and prefetching are two techniques that we can use in addition to code splitting to further improve our performance.
Before going into the technical details and implementation of yet another concept, we need to understand why we need this in the first place.
Earlier in the article we discussed the importance of component level splitting when some of them use expensive libraries. However, loading the chunk when the user presses a button is still not ideal, for it could lead to flashes of empty content or a short freeze of the UI before what we need is rendered.
Whenever we are confident that the user will need a particular bundle, we can use preloading or prefetching to pull it before it is explicitly required. This is again achieved with a Webpack magic comment.
What’s the difference between the two and when to use one over the other? While both preloading and prefetching will fetch a chunk before it is actually required, they will do it with a different level of importance.
Preloaded chunks will be loaded with higher priority, in parallel to their parent chunk. Mark chunks to be preloaded only if you are confident that the user will interact with them immediately. This can be a dropdown or the contents of a tab.
Prefetched chunks have lower priority and will be loaded in the browser’s idle time. In other words, mark chunks to be prefetched if the user may need them at some point. This can be the next page they are most likely to visit. They won’t request it immediately, but you want it to be there when they do.
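Both are expressed with a magic comment; the module paths below are hypothetical:

```javascript
// Preload: fetched in parallel with the parent chunk, high priority.
// Use when the user will need it immediately (e.g. a dropdown's contents).
import(/* webpackPreload: true */ './DropdownContents');

// Prefetch: fetched during the browser's idle time, low priority.
// Use when the user will likely need it soon (e.g. the next page).
import(/* webpackPrefetch: true */ './pages/Checkout');
```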
React Loadable provides components with a static preload method, which gives you manual control over when to fetch the component’s bundle. This is exceptionally useful when we want to be lazy compliant: we can wait until we are absolutely sure that the user will need an expensive piece of functionality before we preload it.
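For example, assuming a hypothetical expensive Chart component, we could start fetching its chunk on hover, before the user actually clicks:

```javascript
import React from 'react';
import Loadable from 'react-loadable';

const Loading = () => <div>Loading…</div>;

const LoadableChart = Loadable({
  loader: () => import('./Chart'),
  loading: Loading,
});

// Hovering the button is a strong signal of intent, so we preload
// the chunk then — by click time it is likely already in the cache.
const ChartButton = ({ onClick }) => (
  <button onMouseOver={() => LoadableChart.preload()} onClick={onClick}>
    Show chart
  </button>
);
```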
Webpack’s way to configure the chunks we’re loading is through the so-called magic comments. To be honest, I haven’t really had to use more than the basic ones we’ve seen so far. However, it’s good to know that we are not limited and have more options if we need them.
The webpackMode comment is used to tell Webpack how to resolve the given chunk. By default all chunks are loaded in lazy mode, which means a separate chunk is created for each of them.
The other options are lazy-once, eager and weak. All in all, I haven’t had to use them so I can’t really comment on them; the problems I’ve faced have so far been solvable with the default lazy mode. You can read more in Webpack’s docs.
Most times you will want to specify a chunk name and mark the chunk to be prefetched or preloaded. Thankfully we can use as many as we need by separating them with commas.
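Combining them looks like this, with a hypothetical Reports page as the example:

```javascript
// Multiple magic comments in one import call, separated by commas:
// the chunk is both named "reports" and marked for prefetching.
import(/* webpackChunkName: "reports", webpackPrefetch: true */ './pages/Reports');
```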
We can’t really talk about performance unless there is something we can measure. In order to know where to improve we need to have some insights. In the context of code splitting we need to be aware of how big our bundles are and what is actually in them.
Here, Webpack Bundle Analyzer is your biggest friend. It provides you with a visual representation of your application’s chunks so you can see what goes into each one.
You can see how much space each module takes and what chunk it was put in. It’s also an excellent way to find unneeded code. For example, if you see that you’ve added the whole lodash library you should probably examine your code and pull only the functions that you need.
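Wiring it up is a small config change, assuming the webpack-bundle-analyzer package is installed as a dev dependency:

```javascript
// webpack.config.js
const { BundleAnalyzerPlugin } = require('webpack-bundle-analyzer');

module.exports = {
  plugins: [
    // Opens an interactive treemap of every chunk and the modules inside it.
    new BundleAnalyzerPlugin(),
  ],
};
```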
When it comes to performance improvements, I strongly believe that code splitting is the technique that will bring you the most benefits.
A common worry when it comes to adopting such concepts is how future proof they are. In other words, what is the chance of code splitting falling out of favor, leaving you with a whole codebase to revise?
This is a worthy consideration, bearing in mind that until recently the best practice was to bundle everything together in a single file. However, I’d argue that introducing code splitting is one of the best decisions if your team plans to keep up with technology.
With HTTP/2, sending multiple files at once will no longer be an issue. The main reason to bundle everything together was to avoid sending multiple requests.
Moving forward, by using HTTP/2 Push, we will be able to send multiple files at once without any additional overhead.
Hey, I’m running a small newsletter in which I share random thoughts, musings and insights about software development. No tutorials, no ads. Just some things I’ve found worthy of pondering, delivered to your inbox every few weeks. If this sounds appealing to you, you can subscribe HERE.
If the content of this article was helpful to you I’d appreciate if you hold down the clap button for a bit. This way it will reach and help more people. Share it with friends and colleagues that may find it useful and send any feedback my way!