Kat Busch

@katbusch

So your website is slow? Let’s fix that.

A lot of websites are slow.

Many of us web developers don’t even know our websites are slow, because we’re on fast internet with great hardware and we live close to our servers.

But what if you have users on the opposite side of the globe from your server? What if their bandwidth is minuscule? What if they’re on a phone? It could take them ages to load your site. And users’ slow CPUs might spend an eternity computing the next animation frame.

What’s an engineer to do? Here I’ll introduce how to think about web performance and how to go about identifying and fixing web performance problems. This article will familiarize you with web performance concepts. I’ll point you in the right direction so you can figure out where to get started and where to invest your time when you’re tackling your website’s performance.

What is a page load?

To understand why websites can be slow, let’s go through the stages of everything that goes into loading a web page. This article focuses on page load speed, not how responsive your website is once it’s done loading, but some of the same tools apply to both stages.

Getting connected

Before you can start loading any website, you’ve got to open a connection between the browser and the target site. This includes a DNS lookup to find your website’s IP address, a TCP handshake to establish a connection, and an SSL handshake to set up encryption (I hope!). That means you’ve already got several round trips to and from your servers before the user even begins to load your content. If your servers are on the other side of the planet from your client, each round trip is going to be over 100 milliseconds. You can’t beat the speed of light! If you’re aiming for a page load of under a second, the three round trips needed to initiate a secure connection have already eaten almost a third of your budget.
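As a back-of-the-envelope sketch (the helper name and default round-trip counts are mine, following the handshake steps above), you can estimate what connection setup alone costs:

```javascript
// Rough estimate of connection setup cost before any content loads.
// Defaults assume one round trip each for DNS, the TCP handshake, and
// a TLS 1.3 handshake; older TLS versions need two for the handshake.
function connectionSetupMs(rttMs, { dns = 1, tcp = 1, tls = 1 } = {}) {
  return rttMs * (dns + tcp + tls);
}

// A client on the other side of the planet, ~100 ms per round trip:
connectionSetupMs(100);             // 300 ms gone before any content
connectionSetupMs(100, { tls: 2 }); // 400 ms with an older TLS handshake
```

This ignores server think time and download entirely; it is purely the cost of getting a secure connection open.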

Note that there’s a quick fix here if your stack is up to date. TLS session resumption can cut SSL setup down to a single round trip on repeat visits, and TLS 1.3 does the same even for a first visit. Pair that with HTTP/2, which multiplexes all your requests over a single connection, and you save yourself even more round trips. That’s just one of many reasons to consider upgrading if you haven’t already.
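If you serve with nginx, for instance, enabling HTTP/2 on an existing TLS server block is a small change (a sketch; the hostname and certificate paths here are placeholders, not real values):

```nginx
server {
    listen 443 ssl http2;   # add "http2" to the existing ssl listener
    server_name example.com;

    ssl_certificate     /path/to/fullchain.pem;
    ssl_certificate_key /path/to/privkey.pem;
}
```

Other servers and CDNs have equivalent one-line switches; check what your stack supports.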

Server response

Okay, you’ve connected! Now your servers have to start doing some work to deliver bytes to the user. Depending on the design and nature of your service, response time can vary wildly. Is your website a bit of static content, or do you need to do all sorts of database lookups and computation to prepare a response? Do you compute the whole page and push it at once, or do you send a shell and push other content as it becomes available? Are you rendering React on the server? Your server response could take anywhere from a few milliseconds to hundreds of milliseconds or more.

Content download

So some sort of response has been prepared. Now clients have to download that response from your servers. Transferring a large response could take a while on a slow or flakey connection.

The browser can quickly begin to parse the response. As soon as it starts parsing, it’ll probably receive instructions to go and download a bunch more stuff: CSS, images, and JavaScript. At this point, many websites tap content delivery networks (CDNs) that are experts at delivering static content quickly around the globe.

And thus the great download of JavaScript, CSS, and images begins. Some large modern websites tend to measure their JavaScript alone in hundreds of kilobytes to megabytes, even after compression. If you have a 100 megabit connection, that will only take tens or hundreds of milliseconds. But if you’re on a 5 megabit connection (like some mobile networks), you’re looking at over a second of content download time, and if you’re on a slower or flakier network, this could take many seconds.
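The arithmetic behind those numbers is worth internalizing. A minimal sketch (the function is mine, and it ignores latency, packet loss, and TCP slow start, so real transfers will be slower):

```javascript
// Idealized time to transfer a payload at a sustained bandwidth.
// Bandwidth is in megabits per second; payload is in bytes.
function downloadSeconds(payloadBytes, bandwidthMbps) {
  const bits = payloadBytes * 8;
  return bits / (bandwidthMbps * 1_000_000);
}

downloadSeconds(500 * 1024, 100); // ~0.04 s for 500 KB at 100 Mbit
downloadSeconds(500 * 1024, 5);   // ~0.8 s on a 5 Mbit mobile link
```

Notice the hundred-millisecond-scale difference comes entirely from the user’s connection, not from anything your servers do.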

Parsing and execution

Luckily, the browser usually parallelizes this download step with parsing and execution. Once your CSS is in, rendering begins even if JavaScript is still loading. When any JavaScript comes down, the browser begins the rather expensive tasks of parsing and then executing it. Parsing is CPU intensive, and on a slow client it can add seconds to page load time, which is why browsers parse much of your JavaScript lazily.

Everything else

Your JavaScript might kick off requests to your server that download more data. Websites can have post-load pipelines to show you more stories on your newsfeed, more products on your store, load a new menu, download higher res images, etc. You might have some CPU-heavy JavaScript computations that need to be done while your user is interacting with your page, like if you have too much going on in a React render function. Maybe you offload some decoding to a Web Worker. Websites diverge a lot after the initial page load.

What to do about it

Wow! That’s a lot of stuff that goes on just to load a page! The good news is there’s a lot of great tools to help you dig in and understand your performance.

Profile, profile, profile

You might be tempted after reading this to go tear out a bunch of JavaScript code because your number of kilobytes of JavaScript seems high. Stop!

As with any performance problem, the first step is simply to profile. There’s no point putting your codebase through the wringer if it turns out performance isn’t a pressing problem for you.

You need to understand your performance so that you can decide whether it’s a problem. There are a lot of great tools for profiling web pages. On your own machine you can use the Chrome, Firefox and Edge profilers. These can give you an idea of how much time is spent in various stages of the page load, from network requests to JavaScript execution. Sometimes performance will vary wildly from one browser to the next because of implementation differences.

You can use these browser dev tools to emulate slower connections, so you can see what some of your users might be experiencing. Pretending to be on 2G is a fun one.

Make sure you test your page with static resources both uncached and cached.

You can also use tools like WebPageTest to see what page loads look like from different browsers and different places around the world.

Log, log, log

On top of that, strongly consider logging page load times for your users. The Navigation Timing API and Resource Timing API are your friends. Send the information from these APIs on your users’ machines to your servers and collect it for analysis.

But be careful when interpreting results: these APIs measure many different stages of your page load. Make sure you figure out which event actually lines up with when the user can see or interact with your page.
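A sketch of how you might group a navigation timing entry into the page-load stages discussed above (the phase names and grouping are my own; in the browser you’d get the entry from `performance.getEntriesByType('navigation')[0]` and ship the result to your servers, for example with `navigator.sendBeacon`):

```javascript
// Split a PerformanceNavigationTiming-shaped entry into stages.
// All fields are standard properties of the entry, in milliseconds
// relative to the start of navigation.
function loadPhases(entry) {
  return {
    dns: entry.domainLookupEnd - entry.domainLookupStart,
    connect: entry.connectEnd - entry.connectStart, // includes TLS setup
    serverResponse: entry.responseStart - entry.requestStart,
    contentDownload: entry.responseEnd - entry.responseStart,
    // domContentLoadedEventEnd is often closer to "usable" than load
    domContentLoaded: entry.domContentLoadedEventEnd,
    fullLoad: entry.loadEventEnd,
  };
}
```

Logging a breakdown like this, rather than one opaque total, is what lets you see whether users are losing time to connections, your servers, or the download itself.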

This user data will let you see the actual performance real users are experiencing and help you get a breakdown of where time is being spent. Logging will help you home in on problems by looking at which countries, browsers, and devices are suffering.

Some teams find it helpful to log performance metrics on every commit or run perf tests on every commit to identify degrading performance quickly. After your profiling, you probably have a good idea of which metrics you need to track and log for each commit: bytes of JavaScript, number of images, etc.
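A per-commit check can be as simple as comparing measured metrics against budgets and failing the build on regressions (a sketch; the metric names and budget values here are made-up examples, not recommendations):

```javascript
// Compare measured metrics against per-metric budgets.
// Returns a list of human-readable failures; empty means the commit passes.
function checkBudgets(metrics, budgets) {
  const failures = [];
  for (const [name, limit] of Object.entries(budgets)) {
    if (metrics[name] > limit) {
      failures.push(`${name}: ${metrics[name]} > budget ${limit}`);
    }
  }
  return failures;
}

checkBudgets(
  { jsBytes: 450_000, imageCount: 12 },
  { jsBytes: 400_000, imageCount: 20 },
); // => ["jsBytes: 450000 > budget 400000"]
```

Wiring this into CI means a commit that balloons your JavaScript bundle gets flagged the day it lands, not months later when users complain.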

Now for the fun part: make your page fly

If your investigations reveal that your website is slower than you want it to be in a way that you believe is impacting user experience, it’s time to get to work. Your profiling information should already have revealed where to get started.

There are lots of small changes that can speed things up: remove redirects, preconnect to the CDN, etc.

For many web pages, the main thing you can do to make your website faster is load less stuff. Google recommends you keep your page weight to 1 megabyte uncompressed.

Minify JavaScript, compress your content, turn on tree shaking. Use async and defer whenever possible to load scripts that aren’t needed right away. Remove unnecessary images. Load only the JavaScript you actually need. Remove code for inaccessible features. Redesign your page to load fewer images! Lots of huge images on your homepage might look really snazzy on your fast office internet, but what does it look like on a slower connection where they load pixel by pixel?

But be careful to keep the big picture in mind instead of just piling on hacks that make your code more complicated and will break in a few months. Upgrading to HTTP/2 won’t solve all your problems if you’re just loading too much data. On the other hand, a thoughtful rearchitecting that keeps your website’s performance stable over the long term might be the best investment of your time. For instance, you might make your page’s components load in parallel, use GraphQL to load only necessary data, or implement server-side rendering.

Eyes on the prize: user experience

But don’t miss the forest for the trees. Your end goal should always be focused on your users and improving their experience. Performance is just one aspect of the overall experience of your website. If you’re smart about your development process, you’ll make a page that’s not only snappy but also delightful to use.
