This is part 1 of a three-part series of articles where I hope to share some of the experiences I’ve had developing for the web, and how they can help make a faster web experience for users. In this part, the focus is on the front-end. (P.S. I am a back-end developer, so I may miss some things here and there.)
The web development scene has gone through a lot of changes in how web applications and sites get developed and deployed. The reason for most of these changes is one thing: speed! Speed of development, speed of deployment, speed of execution. This is because the web has, for a long time, lagged behind native platforms, and slow network connections were not helping either. But all of that has been changing and continues to change. Device specs are evolving, and new techniques and technologies are being developed to solve these problems.
Here is a non-exhaustive list of things you could do on the front-end/client-side to make your web app/site faster. Enjoy!
I’ve had the privilege of working with some of the greatest front-end developers I know at The Devshop. One of them, Freeman, always says “JPGs are for crazy people”. He is of the mind that image backgrounds are not only ugly but also make a site slower to load. Instead, he advises using something like a CSS gradient, which is obviously faster to load than an image, and that makes a lot of sense. A perfect example of this is the design of Dialogue, a social media app we developed to beam tweets with a particular keyword or hashtag.
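To make the idea concrete, here is a small sketch of the trade-off; the class names and colours are made up for illustration:

```css
/* A gradient background is a one-line rule, delivered inside
   the stylesheet the browser is already downloading */
.hero {
  background: linear-gradient(135deg, #1e3c72, #2a5298);
}

/* The image version adds a whole extra network request, and the
   background stays blank until the JPG arrives */
.hero--image {
  background: url("/img/hero.jpg") center / cover no-repeat;
}
```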
Designing with speed in mind means the developer is cautious about what the user’s browser has to download before the user can actually see or use the app/site. Remember, the user may be on a slow or flaky network using metered mobile data, so the price they have to pay to get value out of your app/site must always be low. This doesn’t only help on the user’s side: it also means the browser downloads fewer assets, has fewer render-blocking operations to perform, and can deliver that blazing-fast First Paint.
Browsers handle HTML and CSS progressively, meaning that they render elements as they arrive. For instance, when the body element is parsed, the browser wants to apply styles to it immediately, if it already has those styles. If not, it waits until it has them, which usually means downloading an external stylesheet. This is, in essence, what is meant by render-blocking resources. It is why inlining critical styles is advised, as it reduces the number of times the browser has to wait for styles before painting elements. It is also why stylesheets are loaded at the top, in the <head>, so that the browser already has all it needs to do the painting.
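A minimal sketch of that idea: inline the critical above-the-fold styles so the first paint doesn’t depend on any network request, while the full stylesheet loads normally (the selectors and file name here are assumptions):

```html
<head>
  <!-- Critical styles inlined: the browser can paint the shell
       of the page without waiting for any download -->
  <style>
    body { margin: 0; font-family: sans-serif; }
    .header { background: #222; color: #fff; }
  </style>
  <!-- The full stylesheet still loads, but first paint
       no longer blocks on it -->
  <link rel="stylesheet" href="/css/main.css">
</head>
```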
This involves spreading the assets of the website across different domains and sub-domains. It helps because a browser can only establish a limited number of simultaneous connections (typically six) to the same domain. Spreading the assets therefore means the browser can download more of them at the same time.
This is where CDNs shine. These Content Delivery Networks ensure high availability world-wide by keeping copies of your files hosted in multiple regions across the globe. Not only do they serve your files faster, they also serve them from a domain that is not your website’s domain, which means the browser saves one more connection to your website’s server.
Even if you don’t use a CDN, you can spread your assets onto sub-domains of your website. The browser treats sub-domains as different domains altogether, compared to your top-level domain, because sub-domains need their own DNS lookup. I hear you saying, “Won’t that mean the browser has more DNS lookups to do?” Well, yes, but not really.
Modern browsers implement something called DNS Prefetching. The definition from the W3C Spec says
“The dns-prefetch link relation type is used to indicate an origin that will be used to fetch required resources, and that the user agent should resolve as early as possible.”
Essentially, the browser will pre-resolve a domain before it is actually required to do so. This is done by adding the following <link> elements in the <head> of the HTML:
<link rel="dns-prefetch" href="https://subdomain.mydomain.com">
<link rel="dns-prefetch" href="https://cdn.com">
So you can do this for your sub-domains and CDN domains to improve the perceived latency by the user.
For images, I highly recommend Cloudinary which is an end-to-end media management platform in the cloud. Offload the burden of image/video hosting to them, they do quite a good job at serving your media across their global CDNs.
These two are sometimes confused for the same thing (I once thought so too), but in fact they are not. However, they work so well together to achieve great strides in reducing file size and ultimately speeding up websites.
Minification is the process of stripping out white space, comments, and unnecessary semi-colons in a file: things that make the code readable to the developer but are not useful to the browser. Most minifiers will also shorten variable names where it is safe to do so.
Gzipping then compresses the minified file. Gzip works by finding repeated strings within the file and replacing each repeat with a short back-reference to an earlier occurrence. The replacing text is usually smaller, i.e. fewer characters than the text it replaces, and this is how Gzipping reduces the file size.
To illustrate this, here’s an example from CSS-tricks.com.
So this means your build pipeline needs just one more step, where you minify and gzip your assets, perhaps before you push them to a CDN. Some servers can gzip assets for you, but that’s a story for Part 3 of this series.
If you follow most of my posts or know me, you’d know that I am a huge fan of PWAs. Progressive Web Apps are the web’s successful attempt at sitting at the native-boys’ table and claiming a spot on the user’s home screen. This technology allows a web app/site’s icon to be saved on the user’s home screen. This obviously means that the app/site must then be available on demand, even when the user’s device is offline! Like, no more dinosaur!
Beyond offline use, service workers are used to significantly speed up subsequent loads of a web app/site. This is done by implementing one of the caching strategies described here depending on the nature/priority of the files to be cached.
There are a ton of tutorials out there on how to write your own service worker but I highly recommend just using WorkboxJs. This is a library that handles generating the service worker for you, even at build time. This generated service worker is based on the configuration you give to WorkboxJs. I wrote about it in this post.
It’s 2018! The year the main thread becomes the UI thread. Web workers are perfect for offloading any potentially compute-heavy task. Much like service workers, they run in their own thread, separate from the main thread, though they have a spec of their own. That makes them very useful for heavy work like applying filters to images or searching a large data set.
I recently had a problem with one of the apps I was building for my small startup, frello. I noticed reduced performance on mobile when searching through a data table. To solve this, I moved the search function into a web worker. Essentially, on keyup, the data array and search keyword are passed to the worker, which does the searching based on the keyword. The results are passed back to the main thread using postMessage() and displayed in the table. A lot of jank was removed, and there were visible performance gains: the keyboard no longer froze while typing.
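A sketch of such a worker; the data shape and field names are assumptions, not frello’s actual code:

```javascript
// search-worker.js — filtering happens off the main thread,
// so typing in the search box stays smooth
function search(rows, keyword) {
  const needle = keyword.toLowerCase();
  // Keep any row whose string fields contain the keyword
  return rows.filter((row) =>
    Object.values(row).some(
      (value) =>
        typeof value === "string" && value.toLowerCase().includes(needle)
    )
  );
}

// Inside the worker: receive { rows, keyword }, post results back
if (typeof self !== "undefined" && typeof importScripts === "function") {
  self.onmessage = (event) => {
    const { rows, keyword } = event.data;
    self.postMessage(search(rows, keyword));
  };
}
```

On the main thread, `new Worker("search-worker.js")`, a `worker.postMessage({ rows, keyword })` on keyup, and a `worker.onmessage` handler that renders the results complete the loop.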
Another route I’d like to explore is making both threads share access to the same data store so that, on keyup, only the search phrase is passed. I’m sure there would be more performance gains from doing that.
Not convinced? Check out greenlet by Jason Miller, a small script that moves any async code onto a worker. Tasks like retrieving data from a remote service can then be done via a fetch() inside a web worker. At first I thought this whole craze was insane, but come to think of it, it does make a lot of sense.
Code-splitting breaks your JavaScript bundle into smaller chunks that are only loaded when they are needed. The asynchronous loading of these chunks is made possible by the ES6 dynamic import() function that is not a function (don’t ask me why; this might help you though). import() loads modules asynchronously, in a non-blocking way.
This does mean, however, that the browser has to asynchronously load the chunks on every page load. But if you have a service worker installed and you’re pre-caching all your assets correctly, this asynchronous load may only happen the first time (or never), because the chunk gets stored in the cache.
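A minimal sketch of loading a chunk on demand; the module name is hypothetical, and in a real app a bundler turns each import() target into its own chunk:

```javascript
// import() looks like a function call but is really syntax; it
// returns a promise for the module's namespace object and never
// blocks the main thread while the chunk downloads.
function lazyImport(specifier) {
  return import(specifier);
}

// e.g. only fetch the heavy chart code when the user asks for it:
// chartButton.addEventListener("click", async () => {
//   const { drawChart } = await lazyImport("./chart.js");
//   drawChart(data);
// });
```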
Feel free to share your opinions, additions, suggestions etc. Thank you ❤️.