Make The Web Great Again: One end at a time. (Part 1— Frontend)

by Bakani Pilime, January 31st, 2018
This is part 1 of a three-part series of articles where I hope to share some of the experiences I’ve had developing for the web, and how they can help make a faster web experience for users. In this part, the focus is on the front-end. (P.S. I am a back-end developer, so I may miss some things here and there.)

A bit of background…

PS: I have no association with the public figure linked with this type of cap.

The web development scene has gone through a lot of changes in terms of how web applications and sites get developed and deployed. The reason for most of these changes is one thing: speed! Speed of development, speed of deployment, speed of execution. For a long time the web has lagged behind native platforms, and slow network connections were not helping either. But all of that has been changing and continues to change: device specs are evolving, and new techniques and technologies are being developed to solve these problems.

Here is a non-exhaustive list of things you can do on the front-end/client-side to make your web app/site faster. Enjoy!

1. Design with speed in mind

I’ve had the privilege of working with some of the greatest front-end developers I know at The Devshop. One of them, Freeman, always says “JPGs are for crazy people” [replace crazy people with a less blog-post-worthy name]. He is of the mind that image backgrounds are not only ugly but also make a site slower to load. Instead, he advises using something like a gradient, which is obviously faster to load than an image, and that makes a lot of sense. A perfect example of this is the design of Dialogue, a social media app we developed to beam tweets with a particular keyword or hashtag.
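As a rough sketch, swapping an image background for a gradient is a one-rule change (the class name and colors below are illustrative, not from Dialogue’s actual stylesheet):

```css
/* One CSS rule, a few bytes, zero extra requests */
.hero {
  background: linear-gradient(135deg, #1e3c72, #2a5298);
}

/* versus an image background: an extra HTTP request and
   many kilobytes before the hero area looks right */
.hero--image {
  background: url("hero-background.jpg") center / cover;
}
```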

Designing with speed in mind means the developer is cautious about what the user’s browser has to download before the user can actually see or use the app/site. Remember, the user may be on a slow or flaky network using metered mobile data, so the price they pay to get value out of your app/site must always be low. And this doesn’t only help the user: the browser also downloads fewer assets, has fewer render-blocking operations to perform, and delivers that blazing-fast First Paint.

2. Reduce Render-Blocking operations

Browsers handle HTML and CSS progressively, meaning they render elements as they arrive. For instance, when the body element is parsed, the browser wants to apply styles to it immediately, if it already has them. If not, it waits until it does, usually while an external stylesheet downloads. This is in essence what is meant by render-blocking resources. It is why inlining critical styles is advised, as it reduces the number of times the browser has to wait for styles before it can paint elements. It is also why stylesheets are loaded at the top, in the <head>, so that the browser already has all it needs to do the painting.
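A minimal sketch of the idea: inline the few rules the first paint needs, and load everything else as a normal stylesheet (file names and rules here are illustrative):

```html
<head>
  <!-- Critical, above-the-fold styles inlined: no download, no waiting -->
  <style>
    body { margin: 0; font-family: sans-serif; }
    .header { background: #2a5298; color: #fff; }
  </style>
  <!-- The full stylesheet still loads for everything below the fold -->
  <link rel="stylesheet" href="styles.css">
</head>
```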

Javascript execution is another common cause of blocked rendering. Whenever the browser comes across a <script> tag it must download the Javascript file (if it’s external), parse it, and then execute it. The browser waits for all of this to happen because Javascript can change elements, so it must be fully loaded and run before the next elements are rendered. This is why it is advised to load Javascript at the bottom of the HTML: by the time the script runs, the browser has rendered and painted all the elements. Interactivity may not be available yet, but at least the user is not staring at a blank page 😉.
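Besides placing scripts at the bottom, modern browsers also support the defer and async attributes on <script>, which let the parser keep rendering while the script downloads in parallel (a sketch; the file names are illustrative):

```html
<head>
  <!-- defer: download in parallel, execute in order after parsing finishes -->
  <script src="app.js" defer></script>
  <!-- async: download in parallel, execute as soon as it arrives -->
  <script src="analytics.js" async></script>
</head>
```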

(Sidebar: Chrome gets its speed from V8’s Just-In-Time compiler, which compiles Javascript into machine code, unlike an interpreter, which would produce bytecode or some other intermediate code. Read about that here.)

3. Parallel Asset Loading

This involves spreading the assets of the website across different domains and sub-domains. It helps because a browser can only establish a limited number of simultaneous connections to the same domain (typically 6 over HTTP/1.1). Spreading the assets means the browser can download more of them at the same time.

This is where CDNs shine. Content Delivery Networks ensure high availability world-wide by hosting copies of your files in multiple regions across the globe. Not only do they serve your files faster, but those files also come from a domain that is not your website’s, which saves one more connection to your website’s server.

Even if you don’t use a CDN, you can spread your assets across sub-domains of your website. The browser treats sub-domains as different domains altogether compared to your top-level domain, because each sub-domain needs its own DNS lookup. I hear you saying, “Won’t that mean the browser has more DNS lookups to do?” Well, yes, but not really.

Modern browsers implement something called DNS Prefetching. The definition from the W3C Spec says

“The dns-prefetch link relation type is used to indicate an origin that will be used to fetch required resources, and that the user agent should resolve as early as possible.”

Essentially the browser would preresolve a domain before it is actually required to do so. This is done by adding the following <link> in the <head> of the HTML.


<link rel="dns-prefetch" href="https://subdomain.mydomain.com">
<link rel="dns-prefetch" href="https://cdn.com">

So you can do this for your sub-domains and CDN domains to improve the perceived latency by the user.

For Javascript, I’d recommend jsDelivr, an open-source CDN that is free, fast and reliable. I really like it because it can serve scripts from npm and version and minify them on the fly. So to load the Javascript SDK on npm for the news API I once wrote about here, simply use the following script tag.

<script type="text/javascript" src="https://cdn.jsdelivr.net/npm/[email protected]/index.min.js"></script>

For images, I highly recommend Cloudinary which is an end-to-end media management platform in the cloud. Offload the burden of image/video hosting to them, they do quite a good job at serving your media across their global CDNs.

4. Gzipping & Minification


These two are sometimes confused for each other (I once thought they were the same too), but in fact they are not. They do, however, work so well together that combined they achieve great strides in reducing file size and ultimately speeding up websites.

Minification is the process of removing the white space, comments and unnecessary semi-colons that make code readable to the developer but are of no use to the browser.


Gzipping is a compression technique that replaces repetitive text with pointers to the first occurrence of that text. This is particularly helpful in CSS files, where a lot of repetition is bound to happen. The same goes for Javascript files, where there could be multiple references to a particular function; the function name becomes the repetitive text that gets replaced. The replacing pointer is usually smaller, i.e. fewer characters than the text it replaces, and this is how Gzipping reduces the file size.

To illustrate this, here’s an example from CSS-tricks.com.

Please allow those savings to sink in!

So this means your build pipeline needs just one more step where you minify and gzip your assets, perhaps before you push them to a CDN. Some servers can gzip assets for you, but that’s a story for Part 3 of this series.

5. The Service Worker


If you follow most of my posts or know me, you’d know that I am a huge fan of PWAs. Progressive Web Apps are the web’s successful attempt at sitting at the native-boys’ table and claiming a spot on the user’s home screen. This technology allows a web app/site’s icon to be saved on the user’s home screen. This obviously means that the app/site must then be available on demand, even when the user’s device is offline! Like, no more dinosaur!

At the heart of PWAs is the service worker. This is a special type of Javascript Worker: scripts that run in the background, in their own threads. Service workers are popularly known for acting as a caching proxy between the network and the client. With this, you can cache all the assets required for your app to boot up and give value to the user even when they are offline. This obviously means offline capabilities have to be factored in from the design stage.

Beyond offline use, service workers are used to significantly speed up subsequent loads of a web app/site. This is done by implementing one of the caching strategies described here depending on the nature/priority of the files to be cached.
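As a sketch, the “cache falling back to network” strategy at the heart of most of those caching strategies boils down to a tiny function. Here the cache lookup and network fetch are passed in as functions (matchFn and fetchFn are hypothetical names I’ve chosen) so the logic is visible outside a browser; in a real service worker they would be caches.match and fetch:

```javascript
// Cache-first: serve the cached response if we have one,
// otherwise fall through to the network
async function cacheFirst(request, matchFn, fetchFn) {
  const cached = await matchFn(request);
  return cached || fetchFn(request);
}

// In an actual service worker this is wired up roughly like:
// self.addEventListener('fetch', (event) => {
//   event.respondWith(cacheFirst(event.request, (r) => caches.match(r), fetch));
// });
```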

There are a ton of tutorials out there on how to write your own service worker, but I highly recommend just using WorkboxJs. This library generates the service worker for you, even at build time, based on the configuration you give it. I wrote about it in this post.


Workbox, the easier way of adding a service worker to your web app (codeburst.io)

6. Utilize Web Workers for huge tasks

It’s 2018! The year the main thread becomes the UI thread. Web workers are perfect for offloading potentially compute-heavy tasks. Much like service workers, web workers run in their own thread, separate from the main thread. They have a different spec from service workers, and they are very useful for compute-heavy work like applying filters to images, searching a large data set, etc.

I recently had a problem with one of the apps I was building for my small startup, frello. I noticed reduced performance on mobile when searching through a data table. To solve it, I moved the search function into a web worker. Essentially, onkeyup, the data array and the search keyword are passed to the worker, which does the searching based on that keyword. The results are passed back to the main thread using postMessage() and displayed in the table. A lot of jank was removed and there were visible performance gains, as the keyboard no longer froze when typing.
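A sketch of that setup, assuming the rows are plain objects (the field names, file name and renderTable function below are illustrative, not from frello’s actual code):

```javascript
// The pure search logic: case-insensitive match across every field of a row
function searchRows(rows, keyword) {
  const q = String(keyword).toLowerCase();
  return rows.filter((row) =>
    Object.values(row).some((value) => String(value).toLowerCase().includes(q))
  );
}

// search.worker.js would wrap it like this:
// self.onmessage = (e) => {
//   const { rows, keyword } = e.data;
//   self.postMessage(searchRows(rows, keyword));
// };

// And the main thread, onkeyup, stays free to handle the keyboard:
// worker.postMessage({ rows, keyword: input.value });
// worker.onmessage = (e) => renderTable(e.data);
```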

Another route I’d like to explore is making both threads share access to the same data-store so that, onkeyup only the search phrase is passed. I’m sure there would be more performance gains from doing that.

Not convinced? Check out greenlet by Jason Miller, a small script that moves any async code onto a worker. Tasks like retrieving data from a remote service via fetch() can then be done in a web worker. At first I thought this whole craze was insane, but come to think of it, it makes a lot of sense.

7. Code Splitting

This means breaking your application into chunks that are loaded asynchronously, following the typical path a user takes through your web app/site. For example, assume your SPA has a login page. With code splitting, you can separate the code that renders the first page the user visits from the code that handles the login; the latter is loaded asynchronously only when the user navigates to the login part. This way, on the initial load the user’s browser downloads just enough Javascript (and/or CSS) to show that first part. Check out this great article that explains how to code split VueJs applications with Webpack.


3 Code Splitting Patterns For VueJS and Webpack (medium.com)

The asynchronous loading of the chunks is made possible by the ES6 dynamic import() function that is not a function (don’t ask me why; this might help you though). import() loads modules asynchronously, in a non-blocking way.
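To see the shape of it: import() returns a promise for the module’s exports. In the sketch below a data: URL stands in for a real chunk file so the example is self-contained; in an app you’d write something like import('./login.js'):

```javascript
// A tiny inline module, playing the role of a split-off chunk
const tinyModule =
  'data:text/javascript,' +
  encodeURIComponent('export const greet = () => "hi";');

// import() resolves with the module's namespace object once it loads
import(tinyModule).then((ns) => {
  console.log(ns.greet()); // the "chunk's" exports are now usable
});
```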

This does mean the browser has to asynchronously load the chunks on every page load. But if you have a service worker installed and you’re pre-caching all your assets correctly, that asynchronous load may happen only on the first visit (or never), because the chunk is stored in the cache.

Code splitting significantly reduces the size of bundled Javascript files. This ultimately improves the page load as highlighted in the second technique above.

Conclusion

I deliberately left out The Stack, because that is a highly opinionated discussion altogether. However, some Javascript frameworks make your view layer a whole lot lighter and thus faster to load, leaving you with fewer reactions to the different angles you could use to view the same orange-ish color. (Read between the lines 😂.) UI libraries also play a significant role, but that depends on the designer or organisation’s design-pattern inclinations, end-user demographics, and many other factors.

Feel free to share your opinions, additions, suggestions etc. Thank you ❤️.