
SEO Analysis: How to Improve JavaScript Module for "Related Stories" & HackerNoon's Internal Linking

by Murrough Foley, August 30th, 2022

Too Long; Didn't Read

For impatient SEOs: HackerNoon should be generating much more organic traffic than it is. After a deep dive on the site, it's clear that despite the huge number of great links, the top-quality content, and great branding, HackerNoon is struggling to increase online visibility due to poor internal linking and a reliance on a "Related Stories" JavaScript module which (probably) doesn't fire for Googlebot. If you find this beneficial and have an open-source or commercial project you want me to take a look at, connect with me, Murrough Foley, on LinkedIn.


Launched in 2016 to cover tech, entrepreneurship and productivity, HackerNoon has gone through an amazing transformation over the years.

From covering a very limited number of topics at the outset, David Smooke and team have built a platform that people love to publish on with a stringent editorial process that ensures all the articles provide value.

This SEO review is part of a series looking at similar platforms where people can publish their tech stories across the web. It grew out of my own curiosity about where best to publish, and each of the platforms in the series has its own positives and negatives.

But first of all, let's take a look at where HackerNoon stands on organic traffic.

Has HackerNoon Organic Traffic Stagnated?

Below is a screen grab of Ahrefs' organic traffic estimates. It does not reflect social or direct traffic, and even the organic estimates themselves can be wildly inaccurate: Ahrefs, despite being a great tool, just can't track all the long-tail keywords, especially around the tech and emerging-tech sectors. It is, however, useful for indicating the general trends of a site, and the estimates below should be seen in that light:

HackerNoon was making great progress until around May 2020, and then SEO traffic seemed to drop off in a big way.

This analysis tries to isolate what happened at that time and how to fix it. Like all things in SEO, there is usually more than one factor at play, but in this case I believe I found the main issue.

Before I dive in and show the timeline, we need to set a bit of context first.

Why Projects Rank and Fail To Rank

Developers look first and foremost at functionality, then at the UX of a web app or site; SEO, unfortunately, often comes as an afterthought. Take a look at the sheer number of areas a developer has to be familiar with, then consider the growth of new technologies, and you can understand why SEO best practice sometimes falls through the cracks.

But the consequences of ignoring SEO can be pretty sobering, as visitors fail to find a new website or organic growth is stunted.

Tried & Tested Platforms

The majority of the internet is built on reliable, tried-and-tested platforms like WordPress (love it or hate it), Magento, Squarespace, phpBB and Shopify. HackerNoon is built on its own CMS platform.

Many of these platforms are themselves commercial enterprises with a paying user base that won't be happy if an update or change to the platform affects organic visibility. They have a very real commercial need to get the SEO bit right. Any disruption to organic rankings on their platforms will see an exodus of customers and a drop in income.

Interestingly, Squarespace has worked hard over the years to improve its reputation and woo the SEO crowd. I'm not sure it's worked.

The point here is that most of the web relies on tried and tested platforms that know, through years of trial and error, what works and what doesn't.

We Built This Internet On HTML

Google, Bing and the growing number of newer search engines were all built around a simple idea:

  • Parse server-side HTML
  • Understand the topic and the questions a page (or URL) can answer
  • Process the number and quality of inbound links
  • Rate/rank the page accordingly for a variety of terms
  • Serve the most useful results to the user

This hasn't really changed, despite all the new technologies and updates that cause hand-wringing. It's still about quality, targeted content and links.

Google Is a Business That Wants Data Structured in Specific Ways

Google is a business, and there are costs associated with every page crawled, every article parsed and every link followed. Time and time again it has pointed webmasters and developers towards best practices that help provide users with better results at a cheaper cost to Google. These methods include:

  • Limiting the number of indexable pages to quality, usable content
  • Using Schema markup to help parse and process information
  • Simplifying the crawling process with clean, well-organised code
  • Improving site crawlability via hierarchical and intuitive internal linking

Google wants us as webmasters, SEOs, developers and entrepreneurs to make their job easy, reduce their costs and assist them in providing the best search results possible.

And in return, we get improved search engine visibility for our projects, and more clicks and visitors, which may mean more income.

It's a pretty easy-to-understand proposition.

The Growth Of JavaScript & Dynamic Content

Client-side rendered JavaScript is great and has given rise to some fantastic sites and web apps with great functionality.

A dynamic page or dynamic module can indicate the activity on a site, provide immediate feedback to a user, or be part of a full-fledged app that works from your browser. Isn't technology great?

But what's great for the user isn't always great for a search engine, and there are two main issues with client-side rendered content:

  1. It increases the costs for Google
  2. It may delay crawling or content may not be crawled at all

I won't go into too much detail about the dangers of client-side rendering or JavaScript SEO, because frankly it has been covered really well elsewhere.

Suffice it to say that a page should not rely completely on dynamic content. Important links and static content should be available to Googlebot's initial crawl. This speeds up indexing, ensures the page context (its primary topic) is understood, and means the site is crawled quicker.
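You can approximate that initial crawl yourself: fetch the raw markup and count the links that exist before any JavaScript runs. Below is a minimal sketch in TypeScript (Node 18+, using the built-in fetch); the URL is a placeholder and the regex-based extraction is deliberately naive.

```typescript
// Approximate Googlebot's first, HTML-only pass: fetch the raw markup and
// count internal links before any JavaScript has a chance to run.
// Node 18+ (built-in fetch); the URL and the naive regex are illustrative only.
async function countRawInternalLinks(pageUrl: string): Promise<number> {
  const res = await fetch(pageUrl);
  const html = await res.text();
  const origin = new URL(pageUrl).origin;
  // Pull href values out of anchor tags in the server-rendered HTML.
  const hrefs = [...html.matchAll(/<a\b[^>]*href="([^"]*)"/gi)].map(m => m[1]);
  // Keep only internal links: relative paths or same-host absolute URLs.
  return hrefs.filter(h => h.startsWith("/") || h.startsWith(origin)).length;
}

// Hypothetical article URL, for illustration.
countRawInternalLinks("https://hackernoon.com/some-article")
  .then(n => console.log(`Internal links in the raw HTML: ${n}`));
```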

Here's Erick Hendriks touching on Google's crawl and rendering processes, from 2019. He also briefly talks about JavaScript, the Web Rendering Service (WRS) and CPU costs.

JavaScript on HackerNoon

So what happens when we disable JavaScript on HackerNoon?

Well, the first thing that happens on an article page is that the megamenu disappears. Comparing a typical page rendered with JavaScript against the old-school HTML version, the number of internal outbound links drops from around 450 to about 5.

So without JavaScript, HackerNoon has almost no internal linking and is completely reliant on Google's ability to complete its process of crawling, queuing, rendering, processing and, hopefully, indexing the links found in the menu.
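You can reproduce this comparison with a headless browser. The sketch below uses Puppeteer to count same-site links with JavaScript enabled and then disabled; the URL is a placeholder, and exact counts will vary from page to page.

```typescript
// Count internal links on a page with JavaScript enabled vs disabled,
// roughly reproducing the ~450 vs ~5 comparison described above.
// Requires: npm install puppeteer
import puppeteer from "puppeteer";

async function countLinks(url: string, jsEnabled: boolean): Promise<number> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.setJavaScriptEnabled(jsEnabled); // simulate an HTML-only crawl
  await page.goto(url, { waitUntil: "networkidle0" });
  const count = await page.$$eval('a[href^="/"]', links => links.length);
  await browser.close();
  return count;
}

(async () => {
  const url = "https://hackernoon.com/some-article"; // hypothetical URL
  console.log("With JavaScript:   ", await countLinks(url, true));
  console.log("Without JavaScript:", await countLinks(url, false));
})();
```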

To be fair, Google has gotten much better at this over the last few years, and I've no doubt the devs at HackerNoon have done a great coding job.

Is this the reason that HackerNoon's organic traffic is flat? Probably not, but it does highlight the next issue.

Internal linking on HackerNoon is, for the most part, generated via JavaScript.

The Magic Of Internal Linking

SEOs love links, and we spend a large part of our time on the clock building them. Whether it's an outreach email or a controversial bit of content and its promotion, a large part of the work week is centered around building links.

But internal links are the best! They are free, and as a rule of thumb, 4 internal links are as good as one decent inbound external link. (This rule of thumb is nonsense, but it's useful if you need to quantify and prioritise internal linking for a customer or client who wants a cost-benefit ratio.)

I'm not going to go into the nitty-gritty of content silos; Bruce Clay wrote about them 15 years ago and not a lot has changed since then. A large amount of highly related content, interlinked in a hierarchical way, will increase visibility for all the linked pages. It shows Google that a domain has in-depth specialist knowledge, and you get a boost for covering a topic from several different angles.

HackerNoon & Internal Linking Topics

After looking at a number of similar platforms over the last few weeks, one issue crops up again and again with sites that leverage user-generated content.

A lack of internal linking and an over-reliance on the "similar articles" module.

Unfortunately, across the multiple platforms I've looked at, users don't cross-link similar articles. And why would they? There is no prompt or visual aid to help them find other great content on the platform about the topic they are writing about.

Most of these sites rely on a "similar articles module" to provide these internal cross-links as suggestions at the end of the article.

But there are two issues with the way HackerNoon has implemented this module:

  • The internal articles linked are too often unrelated
  • The module only loads on scroll

Unrelated Articles Linked

Most of these modules use a mix of signals to generate the related articles, including the user, the tags and the date.

HackerNoon's suggestions are completely dynamic (this is a problem), and it shows articles based on date, i.e. the freshest articles. This is great for UX and for highlighting fresh articles throughout the site.

But...

By not linking articles on similar topics, and by using dynamic links based on recency, HackerNoon misses out on the opportunity to build any content silos and the search visibility that goes with them.

Perhaps this is by design but it's awful for SEO.
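For contrast, here is a minimal sketch of what a topic-aware picker could look like: score candidates by shared tags and use freshness only as a tie-breaker. The Article shape and data source are my own assumptions, not HackerNoon's actual model.

```typescript
// Topic-aware "related stories": rank candidates by tag overlap, with
// recency only as a tie-breaker. Because the output depends on tags rather
// than the clock, the links stay stable between crawls and build silos.
interface Article {
  slug: string;
  tags: string[];
  publishedAt: Date;
}

function relatedStories(current: Article, candidates: Article[], limit = 4): Article[] {
  return candidates
    .filter(a => a.slug !== current.slug)
    .map(a => ({
      article: a,
      // Topical relevance: how many tags the candidate shares with this page.
      shared: a.tags.filter(t => current.tags.includes(t)).length,
    }))
    .filter(x => x.shared > 0) // never suggest a completely unrelated article
    .sort((x, y) =>
      y.shared - x.shared ||
      y.article.publishedAt.getTime() - x.article.publishedAt.getTime())
    .slice(0, limit)
    .map(x => x.article);
}
```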

Related Articles Module Only Loads OnScroll

The "related articles" module is loaded into the DOM by a user scroll event so these links may not be crawled or recognised by Google at all.

Googlebot doesn't scroll through a page the way a user does, but it does use a couple of tricks to fully load pages that contain JavaScript events. Googlebot sets the viewport to 12,140 pixels in height, which in many cases will trigger a bunch of JavaScript events, such as lazy-loading content.

But in my tests, the HackerNoon "related articles" and "tags" modules aren't triggered, and so these links wouldn't be crawled.
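If the module must stay lazy-loaded, swapping the scroll listener for an IntersectionObserver would at least give Googlebot a chance, since its tall viewport puts the module "on screen" immediately even though no scroll event ever fires. A minimal sketch, with loadRelatedStories() standing in for whatever actually fetches the module:

```typescript
// Googlebot loads pages in a very tall viewport but never scrolls, so a
// bare scroll listener never fires for it. An IntersectionObserver does fire
// once the element is within that (tall) viewport. loadRelatedStories() is
// a hypothetical stand-in for the code that fetches and renders the module.
declare function loadRelatedStories(): void;

const relatedEl = document.querySelector("#related-stories");

// Fragile: depends on a scroll event that Googlebot never produces.
// window.addEventListener("scroll", loadRelatedStories, { once: true });

// More robust: fires as soon as the element is inside the viewport.
if (relatedEl) {
  const observer = new IntersectionObserver(entries => {
    if (entries.some(e => e.isIntersecting)) {
      loadRelatedStories();
      observer.disconnect();
    }
  });
  observer.observe(relatedEl);
}
```

The safest option of all, of course, is to render the links in the initial HTML and skip lazy-loading entirely.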

HackerNoon's Unique Position

HackerNoon has developed a great reputation over the years and there are some very talented people who take the time to write on the platform. An important part of developing this reputation for quality in-depth articles is the HackerNoon editorial process.

Low-quality, spammy pieces don't make the grade, and editors work with content creators to make sure articles strike a balance between technical or specialist insight and readability.

This editorial control baked into the platform offers HackerNoon a huge advantage over its competition. The editorial process allows the HackerNoon team to add internal links to other relevant (evergreen) articles across the site.

I'm unsure whether the site's terms of service would allow this to be done retroactively. I'm guessing it would, outside of the corporate partners HackerNoon has developed over the years.

The Evidence For Organic Traffic Growth Being Limited On HackerNoon

From the Ahrefs organic traffic estimation tool, we can see that the issue raised its head in April of 2020.


HackerNoon's Evolution

HackerNoon has, as you would expect, evolved over the years, adding features, refining the user experience and speeding up its loading time. The areas we are interested in are the "tags" and "related articles" modules located at the bottom of the page.

HackerNoon November 2018

In late 2018, it looks like the tag-page links were hardcoded and there were two profile cards at the bottom of the article.

It doesn't look like there was a "similar articles" module, but it may have been loaded in a different way.

HackerNoon August 2019

There was iteration throughout early 2019, and by August the bottom section contained:

  • Links to the tag pages
  • More by author
  • More related stories

Interestingly, the articles linked in "related stories" don't change or update over time. They look static, which creates permanent connections between related pages and is great for SEO.

HackerNoon May 2020

As we saw from the traffic estimation tool, May saw a huge drop in traffic site-wide. So what happened in May 2020? The tag links are still there, but the direct links to the author's other posts have been removed, and the related stories module has been reworked completely.

The related articles module now uses a dynamic process to show fresh and sponsored articles that have little topical relevance to the rest of the content on the page.

HackerNoon Today

Remember: given the way the "tags" and "related stories" modules are loaded now, it's highly unlikely that these links are seen at all. I would have to dig into the server logs to confirm this.

Regardless of whether Googlebot parses these links, you can see that suggested articles are based on time rather than topic and so don't provide an SEO benefit.

Two Tests To Confirm The Issue

Nobody wants to do unnecessary work, and if you are proposing major changes to a customer or client, it makes sense to show them low-risk proof that the invested time and money will pay off. I'm sure the board and developers at HackerNoon are no different. So here are two low-risk, low-effort ways to confirm that the "related articles" module needs to be rethought and that page linking needs to be part of the editorial process.

Render the "tags" module

Reworking the "related articles" module would be a huge endeavour that would require a lot of thought, consultation and testing. For a quick test this is not feasible.

The next obvious choice is the "tags" module. Loading it with the main content should be a lot easier and would allow a huge amount of link equity to move through the site.

I would expect to see site-wide improvement within about 2-3 months of this change, with recent articles in the tag pages benefiting the most.
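As an illustration of what "loading it with the main content" means, here is a minimal server-side sketch using Express; getTagsForArticle() and renderArticleBody() are hypothetical stand-ins for HackerNoon's own data layer.

```typescript
// Emit tag links in the server-rendered HTML so they are present on
// Googlebot's first pass instead of being injected client-side.
// Requires: npm install express
import express from "express";

// Hypothetical data-layer functions standing in for the real CMS calls.
declare function getTagsForArticle(slug: string): Promise<string[]>;
declare function renderArticleBody(slug: string): Promise<string>;

const app = express();

app.get("/:slug", async (req, res) => {
  const { slug } = req.params;
  const tagLinks = (await getTagsForArticle(slug))
    .map(tag => `<a href="/tagged/${encodeURIComponent(tag)}">${tag}</a>`)
    .join(" ");
  res.send(`<!doctype html>
<html><body>
  <article>${await renderArticleBody(slug)}</article>
  <section id="tags">${tagLinks}</section>
</body></html>`);
});

app.listen(3000);
```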

Choose Two Topics & Manually Interlink

Interlinking within the body of an article is the most powerful type of internal link, as Google is able to look at the context of the hyperlink and draw on the anchor text to understand the target page. The HackerNoon editorial process probably allows the team to go in and make small edits, such as adding internal links.

So how do you choose a topic or set of articles and track them? Ideally, you want to look for pages that sit on page 2 for a keyword or group of keywords, add internal links with related anchors to that set of pages, and wait.

The coding interview URL from the example above is a reasonable place to start, as HackerNoon has a large number of related articles that could be linked.
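One low-effort way to build that test set is to filter a Search Console performance export for "page 2" rankings. A sketch, assuming a CSV export with page, query, clicks, impressions, CTR and position columns in that order:

```typescript
// Pick interlinking candidates: pages sitting on "page 2" (average position
// 11-20) for a query, from a Search Console performance export.
// The CSV column order is an assumption; adjust to match the actual export.
import { readFileSync } from "node:fs";

const rows = readFileSync("gsc-performance.csv", "utf8")
  .trim()
  .split("\n")
  .slice(1) // skip the header row
  .map(line => line.split(","));

const candidates = rows
  .map(([page, query, , , , position]) => ({ page, query, position: Number(position) }))
  .filter(r => r.position >= 11 && r.position <= 20);

console.table(candidates);
```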

Other SEO Issues That Need Some TLC

There were some other things I came across that could do with a little attention, so, in no particular order:

Tag Pages

The tag pages provide little inbound traffic, and they could be indexed much better. Why not add a couple of hundred words per page to give context to the list of internal links and at least give them some value beyond UX?

site:hackernoon.com/tagged

Profile Pages

Profile pages don't contain an H1. If the SEO purpose is for a user's profile to rank high in the SERPs when their name is Googled, the lack of a clear H1 containing the username is an oversight.
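The fix is small. A sketch, with a hypothetical Profile type and template function:

```typescript
// Give the profile page a crawlable H1 containing the username, so Google
// has an unambiguous signal for name searches. The Profile shape is hypothetical.
interface Profile {
  handle: string;
  displayName: string;
}

function renderProfileHeader(profile: Profile): string {
  return `<h1>${profile.displayName} (@${profile.handle})</h1>`;
}
```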

Article Schema
However the article schema is generated from the body of the content, it's full of encoding errors. If the idea of schema is to provide the data in a clean and structured way, that's not currently being achieved.
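A common source of this kind of error is hand-concatenated JSON. Building the schema as a real object and serialising it with JSON.stringify escapes quotes, newlines and other awkward characters automatically; the field values below are illustrative only.

```typescript
// Build Article schema as an object and let JSON.stringify handle escaping,
// rather than concatenating strings (which breaks as soon as the body text
// contains quotes, newlines or emoji). Field values are illustrative.
const articleSchema = {
  "@context": "https://schema.org",
  "@type": "Article",
  headline: 'An article title with "quotes" that would break naive templating',
  author: { "@type": "Person", name: "Murrough Foley" },
  datePublished: "2022-08-30",
};

// Emit the JSON-LD block for the page <head>.
console.log(
  `<script type="application/ld+json">${JSON.stringify(articleSchema)}</script>`
);
```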

Conclusion

HackerNoon is a great site and has built an industry-wide brand that is synonymous with insightful and entertaining tech content. They've built partnerships with the top tech companies in the world and steadily introduced great new features to the platform.

With a huge number of inbound linking domains and all this content, plus topical interlinking baked into the editorial process and a re-engineered "related articles" module, I have no doubt their organic traffic would skyrocket.

If you found this beneficial and have an open-source or commercial project you want me to take a look at, connect with me, Murrough Foley, on LinkedIn.