One of the exciting things about the Internet is that anyone with a PC and a modem can publish whatever content they can create. In a sense, the Internet is the multimedia equivalent of the photocopier. It allows material to be duplicated at low cost, no matter the size of the audience.
Those words come from Bill Gates’s essay “Content is King.” Ever since, the phrase has been treated as prophecy among content producers. Nowhere is the discussion felt more than in the SEO community: the experts who try to rank content on the web via the main commercial search engines.
Every so often, the SEO community debates whether content is still king or whether it has been replaced by something else.
There is no doubt that content matters. However, content depends upon context, users’ intent, and the authoritativeness of its author. Those factors became even more evident after a tornado swept away the traffic of many web properties.
In the first week of August 2018, an update the SEO community nicknamed the “Medic Update” hit many sites hard. It was a change to the core of Google’s algorithm that shifted millions, if not billions, of visits from some sites to others. As Google put it:
This week we released a broad core algorithm update, as we do several times per year. Our guidance about such updates remains the same as in March, as we covered here: https://t.co/uPlEdSLHoX
— Google SearchLiaison (@searchliaison) August 1, 2018
Google’s Search Liaison, who handles communication with the outside world (especially SEO practitioners), gave more detail on such algorithm changes:
Each day, Google usually releases one or more changes designed to improve our results. Some are focused around specific improvements. Some are broad changes. Last week, we released a broad core algorithm update. We do these routinely several times per year….
— Google SearchLiaison (@searchliaison) March 12, 2018
Some of the biggest losers of this update appear to be YMYL (“your money or your life”) websites. In short, these are sites offering guidance that can affect your life, be it physical or financial, and they were transformed by this algorithm change. As reported by Search Engine Land, a list of such sites lost anywhere between 30% and 50% of their traffic in a week.
As Bill Slawski pointed out:
Content is not king on the Web; meeting a searcher’s intent is. When someone searches for pizza at lunchtime, they likely aren’t looking for a history of Pizza, but rather a slice or two. #SEO #Intent #Context
— Bill Slawski ⚓ (@bill_slawski) August 9, 2018
What might seem trivial is not. In fact, discussions about the perfect length of a piece of content go on in the SEO community all the time. However, content alone is not the point. Google is a tool for finding answers to questions that might be philosophical or very practical.
Thus, when I search “pizza,” depending on the time of day, I might be looking for something to eat. Alternatively, if I’m just curious, I might be looking for the history of pizza. Those cases show two completely different intents: one is extremely practical (I need to eat right now!), the other informational (I want to know the historical context of pizza).
If you were to listen to the usual saying of “content is king,” you’d probably waste your time putting together a two-thousand-word piece that covers the history of pizza along with every other question a user might have.
However, that same user might only be looking for specific information that could just as well be answered by a 300-word recipe describing how to make a pizza. In this scenario, thinking in terms of “content is king” will probably kill your business.
For many years, speculation about when Google would become semantic has been going on. For those who don’t know how Google used to work: in the past, Google relied primarily on two signals, backlinks and keywords. Each time a user typed a keyword into the search box, results were produced by matching that keyword against web pages. As simple as that.
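The keyword-matching model described above can be sketched as an inverted index: each word points to the set of pages containing it, and a query returns the pages matching all its words. This is a toy illustration, not Google’s actual implementation; the page names and texts are invented for the example.

```python
from collections import defaultdict

# Toy corpus: page -> text. URLs and content are illustrative only.
pages = {
    "pizza-history.example.com": "the history of pizza in naples italy",
    "pizza-delivery.example.com": "order a pizza delivery near you right now",
    "bread-recipes.example.com": "simple bread recipes for beginners",
}

# Build the inverted index: keyword -> pages containing it.
index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

def search(query):
    """Return pages containing every keyword in the query."""
    words = query.lower().split()
    if not words:
        return set()
    results = set(index[words[0]])
    for word in words[1:]:
        results &= index[word]
    return results

print(sorted(search("pizza")))          # both pizza pages match
print(sorted(search("pizza history")))  # only the history page
```

Note that this purely lexical matching knows nothing about intent: “pizza” returns the history page and the delivery page with equal standing, which is exactly the limitation the semantic shift addresses.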
Although the mechanism might sound simple, Google managed for years to crawl and index billions of pages in a snap of the fingers. Also, Google’s first algorithm, PageRank, represented an incredible innovation in the commercial space. When Google launched, the search industry was a plethora of engines that, rather than serving relevant results, served spammy content full of banners.
Google changed that. It then started to grow at a speed no one would have imagined (not even Larry Page). For years, the most significant issue was managing that hypergrowth without letting the company implode. Google survived it, became the top search engine in the world, and finally controlled the web.
However, the birth of the SEO industry also represented a challenge for Google, which had to find ways to overcome the “manipulation” of its algorithm. While, in general, SEO practitioners operated according to Google’s guidelines (so-called White Hat SEO), many others tried to game the system (so-called Black Hat SEO). Thus, Google rolled out several algorithm updates, like Panda and Penguin, to wipe out those trying to game the system.
Another direction Google is moving in is integrating its semantic engine, built on top of its core algorithm, as the main propeller for search. In 2013 and 2015, Google updated its core algorithm with Hummingbird and then with a component of it called RankBrain. In short, those changes were mainly intended to let Google read information on the web against the massive knowledge graph it had created. This is a semantic technology that allows a search engine to store billions of simple logical statements (like “I am Gennaro” and “Gennaro knows John”) which, connected via logical relationships, enable the graph to grow exponentially.
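The simple logical statements above are commonly modeled as subject–predicate–object triples. Here is a minimal sketch, assuming the article’s own examples as facts (everything else, including the helper names, is illustrative and not how Google’s Knowledge Graph is actually implemented):

```python
# A knowledge graph as a list of (subject, predicate, object) triples.
# The first two facts mirror the article's examples; the rest are invented.
triples = [
    ("Gennaro", "is_a", "Person"),
    ("Gennaro", "knows", "John"),
    ("John", "is_a", "Person"),
    ("John", "lives_in", "New York"),
]

def answer(subject, predicate):
    """Answer a direct question: which objects satisfy <subject> <predicate> ?"""
    return [o for s, p, o in triples if s == subject and p == predicate]

def neighbors(entity):
    """Entities one hop away: chaining these hops is how the graph grows."""
    outgoing = {o for s, _, o in triples if s == entity}
    incoming = {s for s, _, o in triples if o == entity}
    return outgoing | incoming

print(answer("Gennaro", "knows"))  # a direct answer, no web page needed
print(neighbors("John"))           # everything one relationship away
```

The point of the structure is the second function: because facts are stored as relationships rather than documents, new entities become reachable simply by chaining hops, which is what lets the graph answer direct questions.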
Today, Google’s knowledge graph powers a good chunk of Google’s queries. Thus, Google is finally able to give its users answers to direct questions.
As of now, Google can also offer several perspectives on a single question. For instance, if you ask Google US “who’s Gennaro Cuofano,” you might see this:
Google uses two different features (the featured snippet and the knowledge panel) to answer the same question from two different perspectives. However, there is one interesting point to notice: the knowledge panel is by far the more stable feature in Google’s search results.
While featured snippets might come and go, knowledge panels usually stick. That’s because knowledge panel information comes from Google’s proprietary knowledge graph, while featured snippet information comes from a web page. Thus, if another web page comes along with a better answer to that question (and a better SEO strategy), you might lose the snippet.
That is also why knowledge panels might prevail over featured snippets within Google Assistant. Thus, the knowledge panel is, and will be, critical space on Google’s white page.
That space is already worth billions. Where the information in my knowledge panel previously came from an Amazon page I had created, as you can notice from the picture, that same information now comes from a Google Books page, which I did not create. This is a way for Google to take back control of its search results.
That is also the avenue that leads directly to voice search. Knowledge panels are often featured as answers within Google’s digital assistants. However, publishers also fear that those same features, intended to give users direct answers, might even kill their business.
In the article “Ok Google, Are You In Search Of A Business Model For Voice?” I hypothesized four business models that might be feasible as we move toward voice search. At this stage, though, one thing is clear: a nice, in-depth piece of content that is out of context, or unable to capture the user’s intent, might be worth nothing. In an era where multiple devices will compete for our attention by tapping into our intents with ever more accuracy, we’ll need to rethink the way we deliver content.
For instance, an essay might always be the best writing format to meet a user’s intent on a philosophical matter; think of someone searching for “the meaning of life” (apparently more than 200k searches per month happen on Google for that). In other cases, where the query has a very practical and transactional intent (I just typed “pizza” in the search box around dinner time, and all I get is restaurants), understanding the context is crucial!
Back in 1996, Bill Gates wrote the famous essay that declared “Content is King.” Ever since, discussions have gone on about whether this is still true (and you’ll probably keep finding articles asking it year after year). While content remains critical, it is also important to remember that the majority of web traffic today goes through Google (the most popular site according to Alexa).
People searching through Google might have philosophical questions, but in most cases their questions are practical ones. In fact, as of 2017, 86% of Google’s revenues still came from advertising, and while you can still sell something to someone looking for the meaning of life, it’s far easier to sell something to someone who doesn’t even bother asking.
Based on that, at a commercial level, content producers who can capture the context of a query and understand the intent behind it (the deep concerns that get users to act, or transact) will win!
Planning for the tasks available to a searcher on a site, and making those available to that visitor is anticipating the intents behind a visit (How to, Buy, Book, Watch, etc. )
— Bill Slawski ⚓ (@bill_slawski) September 20, 2018
Originally published at fourweekmba.com on September 20, 2018.