Google just turned twenty, and in web years that makes it a dinosaur. Search, which has been Google's main focus for two decades and for years its most innovative field, may now be a thing of the past. Indeed, search appears to be more of a burden than a real strength as we move toward voice. When Sundar Pichai announced Google Duplex in May 2018, the giant from Mountain View finally showed the world what it could do. An AI able to conduct conversations with humans, with remarkable comfort and fluidity, made clear that Google is far further ahead in natural language processing and understanding than anyone had imagined.
For decades, users treated search as a way to find a list of pages related to a topic or need. Even if you asked Google a complex question, it didn't answer; so users typed simple keywords to find what they were looking for. Yet starting in 2012, Google began building a massive index of the web's meanings, called the Knowledge Graph. Today that Knowledge Graph serves a good chunk of searches on the internet.
The main difference between a traditional search engine and a semantic search engine lies in the latter's ability to answer complex questions, along two lines. First, it understands what the user is looking for. For instance, if I search for "Apple," am I looking for the fruit or the company? The semantic engine infers meaning by reading the context of the search; it can therefore substitute the imperfect query with a synthetic, better-suited one. Second, the semantic engine extracts meaning from the pages around the web. In the past, Google tried to understand a web page mainly through the number of references (so-called backlinks) the page received, among many other factors.
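To make the first point concrete, here is a toy sketch of context-based disambiguation. This is entirely illustrative and invented, not how Google actually works; real semantic engines use vastly richer signals than word overlap, and the entity names and context words below are assumptions for the example:

```python
# Toy entity disambiguation: guess which "apple" a query means by
# counting how many query words overlap with the context words
# associated with each candidate entity. Purely illustrative.
CONTEXT_WORDS = {
    "Apple Inc. (company)": {"iphone", "stock", "ceo", "mac", "shares"},
    "apple (fruit)": {"pie", "recipe", "calories", "tree", "juice"},
}

def disambiguate(query: str) -> str:
    """Return the candidate entity whose context best matches the query."""
    words = set(query.lower().split())
    # Score each candidate by its overlap with the query's words.
    scores = {
        entity: len(words & context)
        for entity, context in CONTEXT_WORDS.items()
    }
    return max(scores, key=scores.get)

print(disambiguate("apple pie recipe"))   # resolves to the fruit sense
print(disambiguate("apple stock price"))  # resolves to the company sense
```

The point of the sketch is the shift in unit of analysis: the engine scores *entities* against the query's context, rather than matching the keyword string against documents.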
While this mechanism is still essential, today Google can finally extract meaning directly from web pages through a vocabulary called Schema.org. The meaning extracted from web pages becomes part of Google's Knowledge Graph, which powers Google's search results and offers new visualizations that make the user experience richer, more complete, and more compelling. Thus, for the first time, Google can offer a whole journey on its search results pages. Even though Google is still calibrating this feature (like many others), it opens up a lot of questions for the publishing industry, and for the future of the web.
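In practice, a publisher exposes this meaning by embedding structured data in the page using the Schema.org vocabulary, typically serialized as JSON-LD inside a script tag. A minimal sketch in Python follows; the `@context`, `@type`, and property names are part of the Schema.org vocabulary, while the headline, author, and date values are invented for illustration:

```python
import json

# Minimal Schema.org "Article" markup, serialized as JSON-LD.
# The keys come from the Schema.org vocabulary; the values are
# made-up examples, not real publication data.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Is Search Coming to an End?",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2018-10-13",
}

# Publishers drop this into a <script type="application/ld+json">
# tag in the page's HTML; crawlers parse it to feed entities and
# relationships into a knowledge graph.
snippet = (
    '<script type="application/ld+json">'
    + json.dumps(article)
    + "</script>"
)
print(snippet)
```

The design point is that the page now *declares* its own meaning (this is an Article, by this Person, published on this date) instead of leaving the engine to infer it from backlinks and raw text.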
Example of a search powered by Google's Knowledge Graph. When you search for "Larry Page," you see a feature called the Knowledge Panel, which extracts critical information from Google's Knowledge Vault and serves it to users to provide context.
New features like Knowledge Panels and Featured Snippets are taking over Google's search results, and the user experience and journey can now happen on Google itself, above the fold. Experts in positioning web pages on Google (SEO practitioners) have focused for years on how to rank content on Google's first page, which gets 95% of Google's traffic.
Now it has become critical to understand how to get featured above the fold if you want to keep Google as a viable source of qualified traffic for your business. This shift on the search engine results page (SERP) makes it clear that a war is going on: the war for the attention of billions of people across the globe. Yet while search has been the answer for over two decades, it might now be part of the past.
The SERP has been, and still is, Google's most valuable asset. It is where more than five billion user queries flow through and where advertising (don't forget Google is the largest digital advertiser on Earth) gets sold. However, what has been a strength and cash cow for over two decades might become Google's greatest burden.
As of 2017, Google's revenues still came primarily from advertising, which leaves Google limited room to experiment. Last year alone Google ran more than 200,000 experiments that resulted in over 2,400 changes to search, yet those changes are incremental; they can't be breakthroughs. That's because the search results pages are also Google's most critical asset in terms of monetization, which gives the company little flexibility to change drastically the way search works. Imagine a scenario in which Google unleashed its AI: that would break the delicate balance between Google and publishers, which, for two decades, have provided quality organic content beside Google's paid ads.
Therefore, while players like Amazon can start from first principles when thinking about voice and what the next wave of search will look like, for Google that might be harder.
Search is going through a radical shift. As Google becomes more like a portal, it also raises questions about the future of search itself. In fact, on September 24th Google announced it would be rolling out Google Discover. Think of it as Google's feed. Where the feed was typical of the experience offered on social media, it is now becoming a critical part of the Google experience too. With Google Discover, a user no longer needs to type a keyword into the search box to get started. The user will see a set of stories based on her interests, independently of search.
Thus, publishers that for years have structured their content to accommodate Google's guidelines might end up empty-handed!
In this epic battle for control of "the next web" (which might turn out to be completely different from the web as we know it today), the sacrifice bunt seems to be the publishing industry. Just as publishers learned to use search engines and social networks as the main distributors of their content, those platforms suddenly stopped sending back organic traffic.
Both on desktop and mobile, no-click searches (those in which the user's entire navigation happens within Google's search results page) have increased substantially, and the effect is even stronger on mobile. This means Google is sending less traffic to those sites, and we can expect this trend to consolidate in the near future.
As Slate pointed out in "The Great Facebook Crash," its traffic from Facebook plummeted by a staggering 87 percent. In January 2017, Slate got more than 28 million clicks from Facebook; by May 2018 that had fallen to less than 4 million. As Facebook tries to strengthen "meaningful relationships" (which is a way for it to stay alive after the Cambridge Analytica data scandal), publishers are losing most of their organic traffic from Facebook as well. How are publishers responding?
In this era of walls, publishers too respond with paywalls. Will the publishers that represented the backbone of the information industry for over a century survive the next wave of the internet? Is search coming to an end? And what will it become? These all remain open questions!
I'd like to end this article with a few tips for publishers:
Originally published at fourweekmba.com on October 13, 2018.