
Eli Pariser’s Filter Bubble is Now 5 Years Old…Have We Popped it Yet?

by Alexis Ternoy, February 15th, 2016

The sites we turn to for our information, whether it be news, research or entertainment, filter our results to make them more relevant for us. It is called personalised search and it is now the norm for not only Google search results but also shopping sites like Amazon, news sites like Huffington Post, Facebook and more.

The result is called the “filter bubble” and if you have been anywhere online in the last 5 years, you will have heard of it. Eli Pariser’s 2011 book alerted the world to the idea and we have had our collective eye on the issue ever since.

Why Do Folks Want To Pop This Bubble?

The problem with tailored web results is that we are served up an increasingly narrow set of pleasing results designed to support our personal status quo. Algorithms filter out results you are not likely to find interesting, and in doing so offer you a filtered outlook on the world.
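
To see how quickly that narrowing happens, here is a toy sketch of such a filter. The feed items, the interest scores and the cutoff are all invented for illustration; real recommendation systems are far more elaborate, but the effect is the same: anything the model predicts you won’t click on simply disappears.

```python
# Toy illustration of personalised filtering. The scores stand in for a
# model's prediction of how likely you are to click each item; they are
# made up for this example.

feed = [
    ("Story that flatters my existing views",  0.92),
    ("Story from my favourite outlet",         0.85),
    ("Investigation that challenges my views", 0.30),
    ("Foreign-news piece I never click on",    0.15),
]

CLICK_THRESHOLD = 0.5  # arbitrary cutoff chosen by the platform

# Only "relevant" items survive; the rest of the world is filtered out.
my_bubble = [title for title, score in feed if score >= CLICK_THRESHOLD]
print(my_bubble)
```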

You end up living in Pariser’s “filter bubble”.

Compare this to a child who grows up eating only bologna and mayo sandwiches, never knowing that ‘croissant’ and ‘pain au chocolat’ are out there. Or steak.

Or compare it to someone who has lived in a survivalist compound for a decade and heard only the doom-speak of his fellow preppers…it’s a sheltered world devoid of different opinions, and it bolsters convictions that may not be very well thought out.

Sheltered worlds and filter bubbles alike breed pig-headedness or ignorance.

How Do We Escape This Terrible Algorithm-Driven Provincialism?

If you are interested in maintaining an enlightened world view, and in a world which allows civil discourse to flourish, the filter bubble should scare you. To some, it is even the sign that the End has begun: we are finally devolving into the stupid creatures we were never meant to be.

There is good news, however. Thanks to Eli Pariser’s book and the awareness he has raised with his TED talk on the matter, people are doing something about the problem.

Here are a few glimmers of hope for humanity for you:

The Digital Driver’s License

Dr. Carr-Gregg, who has worked with Google on such matters, suggests a digital driver’s license. In fact, he has had discussions with Google about tech companies recognizing their responsibilities in this area.

To get the digital driver’s license, you would have to correctly answer questions about what you do and don’t believe on the internet. What test-takers clicked on and how they behaved online would also be examined. It is a way to ensure they know how to manage the incredibly thick layers of nonsense found online.

Eli Pariser’s Idea

Eli Pariser’s recommendation for a fix is similar. He calls upon companies like Google to adopt ethical standards. Google should, in his opinion, offer users a choice in how much filtering takes place. An end user should be able to invite a diversity of viewpoints into his search results, which would challenge his personal status quo.
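
To make Pariser’s suggestion concrete, here is a rough sketch of what such a user-controlled dial could look like. Blending a “personal” score with a “diversity” score by a user-chosen weight is purely my own illustration (Google has never described how such a control would work), and the field names and scores are made up.

```python
# Hypothetical sketch of a user-controlled "filter dial".
# 'personal_score' and 'diversity_score' are invented fields; a real
# search engine would derive these signals very differently.

def rank_results(results, filter_strength):
    """Blend personalization and diversity by a user-chosen weight.

    filter_strength = 1.0 -> fully personalized ranking
    filter_strength = 0.0 -> ranking purely for viewpoint diversity
    """
    def score(result):
        return (filter_strength * result["personal_score"]
                + (1 - filter_strength) * result["diversity_score"])
    return sorted(results, key=score, reverse=True)

results = [
    {"title": "Article that matches my past clicks", "personal_score": 0.9, "diversity_score": 0.2},
    {"title": "Article from an outlet I never read", "personal_score": 0.3, "diversity_score": 0.9},
    {"title": "Neutral explainer",                   "personal_score": 0.6, "diversity_score": 0.6},
]

# A user who wants to burst the bubble simply dials the filter down.
for r in rank_results(results, filter_strength=0.2):
    print(r["title"])
```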

Bobble

Another solution comes in the form of a plugin for the Chrome browser (which, by the way, is a Google product). It strips out the personalization so you can see what others would see if they performed the same search in Google.

It won’t help you with Facebook, Netflix, Amazon or the increasing number of other websites that use personalized search, but it sure is a good start.
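
Under the hood, what a plugin like this surfaces is simply a comparison between two result lists: the one tailored to you and the one a “clean” browser would get. Here is a minimal sketch of that comparison. The hard-coded lists stand in for real Google results, which the actual extension fetches itself; I am only illustrating the diff.

```python
# Illustrative comparison only: both lists below are made up. A plugin
# like Bobble would obtain them from Google, one request carrying your
# profile and cookies and one stripped of any personalization.

personalized = ["Result A", "Result B", "Result C", "Result D"]
baseline     = ["Result A", "Result E", "Result C", "Result F"]

hidden_from_you   = [r for r in baseline if r not in personalized]
shown_only_to_you = [r for r in personalized if r not in baseline]

print("Results personalization hid from you:  ", hidden_from_you)
print("Results personalization added for you: ", shown_only_to_you)
```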

Yahoo’s Recommendation Engine

Researchers at Yahoo Labs are working on an algorithm which would return results designed to give you opposing viewpoints. However, this is only in the research phase…there’s no toggle button for your browser yet!
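
Since nothing from this research is public yet, the following is only a rough sketch of the general idea, under my own assumption that every result carries a stance score between -1 and 1. It simply interleaves results that agree with the reader’s estimated leaning and results that push back against it; Yahoo Labs’ actual algorithm is not documented here.

```python
# Sketch of an "opposing viewpoints" re-ranker. The stance scores and
# the user's estimated leaning are invented for illustration.

def diversify(results, user_lean):
    """Interleave results that agree with the user and results that don't.

    user_lean: the reader's estimated leaning, between -1.0 and 1.0.
    Each result is a (title, stance) pair with stance in the same range.
    """
    agrees  = [r for r in results if r[1] * user_lean >= 0]
    opposes = [r for r in results if r[1] * user_lean < 0]
    mixed = []
    for pair in zip(agrees, opposes):
        mixed.extend(pair)
    # Append whatever is left over from the longer of the two lists.
    longer = agrees if len(agrees) > len(opposes) else opposes
    mixed.extend(longer[min(len(agrees), len(opposes)):])
    return mixed

results = [
    ("Op-ed you would click anyway",   0.8),
    ("Report from the other side",    -0.7),
    ("Centrist analysis",              0.1),
    ("Strongly opposing column",      -0.9),
]

for title, stance in diversify(results, user_lean=0.6):
    print(f"{stance:+.1f}  {title}")
```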

DuckDuckGo

DuckDuckGo is a search engine which claims not to use filtering because it respects end-user privacy. It’s a viable alternative to the Google search engine.

However, that only solves part of your problem since more than just search engines are filtered.

Google’s “Private Results”…NOT!

Google claims to let users choose whether personalized search is applied when they use the Google search engine. There is a toggle button (choose the head or choose the globe) that turns off personalized search.

However, there is a new definition of “personalised search” at work here, which has very little to do with offering users a wider perspective through search results. Google is defining “personalised search” as results which give content from Google+ pages preference in your results. It also searches your Google calendar and contacts and serves them up in your search results.

By toggling the button you are simply reverting from this “super personalised” version of the algorithm. Now you get the personalized search we have been talking about all along. Thanks for nothing, Google!

Not Yet Popped

While it is clear we have not yet popped the filter bubble, it is good to know there is growing awareness and, five years after Eli Pariser’s ground-breaking revelation hit the newsstands, some of us are working towards a solution.

More on my blog http://outofoffice.today
