Homo Sapiens Versus The Internet

by R. David Dixon Jr.October 3rd, 2017
Humanity has started a long-term relationship with the Internet. We are now vastly more capable as a species. But recent research in psychology suggests that in some ways we weren’t built for this, and will need to learn some new tricks to survive.

The Internet has promised a lot. It has the ability to give voice to the voiceless and provide a conduit for free expression. It’s been billed as the great hero of free speech, and it is precisely for this reason that the Internet continues to be so heavily censored in countries with authoritarian regimes. From the dictator’s point of view, unfettered access to the Internet can seem tremendously threatening, and in all likelihood, is. Unfettered communication may spread viral contagions such as the concepts of freedom of speech, representative government, freedom of religion, human rights, and democracy. Honestly, how could any respectable dictator NOT direct what ideas might “infect” their population?

2017, however, has been something of a banner year for concerns regarding how online mouthpieces can influence the world. It first hit the fan when Gizmodo published a series of articles exposing the processes behind how Facebook’s “trending news” actually “trends.” The series contained many alarming discoveries, including the revelation that much of the news that is actually trending is “blacklisted” on a daily basis for ideological reasons. The articles also extensively quoted former employees who worked on Facebook’s trending news curation staff. These workers described how Facebook would overtly insert ideologically motivated news (that was not actually trending) into the trending news feed, as well as how they “routinely suppressed conservative news.”

One need not be an admirer of conservative media to sense that these activities could be problematic. The US Senate Commerce Committee quickly launched an inquiry into Facebook’s news curation, but for most people this news was hardly a blip on their radar. In fact, some who agree with the political leanings of the Facebook curation staff may even applaud Facebook for championing their enlightened viewpoint. It is unlikely, however, that those same individuals are still applauding, as it has emerged in the last few days that Facebook and Twitter received significant amounts of Kremlin-backed money to promote incendiary content and “fake news” in an effort to destabilize the United States during the election. These ads, seen by over 10,000,000 people, included images of a black woman firing a gun, to promote racial tensions, and Hillary Clinton behind bars (remember “Lock Her Up!”).

Over the course of just a few months, Facebook alone has managed to raise concerns from all corners of the political spectrum, but Facebook is not alone. Google, Twitter, and others are all embroiled in new suspicions, and even legal actions, as individuals, organizations, and lawmakers begin to sense various threats from these online titans. But how much does all of this actually matter? Aren’t we all still able to make up our own minds? Is “brainwashing” really that easy? The answers are both yes and no. As a species we just barely swiped right on the Internet’s picture, but we’ve somehow already committed to a life-long relationship. This relationship has a lot of promise, and we may even end up as real soulmates, but it is important that we realize up front that in some ways we weren’t built for this. Recent research in the fields of sociology and psychology has shed important light on the ways in which the Internet plays on our most primal social cues. We need to be on guard to make sure that we can hold our own in this new relationship with massive algorithmic interconnectivity.

Evolution and President Trump

At the heart of digesting online information is the evolutionary propensity to mine all input for social cues and signals. In the case of the “trending news,” Facebook was presenting a manufactured image of the social world, one that more closely resembles the world the curators would approve of than the world we actually live in. This manufactured world is then presented as social reality. No matter what side you come down on, this is no trivial thing.

We humans are built to use attention as a cue for status, and to treat status as a signal of mastery, constantly on the lookout for experts to apprentice under. Joseph Henrich and Francisco J. Gil-White, while at the Universities of Michigan and Pennsylvania respectively, describe a theory of “information goods.” They show that humans, unlike chimpanzees, use “relative prestige” to assign status, and then use that status as a signal of whom to believe and emulate.

The challenge is that this evolutionary mechanism was developed long before mass media. It developed through direct observation. The hunter getting the most attention is likely the fellow who most often comes back with the biggest kills. Thus, trusting the attention of others to point you in the right direction was adaptively useful. The Internet is particularly adept at scrambling these signals. We are built to believe that seeing others give their attention to somebody, or something, signals that that someone can teach you something useful. Therefore we implicitly connect attention with prestige and prestige with expertise.
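The attention-driven dynamic can be caricatured in a few lines of Python. This is a toy sketch of my own, not Henrich and Gil-White’s formal model: learners pick a teacher to copy in proportion to the attention that teacher receives, and the teachers’ actual skill never enters the calculation.

```python
import random

def prestige_copy(attention, learners=1000, seed=42):
    """Each learner copies one teacher, chosen with probability
    proportional to the *attention* that teacher receives.
    Returns how many learners copied each teacher."""
    rng = random.Random(seed)
    teachers = list(range(len(attention)))
    picks = rng.choices(teachers, weights=attention, k=learners)
    return [picks.count(t) for t in teachers]

# Teacher 0 is the most skilled hunter; teacher 2 merely draws the most eyes.
skill = [0.9, 0.5, 0.3]    # true ability -- never consulted above
attention = [10, 20, 70]   # share of the group's gaze
counts = prestige_copy(attention)
# Most learners end up copying the least skilled but most-watched teacher.
```

When attention tracks skill, as it largely did for our ancestors, the shortcut works; when attention is manufactured, the same shortcut faithfully spreads whatever is being watched.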

President Trump, who has an objectively terrible track record with the truth, provides one example of this. A giant swath of the nation, I assume mostly reasonable and well-adjusted people, describe him as trustworthy, despite what the other half, which I assume are also mostly reasonable and well-adjusted people, feel is obvious and damning evidence to the contrary. His supporters are surrounded, physically or digitally, by others who are paying attention to him. Therefore, some basic, even primordial, part of their brain signals to them that this is someone who deserves attention, emulation, and trust. They believe in him, at least in part, because of his ability to attract attention signals of trustworthiness. When each of President Trump’s tweets, to his 40 million followers, and now to the whole world, receives hundreds of thousands of likes and retweets, we shouldn’t be surprised that he didn’t need to actually do anything to earn that trust. He just needed to attract attention to himself, and then claim that he deserved the trust. The attention supports the claim.

Searching Reality Through Algorithms

All of this also has huge ramifications for our consumption of the news, use of personalization algorithms, and the general tendency to surround ourselves with those like us. Again, this is no trivial thing. The “secret” decisions that go into the algorithms behind what we are shown can be extremely powerful. Robert Epstein, the renowned psychologist and former editor-in-chief of Psychology Today, published a series of experiments he recently conducted with Ronald E. Robertson involving over 4,500 undecided voters. He was able to show that Google’s search algorithm has the power to “shift the voting preferences of undecided voters by 20 percent” and, for some demographic segments, even by as much as 80 percent. All this was accomplished remarkably without the voters being aware of any manipulation. They call this phenomenon the “search engine manipulation effect.” These findings alarmed them and even drove them to found a growing new research group, “The Sunlight Society,” dedicated to studying specifically how new technologies might impact “democracy and human freedom.”

From a societal and sociological standpoint, we often don’t take platforms like Facebook and Google seriously enough. Facebook, for example, has some serious intentions when it comes to the news. Last June, Mark Zuckerberg was asked about the role he sees for Facebook and the news, and he stated that it is his intention for Facebook to become “the primary news experience people have.” Twenty-seven percent of the human race are active Facebook users, and so there’s a problem when we are led to believe we are seeing ‘unbiased’ reflections of the social world around us that are actually a carefully constructed, and approved, worldview masquerading as “what everyone thinks.”

Clicks, likes, and shares signal what we believe to be worth our attention, and like it or not, that influences how we and others see the world. That interaction is short-circuited if we’re not getting what others actually think is important, but instead what an algorithm, or a few young social engineers at Facebook, Google, or Twitter, would prefer we think is important. On that level, it’s not so unlike the censorship we loathe in China, North Korea, or Russia (a point those governments often make). They can, and do, control what is written in the news, but at least knowledgeable people within their countries know that there is censorship, and read the news with a grain of salt. However, with the speed and convenience with which most of Facebook’s 2 billion users interpret and react to their Facebook feeds, it’s likely safe to consider the “trending” section largely unsalted.

Birds of a Feather… Lots and Lots of Birds

The Internet may also challenge democracy in much more subtle ways. Online, as in offline, we tend to gravitate towards those like us. Psychologists call this homophily. The difference between online and offline is that on the Internet we are never forced to sit down and have a face to face discussion with someone with whom we disagree. Rather, we can join thousands, or millions, of our like-minded ‘friends’ in promoting, condemning, ridiculing and crusading against those on the “other side.”

Research on group polarization has shown that deliberation with like-minded individuals tends to move groups and individuals toward a more extreme point than their pre-deliberation judgments. In other words, people who begin with moderate, nuanced, or more inclusive views, after speaking with others of the same opinion, tend to experience a form of radicalization in which they hold tighter and more firmly to their original beliefs, and actually take them to more extreme levels.

This is called the echo-chamber effect, and it’s playing on those same evolutionary inclinations. Going back to Trump supporters, whether we consciously think it or not, there’s a part of our evolved brain that says “137,000 other people agree with me!! I must be right, like SUPER right, right!?” But the “other side” is doing that too (Bernie Sanders had over 150,000 retweets yesterday). So is every other side. We’re more and more sure that we’re more and more right about more and more stuff that we are also certain is more and more apocalyptically important. The Internet is part of the reason why. All of a sudden democracy isn’t so much about arguing ideas in the town square and then choosing the winner on the ballot, as it is about existential warfare in high-stakes games of a really really big us vs. a really really big them.
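The polarization dynamic is easy to caricature in code. The following is a toy sketch of my own, not a model from the cited research: each round of “deliberation,” every member drifts toward the group’s mean view and then overshoots slightly in the direction the group already leans, standing in for the small confidence boost that visible agreement provides.

```python
def deliberate(opinions, rounds=50, pull=0.3, boost=0.05):
    """Toy model of group polarization. Opinions live on a scale
    from -1 to 1. Each round, members move partway toward the group
    mean, plus a small push in the direction the group leans."""
    ops = list(opinions)
    for _ in range(rounds):
        mean = sum(ops) / len(ops)
        lean = 1.0 if mean >= 0 else -1.0
        ops = [max(-1.0, min(1.0, o + pull * (mean - o) + boost * lean))
               for o in ops]
    return ops

# Five mild, like-minded moderates...
group = [0.1, 0.2, 0.25, 0.3, 0.4]
after = deliberate(group)
# ...finish the discussion more extreme than any of them started.
```

No one in the toy group ever hears a dissenting view, so there is nothing to pull the mean back; the group converges, then drifts together toward the extreme.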

The same thing plays out with the news, “fake” or otherwise. Psychological research has shown that repeatedly hearing a lie causes us to believe it more each time, even if we know the lie is completely false. Another psychological experiment found that even when people were directed to simply write one paragraph defending what they knew was a lie, they still increasingly believed it. Even more damning is the research showing that when we are told a “fact,” even if we are later told by the original source that the information is not true, we still continue to include it in our decisions, and at least to some extent, believe it. If there’s one thing that online news aggregators, social media, and Donald Trump are good at, it’s repetition. And it’s incredibly hard to unbelieve something.

So, what is to be done? Findings in psychology suggest that we already tend to think that people agree with us more than they actually do (this is called false consensus bias). We also tend to look for things that agree with what we already think (confirmation bias). We are not, however, condemned to these fates, but without concerted effort, falling into them is almost always our initial inclination. The question is, how much concerted effort do any of us have to spend sifting through the social flows of our online lives? Add to that the evolutionary pull to trust and follow those who are getting the most attention, combined with the almost miraculous ability to surround our digital selves with what others are paying attention to, and there is some cause for concern. Celebrity is trustworthiness, likes are validation, and the stakes are increasingly extreme.

I don’t hate technology. I love technology for the good that it means I can do. I live in the heart of Silicon Valley and technology is the focus of my career. I bought the very first Android phone (the G1!), and now I look forward to getting the newest iPhone every fall (sorry Google!). I don’t want my boss to see my browsing history simply because of how much time I spend reading the news. But there’s a strong case to be made that without some thoughtful work on the part of individuals, companies, and governments, we may be wildly under-prepared as a species for the relationship with technology that we’re already in.
