I haven’t sat and wondered about anything in years. Every question that pops into my head at any time, from merely interesting questions (“What’s the world’s largest breed of dog?”) to potentially life-altering ones (“Is it bad if I have tingling in my left arm?”), can be typed into the supercomputer in my pocket and answered instantly and freely. Similarly, I don’t have to wonder what my friends are up to — instead, I just log into Facebook and can know all about their lives, once again instantly and freely. While ancient people spent fortunes to build tiny libraries of books, every single person with a smartphone has access to more information than all of pre-21st century humanity combined.
This is incredible — we can listen to any song ever recorded, learn about any topic, and share our views and express ourselves digitally without any friction. It’s literally addictive to have access to this much information, and while we have a long way to go in learning how to use this technology in healthy ways, our phones and the Internet give us superpowers unimaginable to previous generations.
But of the two words that I used to describe our information access in the digital age, only one is accurate. Google and Facebook allow us to access information instantly, but not freely. While we pay nothing up front for these services, as the infamous quote goes:
“If you’re not paying for the product, you are the product.”
And being the product has serious costs. Scandals like Cambridge Analytica at Facebook and the secret microphone in the Nest Secure at Google have made it increasingly clear that big tech companies want to know everything about us. Engaged in what Harvard Business School professor Shoshana Zuboff dubbed “surveillance capitalism”, the profits of companies like Facebook and Google are a direct function of how much they know about us and our behavior. To constantly have access to information means to constantly be watched, recorded and assessed, and no matter how free it feels, feeding our information addiction has significant psychological costs.
You’ve probably had the experience of being in a public place and knowing, seemingly through a sixth sense, that someone is watching you. You might not know who it is or where they are, but you can feel their eyes on you. It’s a strange feeling, making a normal experience anxiety-provoking and stressful — even thinking about the sensation while writing this in a coffee shop makes me feel a bit uneasy. This feeling comes from a system in our brain that constantly tracks where others in our environment are looking, letting us register, without any conscious effort, when someone’s gaze is directed at us.
We have a similar sixth sense in the digital world. Consciously or unconsciously, we all know that we’re being watched when we browse the Internet, and it provokes anxiety in the same way as being physically watched. In one study, researchers found that students who were surveilled while learning online experienced higher mental workload and stress than students who were not. The surveilled students also scored lower on a test of the skills they had learned. Being watched increases our stress levels and decreases our ability to learn and retain information.
The problems worsen the longer we’re watched, especially if we’re being evaluated in any way. In a University of Wisconsin study, researchers found that workers whose performance was electronically monitored for long periods of time “reported higher levels of job boredom, psychological tension, anxiety, depression, anger, health complaints and fatigue” than workers who were not electronically monitored.
Just knowing that we’re being watched is enough to make us feel uncomfortable. Couple that with the fact that prospective employers increasingly use Internet activity to judge candidates for jobs, and we end up with serious mental health issues like anxiety, depression and anger from the gaze of Big Tech’s algorithms.
In 1787, British philosopher Jeremy Bentham concocted an idea to use the effects of surveillance for the good of society. He created the Panopticon, an architectural design for a new kind of prison. The Panopticon was a circular building; cells were in rows around the circumference of the circle and stacked up to the ceiling. In the center of the circle was what Bentham called the “inspector’s lodge”, from which guards could see into every cell in the prison.
A real-life Panopticon.
What was revolutionary about Bentham’s idea was that the inspector’s lodge was constructed in such a way that guards could see out, but prisoners could not see in. Because of this, prisoners had no idea if they were being monitored, and therefore had to behave as if they were at all times. Bentham argued that eventually prisoners would completely regulate their own behavior — they would essentially guard themselves.
In 21st century society, big tech companies operate as the prison guards in an Internet Panopticon. We know that everything we search, like and post online can be observed, but we don’t know when we’re being watched or what exactly is being recorded. We’re the prisoners in the cells, forced by opaque algorithms to act as if we’re being watched all the time.
While we don’t know exactly what about our web browsing is watched and recorded, the possibility that any of our actions could be surveilled at any time has a massive psychological effect: conformity. When we are being watched, we subconsciously begin to bend our behavior toward whatever standards we assume the watcher holds for us. In a University of Sydney study, subjects who were presented with a variety of scenarios describing immoral behavior rated that behavior more harshly while being observed, projecting a favorable image of moral uprightness to their observers. But the big catch? They didn’t even realize that their behavior had changed at all.
We experience this digitally all the time. When was the last time you were about to Google something but felt uneasy and decided against it? If we were worried about someone else seeing an unsavory search in our history, we could always just delete it. But subconsciously, we stop short because we know that Google is watching. Something in us doesn’t want our watcher to disapprove — we know there are expectations of acceptable behavior and we unconsciously desire to meet them.
This is scary — far scarier than just the stress that comes from being watched. By watching our every digital move, tech companies are subconsciously nudging our behavior in line with their values and expectations. And what are those expectations? That you’d scroll for just one more minute. That you’d click that ad for a new pair of shoes. That you’d support politicians and regulations favorable to the tech sector. While we don’t know how strong the conformity effects are, the panoptic nature of the Internet uses our psychology to give another measure of control over us to already powerful tech giants.
We’ve become so reliant upon the Internet today that it’s basically impossible to participate in society without using online products that make us susceptible to being constantly watched. But even if we could break up with Big Tech, I don’t think we’d want to. I can hardly imagine life without the effortless access to information that I enjoy every day, and I personally have no ambitions of giving that up anytime soon. Call me a hypocrite, but I’m not a tech hater at all — I think the Internet (and by extension tech companies) has done and continues to do immense good for the world.
But at the moment, we all bear a significant psychological burden as a result of constantly being digitally surveilled. And while we can’t opt out of being watched in the short term, there are a few steps we can take to move towards a better Internet:
With most tech products, you sign away the rights to your data in the Terms and Conditions that you inevitably agree to without reading (we all do it). This opens up the potential for everything about your use of that product to be constantly observed by the company or, worse, by outside actors; in one recent scandal, for example, Facebook gave Netflix and Spotify access to users’ private messages.
Whenever we can, we should opt for products that make it impossible for our actions to be seen by tech companies. Technologies exist to ensure companies have no ability to see private user details. One great example is Signal, a messaging app that’s encrypted end-to-end, ensuring that no one can read your messages except the recipient. Opting to use products like Signal when possible will encourage other tech companies to build these technologies into their products as well.
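To make that guarantee a little more concrete, here is a minimal, illustrative sketch of end-to-end encryption in Python using the PyNaCl library. This is my choice for the example, not a claim about Signal’s internals, which layer key ratcheting and forward secrecy on top of the same core idea; the names and the message are invented. The point is simply that any server relaying the ciphertext never holds a key that can open it.

```python
# A minimal sketch of end-to-end encryption, the idea behind apps like Signal:
# only the intended recipient's private key can decrypt a message, so any
# server relaying the ciphertext learns nothing about its contents.
# (Illustrative only -- Signal's actual protocol is far more sophisticated.)
from nacl.public import PrivateKey, Box

# Each party generates a key pair; private keys never leave their devices.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
alice_box = Box(alice_private, bob_private.public_key)
ciphertext = alice_box.encrypt(b"Meet at the coffee shop at noon.")

# The relay server only ever sees this opaque ciphertext.
# Bob decrypts with his private key and Alice's public key.
bob_box = Box(bob_private, alice_private.public_key)
print(bob_box.decrypt(ciphertext))  # b"Meet at the coffee shop at noon."
```

Choosing products built this way shifts the burden of trust from a company’s promises to the math itself.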
The problem of Internet surveillance is pervasive because the business model of companies practicing “surveillance capitalism” directly links knowing more about the user to higher profits. When your business is to sell targeted ads, every ounce of information is valuable, leading companies to stop at nothing to know everything about us.
But there are alternative business models that we can choose to embrace. For example, Brave Browser and Basic Attention Token are working to create a new model for online advertising that doesn’t rely upon aggressive digital surveillance. They plan to keep browsing data private on the user’s device and use the information anonymously to accurately compensate content publishers, target advertisements and reward users for their attention, all while blocking online monitoring software. We should encourage new ideas like this and start companies that experiment even more with new business models instead of taking old “surveillance capitalism” models for granted. Without new business models, the incentives of old systems are far too powerful to overcome.
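To give a feel for what keeping browsing data on the device can look like in practice, here is a hypothetical sketch of on-device ad matching. It is not Brave’s or BAT’s actual implementation; the URLs, ad catalog, and scoring are invented purely to illustrate the idea that matching can happen locally, so only an anonymous “this ad was shown” event ever needs to leave the device.

```python
# Hypothetical illustration of on-device ad matching, in the spirit of the
# model described above (not an actual implementation): browsing history
# never leaves the device; only an anonymous tally of ad views would be
# reported for payouts.
from collections import Counter

# Browsing history stays local to this device.
local_history = [
    "https://example.com/reviews/running-shoes",
    "https://example.com/articles/marathon-training",
    "https://example.com/news/politics",
]

# The ad catalog is delivered identically to every device,
# so downloading it reveals nothing about the user.
ad_catalog = {
    "shoe_ad": {"keywords": {"shoes", "running", "marathon"}},
    "car_ad": {"keywords": {"car", "lease", "suv"}},
}

def pick_ad(history, catalog):
    """Score each ad against tokens found in local history and pick the best."""
    tokens = Counter()
    for url in history:
        tokens.update(url.lower().replace("-", "/").split("/"))
    scores = {
        name: sum(tokens[k] for k in ad["keywords"])
        for name, ad in catalog.items()
    }
    return max(scores, key=scores.get)

# Matching happens entirely on-device; only the anonymous result
# (e.g. "shoe_ad was shown once") would ever be reported upstream.
print(pick_ad(local_history, ad_catalog))  # -> "shoe_ad"
```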
I’ve kept a pen-and-paper journal for over six years now. Everything from irreverent stories to deep confessions is contained in those journals, and I’m incredibly protective of them. I’ve considered taking my journaling digital with something like Day One many times — the practical benefits of quick typing, searchability, and insurance against loss are huge. But despite those benefits, it’s incredibly comforting to know that those words are strictly contained in the pages of the journal I leave on my desk every day. It would take someone breaking into my house for those words to be read by anyone but me, and that’s a massive relief.
Even with information that we’re less protective of, leaving it in the analog world (or offline in general) can be incredibly valuable. When you tell a friend an embarrassing story, the only data breach you have to worry about is them gossiping about you. When you leave photos on your hard drive instead of uploading them to the cloud, you don’t have to worry about what anyone thinks of how you look in them. The analog world is a refuge from the digital one in which anything you do could be seen by tech companies and — based on their recent privacy scandals — anyone else. Take advantage of the opportunity to live unobserved in analog, free of the psychological price imposed by our current digital status quo.