Data ownership hawk @iotex.io. Stanford Grad. I ran a crypto hedge fund during peak ICO mania.
During the 2020 Super Bowl, Amazon shelled out at least $15 million for a 90-second ad promoting its ubiquitous Alexa smart assistant. The ad was lighthearted and featured the ever-voguish and uncontroversial Ellen DeGeneres alongside her spouse.
The ad was aired on Ellen’s TV show in the week prior to the big event.
However, behind the smiles of Ellen DeGeneres and the creepy, ever-present Amazon logo are vast networks of data collection, distribution, and processing architectures powering the ostensibly benign “Alexa, tell me a joke” interactions undergirding the popular appeal of the device. Alexa-powered devices (which dominate the smart speaker market with 70% share; Google Home lags far behind at 25%) collect information on where their users live, who lives in their household, what their interests are, and what they are buying.
What is Amazon doing with all this user data? In late January 2020, a report by the EFF showed that Amazon’s massively successful doorbell camera Ring sends sensitive and personally identifying user data directly to advertising partners via its Android mobile app. Given the advertising trackers the EFF found embedded in the Ring app, it is hardly a stretch to suspect that Echo data flows to the likes of Google and Facebook just as freely.
In fact, we have clear evidence that Amazon is up to more than just delivering setups and punchlines with Alexa. One of Amazon’s many patents includes keyword analysis of data captured from a listening device “for purposes such as targeted advertising and product recommendations”.
A screenshot from an Amazon patent application, granted in 2015.
Let’s take a step back.
In today’s connected world, technology users have grown accustomed to making the implicit trade-off of getting a free service in exchange for personal data when searching Google or signing up for Facebook. This is an implicit exchange oft summarized by the cliché: “if a service is free, you are the product.”
It’s common knowledge that these companies (with Facebook and Google constituting the primary offenders) package your data into actionable insights and sell them to advertisers, who target ads tailored to maximize the chances of capturing your precious click and, ultimately, your discretionary income. Sooner or later your credit card comes out.
In other words, the service is not even a little bit free. According to its January 2020 SEC filing, of the $70.7B in revenue Facebook generated in 2019, $69.6B came from ad sales (i.e., nearly 99%). All told, Facebook users “paid” an average of $25 each in value to Facebook in terms of their value to advertisers, nearly twice what they “paid” per user in 2016 (then ~$14), despite the service hardly improving at all.
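These figures can be sanity-checked with a little arithmetic. The sketch below uses only the dollar amounts quoted in this article; the implied user count is a derived estimate, not a number reported here:

```python
# Back-of-envelope check on the article's Facebook figures.
# Dollar amounts are from the text; the user count is implied, not reported.

ad_revenue_2019 = 69.6e9     # 2019 ad revenue (per the article)
total_revenue_2019 = 70.7e9  # 2019 total revenue (per the article)
paid_per_user_2019 = 25.0    # average value "paid" per user in 2019
paid_per_user_2016 = 14.0    # approximate value "paid" per user in 2016

# Share of total revenue that came from ads
ad_share = ad_revenue_2019 / total_revenue_2019
print(f"Ad share of revenue: {ad_share:.1%}")               # ~98.4%

# User base implied by dividing ad revenue by the $25-per-user figure
implied_users = ad_revenue_2019 / paid_per_user_2019
print(f"Implied users: {implied_users / 1e9:.2f} billion")  # ~2.78 billion

# Growth in per-user value since 2016
growth = paid_per_user_2019 / paid_per_user_2016
print(f"Per-user value vs. 2016: {growth:.2f}x")            # ~1.79x
```

The implied user base of roughly 2.8 billion is consistent with the "nearly twice" claim only because the per-user figure grew while the service stayed largely the same, which is the article's point: the price of "free" keeps going up.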
(Mark thinking about mass data harvesting and user manipulation)
“It’s going to take time but over the next decade I want us to build a reputation on privacy that’s as strong as our reputation already building good, stable services,” Zuckerberg said during the annual financial conference call on January 29, 2020.
What’s interesting about Zuckerberg’s statement during a financial conference call is that he cannot possibly be defining privacy in the same way users of his platform define privacy.
Privacy to Zuckerberg means conducting analysis of user data on Facebook’s servers before delivering actionable insights to advertisers. In other words, “privacy” means reducing the raw data sent to advertisers in favor of delivering actionable behavioral insights. Facebook will sell the “why” of data instead of the raw and uninteresting “what”. This is not privacy; it is merely packaged and expressly delivered personal invasion.
In 2014, Facebook ran an undisclosed study on 689,003 unwitting users titled: “Experimental evidence of massive-scale emotional contagion through social networks”. During the study period, Facebook manipulated the newsfeeds of study “participants” (who had no idea what was going on), showing a random group of them a disproportionate number of negative posts.
Facebook was able to demonstrate the existence of what it calls “emotional contagion,” where users shown a greater number of negative posts begin to post more negatively themselves. In other words, Facebook can unilaterally and arbitrarily manipulate the moods of its users.
Despite all this, you still might be asking, “So what? Facebook offers a valuable service to me, I’m happy they’re making money, and I trust them not to manipulate me.” Fair enough. But what if you shelled out $99.99 for a Ring doorbell camera, like 400,000 people did in December of last year alone, or $74.99 for an Echo?
Amazon charges a high enough retail price to pocket a healthy profit, so we’re done, right? It’s the kind of healthy, win-win customer-business exchange that has turned the global economic crank for centuries.
The problem is, Amazon (along with Google and many others) decided that this is not nearly enough margin for its shareholders. In fact, the sale of each device is just the very beginning of the value it generates: each one installs an always-on data vacuum that begins sucking the moment customers plug in their Alexas (or Rings, or Google Homes).
The late-night fantasy of advertisers everywhere is to eliminate uncertainty about a user’s reaction to their ad.
Advertisers would like to serve only ads that are absolutely certain to induce a buying decision (either consciously in response to the ad or less consciously at some later date); every other ad they pay for is a waste of money and time.
However, there is no way to conduct truly comprehensive behavioral predictions without treading into the real world and that’s where internet connected smart home devices (i.e. “Internet of Things” devices) come in.
With smart home devices like the Amazon Echo or Ring camera, the free-service-for-privacy exchange gets perverted. Consumers are actually paying upfront, and above cost, for devices that are in fact the longest data extraction tentacles yet to grow out of the big tech krakens.
Facebook, Google, and Amazon are investing heavily in IoT not because they want to create great products to sell to millions of happy people and leave it at that. It’s so they can expand their massive surveillance systems to the real world in order to increase their revenue per user.
Serving highly targeted ads based on what you click in a scrolling news feed is one thing, but delivering an ad via your smart speaker based on how well you slept the night before and how long you brushed your teeth is another thing entirely.
For consumers, it’s understandable if you feel like throwing up your hands when reviewing different products’ privacy policies. Even for the highly technical, the burden of reviewing technical specifications and assessing the information sharing of devices is overwhelming. To make matters worse, even if a device doesn’t start out sharing data, all it takes is for a manufacturer to push a software update and the data extraction crank can start to turn.
What to do if you’re a privacy conscious individual, or simply someone who doesn’t like to pay to be manipulated, exasperated by this quagmire? One recommendation is to look behind the shiny packaging and instead focus on the companies behind each product.
Are the companies manufacturing these devices building businesses based on data extraction and ad serving in a larger sense? If the answer is yes, then even if the device itself isn’t a conduit for ad delivery yet, that does not mean it is not an undercover reconnaissance agent serving a larger surveillance power.
For example, instead of analyzing each Google product on its individual data collection merits (these are subject to change at any time anyway), understand that every Google-manufactured device is built to serve a much larger data collection engine, and as such has been architected from the very beginning as a data extraction tool aimed at its users rather than a device built for its users.
With IoT devices made by these manufacturers, you are played for a fool twice. First, you pay hard cash for a product; second, you are forced to share your personal data with companies who then turn around and re-target you with ads, effectively making you pay twice. In this case, the service is not free and you remain the product.
Fortunately, a future of ubiquitous and rapacious data extraction is not fatalistically predetermined. There is still potential for us to achieve Steve Jobs’s vision of computers (and by extension all internet-connected devices) serving as “bicycles for the mind”.
Which is to say, to enhance and magnify human creativity and brainpower rather than to map it for the purposes of highly targeted manipulation.
The only way to achieve this as a consumer is to take a hard look at the business models you choose to be complicit in when you purchase a product from a serial data seller.
I envision a future where the anti-consumer decisions Google made in 2001, when it first established the mechanism of data extraction now referred to as “surveillance capitalism”, do not move past the screen and into the consumer’s home. You have the option to put your foot down and take a stand for privacy and for business models that do not take more than what is fair.
At IoTeX, we do not even have the infrastructure to extract your data. Our business model is not, and cannot be, based on data extraction. Building full data ownership directly into the controls of our devices follows Privacy-by-Design and Ethical-Design principles that treat user agency not only as a value, but as a product blueprint.
Organizations must build user controls from the ground up. Data collection must occur on a default opt-out basis: off unless the user deliberately turns it on. As a consumer, you have a choice. Don’t make the choice that makes you pay twice.