Big Tech after the Facebook-Cambridge Analytica scandal

Written by asandre | Published 2018/03/21
Tech Story Tags: internet | technology | tech | silicon-valley | cambridge-analytica


The sun rises behind the entrance sign to Facebook headquarters in Menlo Park before the company’s IPO launch, May 18, 2012. (Credits: REUTERS/Beck Diefenbach/File Photo)

Now even tech executives are calling for more regulation.

“Tech is definitely about to get regulated. And probably for the best.”

This was a tweet from Aaron Levie, cofounder, CEO, and self-described Lead Magician at Box. It was his first comment after the news that Cambridge Analytica, the data analytics firm that worked with the Trump campaign and the pro-Brexit campaign, illegally harvested 50 million Facebook profiles of US voters.

A few days later, Levie commented again on Twitter:

“The responsibility of tech companies grows exponentially,” he pointed out, while acknowledging that the argument that they are “merely platforms and pipes” is no longer sustainable.

He elaborated a bit more in an interview with CNBC’s Deirdre Bosa. “In the past 10–20 years we had a tremendous amount of technology innovation that is now starting to impact every single element of our life,” he said. “This is what we have seen from the digital revolution: it’s changing our cars and transportation; it’s changing life sciences and healthcare; it’s changing voting and democracy.”

Levie explained that “for quite some time we treated these technology platforms as utilities and all the users are the ones who are liable on these platforms.”

“But ultimately, as you see growing use cases from machine learning and AI [artificial intelligence], the platforms themselves are going to be making more decisions on our behalf. That means that we have to make sure that their decisions are those that protect consumers, keep us safer, keep our information private.”

He mentioned how regulation right now is almost non-existent. “We don’t have a lot of either self-regulation or government-imposed regulation that really understand what this future world will look like,” he said. “We’re in the very early stages of a shift in terms of how these utilities and tech companies are seen.”

In an interview with WIRED’s Brian Barrett, Sam Lester of the Electronic Privacy Information Center (EPIC) said that asking what Facebook users can do to protect themselves is akin to asking what drivers could have done to protect themselves before seat belts became standard.

Lester noted, however, that new regulation in Europe might be a way of dealing with this. “The good news is, some version of a data privacy seat belt may be in the offing,” he said. “The European Union’s General Data Protection Regulation (GDPR) will require transparency from companies about what kind of data they collect, and how it will be used. And while no such law seems imminent stateside, the Attorney General of Massachusetts announced an investigation into Facebook and Cambridge Analytica that could at least shed more light on what took place. Senator Ron Wyden Monday followed up with a detailed series of questions for Facebook to answer.”

Even before the Facebook and Cambridge Analytica scandal broke late last week, Sir Tim Berners-Lee, the inventor of the world wide web and founder of the Web Foundation, called for large technology firms to be regulated to prevent the web from being “weaponized at scale.”

Berners-Lee’s statement was part of an open letter to mark the 29th anniversary of the world wide web. Prophetic? Or just common sense?

“In recent years, we’ve seen conspiracy theories trend on social media platforms, fake Twitter and Facebook accounts stoke social tensions, external actors interfere in elections, and criminals steal troves of personal data,” he wrote, pointing out that the current response of lawmakers has been to look “to the platforms themselves for answers” — which he argues is neither fair nor likely to be effective.

“Companies are aware of the problems and are making efforts to fix them — with each change they make affecting millions of people,” he continued. “The responsibility — and sometimes burden — of making these decisions falls on companies that have been built to maximise profit more than to maximise social good.”

A legal or regulatory framework that accounts for social objectives may help ease those tensions.

“I want the web to reflect our hopes and fulfill our dreams, rather than magnify our fears and deepen our divisions,” Berners-Lee said.

Mike Isaac of The New York Times, currently on a book leave, commented on Twitter that “one thing I keep hearing is ‘no matter how bad tech cos have screwed up, inviting regulation will make this worse’. Please, explain to me, though, how tech companies should be allowed to continue self-policing.”

“Believe me, I am not one to praise the efficacy of bureaucratic oversight — especially at this particular political moment in time — but self-restraint seems to have largely failed,” he continued.

As of now, neither Mark Zuckerberg nor Sheryl Sandberg has spoken publicly about the scandal or plans for the future. Mike Allen of Axios reports today that Facebook’s CEO “plans to speak out in the next 24 hours on the data-harvesting revelations that have hammered his stock price, inflamed lawmakers in D.C. and Europe, and trapped his social network in a crisis of trust.”

“It’s a big deal, and he knows it,” a source close to Zuckerberg told Axios, which adds: “We’re told that Zuckerberg’s remarks will be aimed at rebuilding trust, and that he wanted to say something meaningful rather than just rushing out.”

And it’s interesting how Marc Benioff of Salesforce, who reacted to the news about Facebook and Cambridge Analytica on Twitter with words like “Wow” (twice) and “Getting stranger,” is now focusing attention on trust.


Written by asandre | Comms + policy. Author of #digitaldiplomacy (2015), Twitter for Diplomats (2013). My views here.
Published by HackerNoon on 2018/03/21