Abhishek Anand

@abyshake

Everything wrong with Facebook’s countermove on Revenge Porn

Facebook is piloting a service to help people prevent revenge porn by ‘sending their nudes’ — voluntarily!

Zuckerberg may not have thought it through (img — Stephen Lam / Reuters)

Revenge porn! Two words that are as horrifying as they are disgusting. To be honest, I first encountered the term via an episode of the HBO series The Newsroom. One of the characters, Sloan, was going through a lot of trauma on all fronts (including disciplinary trouble at her workplace) because an ex-boyfriend had chosen to share intimate pictures of her on a website. Sloan ended up emerging strong from that ordeal (not to mention punching the ex in the face, which was just classic). But imagine that happening to you. Being in a position where you are constantly wondering whether the person sitting next to you at the office or in the classroom has seen those pictures, those videos. Where you are always waiting for the other shoe to drop. Waiting for someone to come ask you about it. Thinking of ways you could possibly get them taken down.

Now Facebook has come to your aid. But I do have my concerns. I will lay them out in detail, but first let us look at the situation a little more closely.

A REAL PROBLEM

No doubt it is a real problem, and one that should be and needs to be addressed. For a moment I asked myself, “Why Australia?”, and with less than two minutes of Googling I landed on this:

It’s believed as many as one in five people have experienced what’s called image-based abuse, but the real scale of the problem is impossible to gauge.

Source: ABC News


Whether or not you have encountered someone who has been a victim of this sometimes-faceless crime, you cannot deny that it is happening around us, and it leaves the affected women (and sometimes men) feeling degraded, traumatised and humiliated. Severely!

FACEBOOK STEPPING UP, ALONG WITH GOVERNMENTS

Facebook is reportedly working with four different governments to counter this challenge. While the other three countries remain unknown, Australia is where the program is being piloted, in collaboration with the government agency behind the eSafety Commissioner.

THE PROBLEM?

The strategy entails uploading your nude photos or videos to Messenger so that Facebook can tag them as non-consensual explicit media.

You want people to send you their nudes so that their nudes don’t end up on the internet? Sure! That makes perfect sense, doesn’t it?

How would you address the fear people will have that they are ‘proactively’ giving up their images for possible abuse?

The second you ask someone for sensitive data, they will worry about misuse. I know people who opt out of autosaving their credit card info on ecommerce websites. Imagine how they would feel about supplying their nude pictures and videos, just to prevent someone else from spreading them.

To be fair, both Facebook and eSafety claim that the pictures you share would not be stored on Facebook’s servers.

“They’re not storing the image, they’re storing the link and using artificial intelligence and other photo-matching technologies,” e-Safety Commissioner Julie Inman Grant told ABC. “So if somebody tried to upload that same image, which would have the same digital footprint or hash value, it will be prevented from being uploaded.”
Facebook’s hashing system would then be able to recognize those images in the future without needing to store them on its servers.
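The hash-and-match idea the Commissioner describes can be illustrated with a toy sketch. This is not Facebook’s actual algorithm (PhotoDNA-style systems are far more robust); the function names and the 8x8 pixel grid here are purely illustrative. The point is the mechanism: derive a compact fingerprint from image content, then compare fingerprints instead of pixels, so a near-identical re-upload still matches.

```python
# Toy "average hash" (aHash): a simple member of the perceptual-hashing
# family behind photo-matching systems. Illustrative only.

def average_hash(pixels):
    """Compute a 64-bit fingerprint from an 8x8 grid of grayscale values."""
    avg = sum(pixels) / len(pixels)
    # Each bit records whether a pixel is brighter than the average.
    return sum(1 << i for i, p in enumerate(pixels) if p > avg)

def hamming_distance(h1, h2):
    """Number of differing bits; a small distance means a near-duplicate."""
    return bin(h1 ^ h2).count("1")

# Toy "images": 64 grayscale values (0-255). In practice these would come
# from downscaling the real image to 8x8 and converting to grayscale.
original = [10 * i % 256 for i in range(64)]
near_copy = list(original)
near_copy[0] += 5                      # slight re-encoding artifact
unrelated = [255 - p for p in original]  # inverted, i.e. a different image

h_orig = average_hash(original)
print(hamming_distance(h_orig, average_hash(near_copy)))  # prints 0
print(hamming_distance(h_orig, average_hash(unrelated)))  # prints 64
```

Because only the fingerprint needs to be retained for matching, the claim that the image itself is not stored is at least technically plausible; the controversy is about what happens before the hash is computed.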

Great, but I have two points to make here.

  1. It would do little to counter the psychological inhibition many would face while deciding whether or not to participate in the program.
  2. The final word on this (whether the images will be stored, and whether they will be seen by any third party) is still sketchy at best.
(img) Promises to delete. Sure, that sounds convincing.
(img) An employee will have to review your ‘uncensored’ photo. Am I the only one who isn’t quite comfortable with how that sounds?

The official release from Facebook itself talks about ‘a specially trained representative’ who would be reviewing the image.

On a side note, remember the Twitter employee who, on their last day at the firm, took down Donald Trump’s Twitter account for a good 11 minutes? Surely there is no way something of that nature is possible here, is there?

MY PROBLEM WITH THE WHOLE THING?

It fucking stinks!

From what I have read about the story so far, it appears that each intimate photo you choose to share would be given a unique fingerprint marking it as a ‘Non-Consensual Intimate Image’. But that would do nothing for images you have not shared with Facebook. I am not sure if that is the case, but if it is, boy oh boy. Imagine sharing an image with your boyfriend and, while he is being all lovey-dovey in his chat, also sending that same image to Facebook to mark it as protected. Every. Single. Fucking. Time.

And why do they even need you to specifically send a NUDE?

When it comes to the ambassadors, champions and proponents of machine learning and AI, Facebook and Zuckerberg are as big as it gets. Are they really saying they cannot combine a learned model of what counts as an intimate image with recognition of your face to take this problem head-on?

Before you answer that, understand this: for both those parameters, the only problem Facebook faces is one of plenty, not scarcity, of data. It may also be worth noting that we now have AI systems that can direct a robotic arm to place an apple in a blue bowl every single time, regardless of the bowl’s position.

Is it really that far a stretch to make this happen without having access to a user’s private photographs? Really?

Here is a crazy approach:

  1. A simple privacy page that lets every user control the sharing of ‘their’ photos by ANYONE OTHER THAN THEMSELVES. This could have varying degrees of ‘exposure’ (for lack of a better word). Say I am not comfortable with people sharing photos of me in a bathing suit; I can enable such a setting (not ‘bathing suit’ per se, but some level of exposure).
  2. Users can still upload any image they choose. However, if I have enabled a filter that requires authentication from my end whenever one of my sharing protections is breached, I would need to re-enter my password. That prevents misuse if my phone is lost or stolen.
  3. Facebook already knows what I look like.
  4. The internet is full of images of varying degrees of exposure. Facebook trains a machine learning algorithm on all those images. Hello, Pornhub! Anything that shouldn’t be shared per the user’s settings won’t be shared.
  5. Voila!
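The steps above boil down to a policy check at upload time. Here is a minimal sketch of that control flow; every name is hypothetical, and the face matcher and exposure classifier are stubs standing in for models Facebook would already have. The idea: block a share when the photo depicts a person whose privacy setting forbids that level of exposure and the uploader is not that person.

```python
# Hypothetical upload-time policy check for the proposal above.
# Exposure levels, from least to most revealing.
EXPOSURE_LEVELS = ["none", "swimwear", "underwear", "nude"]

# Hypothetical per-user setting: the maximum exposure level at which
# OTHER people may share photos of this user.
user_share_limit = {"alice": "swimwear", "bob": "nude"}

def faces_in(photo):
    """Stand-in for face recognition; returns user ids found in the photo."""
    return photo["faces"]

def exposure_of(photo):
    """Stand-in for an exposure classifier trained on labelled images."""
    return photo["exposure"]

def may_share(uploader, photo):
    """Allow the upload only if it violates nobody's sharing limit."""
    level = EXPOSURE_LEVELS.index(exposure_of(photo))
    for person in faces_in(photo):
        if person == uploader:
            continue  # you can always share photos of yourself
        limit = EXPOSURE_LEVELS.index(user_share_limit.get(person, "nude"))
        if level > limit:
            return False  # exceeds that person's sharing limit
    return True

photo = {"faces": ["alice"], "exposure": "underwear"}
print(may_share("bob", photo))    # prints False: exceeds alice's limit
print(may_share("alice", photo))  # prints True: sharing her own photo
```

Note what this sketch does not need: the user never uploads a nude in advance. The hard parts are the two stubbed models, and as argued above, Facebook is short of neither face data nor labelled imagery to train them.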

I am no data scientist myself, so I put my question to every single data scientist out there:

Is that approach really impossible for Facebook to follow and execute?

This is not a problem that requires employee intervention, even by specially trained community managers. This is a machine learning problem. For god’s sake, treat it as such.

That’s it for today; see you tomorrow!

I am Abhishek. I am here... there.... Everywhere...
Medium | Twitter | Facebook | Quora | LinkedIn | E-mail
Click here to join the mailing list.

Disclaimer — Some of the quotes in this story have been taken from various news stories on TechCrunch, The Verge etc.; the links to those articles are in the story.
