The symptomatic static of the algorithmic match

Written by ted.oneill | Published 2018/11/09
Tech Story Tags: advertising | social-media | google | facebook | algorithmic-push


And the “evil surveillance” that gathers the data

It appears Hackernoon is planning to part ways with Medium. The move makes sense now that Medium is clearly focused on becoming a centralized magazine rather than a home for independent publishers. And, really, Hackernoon has no choice, given that Medium now blocks all third-party ads.

Hackernoon made it clear that it plans to sell ads (“sponsorships”) on its new site, but clarified what that means for them:

We believe sponsorship is not evil, if done right. What’s evil is surveillance. Our sponsors will be either site-wide or by subject matter.

Surveillance.

It’s the perfect word to describe the current status quo for most of the big players in social media.

In an online world dominated by the likes of Google and Facebook, we’ve all become conditioned to the watchful eye of the modern Big Brother. I won’t go down the path of calling those companies evil. They are just trying to make a buck… and the way they do that is to track everything we do so that they can build a profile they can use to sell us things.

We have no say in the dossiers they compile on us. That makes it a privacy issue, but it also creates a distorted picture of who we are: we are served ads and content that are often meaningless to us.

Our true interests cannot be determined simply based on an aggregation of our search history or the links we click on.

How many times have you clicked on an ad or a link, maybe out of curiosity or maybe for a reason that doesn't even reflect your own interests (shopping for a present for a friend, for instance), only to be haunted by ads for related products for days?

The algorithms created by these companies can hit the mark sometimes, but they also serve us ads that have no relation to our true interests. This static is noise to us and delivers almost no value to the party pushing it out.

Of course, the algorithms can use frequency patterns to home in on significant interests, but that model still relies on surveillance to make its assumptions. It never asks us to confirm those assumptions, and it never gives us the right to adjust them either.
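To make that concrete, here is a minimal, purely hypothetical sketch of frequency-based interest inference. The function, topics, and threshold are all invented for illustration and don't describe any real company's system:

```python
# Purely illustrative sketch of the kind of frequency-based inference
# described above. All names and thresholds are hypothetical.
from collections import Counter

def infer_interests(click_log, threshold=3):
    """Guess 'interests' from tracked click topics, without ever asking the user.

    click_log: list of topic labels, one per tracked click.
    threshold: how many clicks on a topic before it is assumed to be an interest.
    """
    counts = Counter(click_log)
    # Anything clicked often enough gets promoted to an "interest",
    # even if the clicks were for a gift, or out of idle curiosity.
    return {topic for topic, n in counts.items() if n >= threshold}

# A few curious clicks ("strollers", shopping for a friend) are
# indistinguishable from a genuine interest.
clicks = ["python", "python", "python", "strollers", "strollers", "strollers", "cycling"]
print(infer_interests(clicks))  # e.g. {'python', 'strollers'}
```

The sketch never asks the user anything; it only counts what it has watched, which is exactly the problem.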

It’s not all about selling ads either. Even an ad-free site like Medium uses algorithms to determine interests so that it can lure you in with targeted content recommendations.

This morning, I received an email from them.

The email was a list of Medium articles about motherhood and pregnancy. As a man, I am probably not the ideal target. There wasn't even an explanation of why I was receiving it.

Maybe I read an article there that referenced mothers. Maybe I followed someone who writes a lot of articles about motherhood. Who knows.

All I know is that somewhere there was an algorithm fail. It didn’t convince me of the value of Medium. In fact, it made me question their competence.

You can laugh these kinds of things off, but it's symptomatic of the "push" model we've come to expect from most major online companies. They analyze you and push ads/content/stuff at you in the hope that it will inspire you to do what they want.

I respect the approach that Hackernoon is taking with its ads. By targeting ads at content subjects rather than user profiles, it respects its readers and avoids any privacy compromise. I would also argue that those ads will be more efficiently targeted.

Anyone who spends a decent amount of time online likely suffers from information overload. We are bombarded by ads, promoted posts, and recommendations, usually based on the profile-tracking performed by the "brotherhood". It's refreshing to see sites like Hackernoon develop a business model that respects the privacy of their users.

As privacy measures like the European GDPR take hold, there will be increasing pressure for companies like Google and Facebook to change their methods. Collecting and sharing someone’s personal browsing history for profit is borderline immoral. At the very least, there should be some compensation for the profile owner, since it is their information that is being sold.

It would be far better for people to volunteer their true interests and to treat ads as pulls rather than pushes. If I read content about sports, I understand that ads meant for sports fans get pulled along with it. Far better that than being pushed ads chosen by an algorithm analyzing my personal profile.

In that sense, the algorithm-pushed ads feel more like annoying static, interfering with our lives, while in the pull model, the ads make sense, are properly targeted, and respect our boundaries.
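For illustration, here is a hedged, hypothetical sketch of the difference between the two models; the ad inventory, the inferred profile, and the page subjects are all made up:

```python
# Hypothetical contrast between the "push" and "pull" models discussed above.
# Nothing here reflects any real ad platform; it is only an illustration.

ADS = {
    "sports": ["running shoes", "match tickets"],
    "parenting": ["strollers", "baby monitors"],
    "tech": ["mechanical keyboards", "cloud hosting"],
}

def push_ads(user_profile):
    """Push model: ads chosen from a surveilled profile and shown everywhere,
    regardless of what the reader is actually looking at right now."""
    ads = []
    for inferred_interest in user_profile:   # e.g. {"parenting"} from a few stray clicks
        ads.extend(ADS.get(inferred_interest, []))
    return ads

def pull_ads(page_subject):
    """Pull model: ads chosen from the subject of the content being read,
    with no user profile involved at all."""
    return ADS.get(page_subject, [])

# The same reader, on a sports article, under each model:
print(push_ads({"parenting"}))  # ['strollers', 'baby monitors'] -- static
print(pull_ads("sports"))       # ['running shoes', 'match tickets'] -- matches the context
```

The pull version needs no dossier at all: the page itself carries all the targeting information.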

Well done, Hackernoon!

<shameless_plug>

If you identified with this article, you are probably going to be a fan of Narrative, the upcoming content network my company is building. We’re putting the people in charge in every way and creating a true economy around online content.

</shameless_plug>

