Does YouTube’s Algorithm Discriminate Against Minority Creators? 

by Alex Lefkowitz, October 16th, 2022

Too Long; Didn't Read

Minority creators claim that they are being discriminated against by YouTube’s algorithm, which determines which creators find success on the platform and which disappear into obscurity, flags videos that contain inappropriate content, and applies restrictions on who can see them. Several discrimination lawsuits by LGBTQ and BIPOC creators have, however, failed. Yet now the US Supreme Court is re-examining the underlying law, Section 230, which does not account for algorithmic content selection.


YouTube is one of the biggest social media platforms out there, with over 2.6 billion monthly users. Over 500 hours of video are uploaded every minute as countless creators use the platform to share their passions and interests - and to generate a side income or even a full-time living.

That’s possible thanks to YouTube’s Partner Program, which gives creators a cut of the ad earnings on their videos. Plus, YouTube has introduced direct monetization features such as Super Thanks and Super Stickers, which allow viewers to tip creators directly.

However, many minority creators claim that they are being discriminated against by the platform’s algorithm and that they are losing out on earnings in comparison to their peers. In the past, several discrimination lawsuits by LGBTQ and BIPOC creators have failed, but now the US Supreme Court is revisiting the underlying legislation. 

Here’s the full run-down on these claims, their foundation, and what is happening next. 

The Power of YouTube’s Algorithm 

Before we dive into the long history of discrimination accusations against YouTube, it’s essential to understand the role of the platform’s algorithm in all of this. 

Basically, it determines which creators find success on YouTube and which disappear into obscurity. 

YouTube’s algorithm assesses each video that is uploaded and decides what appears in search results and recommendations. It also flags videos that contain inappropriate content and applies restrictions on who can see them. 

Having an age restriction placed on your video, for instance, makes it practically impossible to monetize. For one thing, the video will only be visible to users who are signed in and over 18 years of age. For another, most ads don’t run on restricted videos, cutting revenue for creators.
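
To make that concrete, here is a minimal toy model of how a restriction flag cascades into lost reach and revenue. All names here are hypothetical - YouTube’s real systems are proprietary, so treat this strictly as a mental model, not as how the platform actually works.

```python
# Toy model: how an age-restriction flag cascades into lost reach and revenue.
# All names here are hypothetical - YouTube's real systems are proprietary.

from dataclasses import dataclass


@dataclass
class Viewer:
    signed_in: bool
    age: int


@dataclass
class Video:
    title: str
    age_restricted: bool


def can_view(video: Video, viewer: Viewer) -> bool:
    """Age-restricted videos are only served to signed-in adult viewers."""
    if video.age_restricted:
        return viewer.signed_in and viewer.age >= 18
    return True


def ad_eligible(video: Video) -> bool:
    """Most advertisers avoid restricted content, so ads largely stop running."""
    return not video.age_restricted


video = Video(title="Horror game playthrough", age_restricted=True)
print(can_view(video, Viewer(signed_in=False, age=25)))  # False - must sign in
print(ad_eligible(video))                                # False - revenue cut
```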

That’s why YouTube’s algorithm has a massive influence over creator success - both the size of their audience and the content they can use to generate an income. 

Restricting LGBTQ and BLM Recommendations

Two of the most prominent discrimination lawsuits against YouTube revolve around restrictions placed on videos by BIPOC and LGBTQ creators.

In 2020, African American creators launched a putative class action against Alphabet, YouTube’s parent company. They alleged that YouTube used its algorithm to flag videos using terminology related to the Black Lives Matter movement, including “racial profiling”, “BLM”, and “police shooting”. Subsequently, these videos were placed in restricted mode. 

A similar lawsuit had already been filed in August 2019 by LGBTQ creators. They claimed that videos using words like “gay”, “bisexual”, and “transgender” in their titles, tags, and descriptions were often flagged and restricted.
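
The alleged mechanism is easy to illustrate. Below is a purely speculative sketch of a naive keyword filter - YouTube has not disclosed how its classifier actually works, and the term list is taken from the complaints, not from YouTube. It shows why such a filter would restrict videos about a community just as readily as genuinely inappropriate content.

```python
# Purely illustrative sketch of the keyword-triggered flagging the lawsuits
# allege. YouTube has not disclosed how its classifier works; the term list
# below is taken from the complaints, not from YouTube.

RESTRICTED_TERMS = {
    "racial profiling", "blm", "police shooting",  # cited in the 2020 suit
    "gay", "bisexual", "transgender",              # cited in the 2019 suit
}


def is_restricted(title: str, tags: list[str], description: str) -> bool:
    """Flag a video if any listed term appears anywhere in its metadata."""
    metadata = " ".join([title, description, *tags]).lower()
    return any(term in metadata for term in RESTRICTED_TERMS)


# A naive filter like this restricts videos *about* a community just as
# readily as genuinely inappropriate content - which is the creators' point.
print(is_restricted("My coming out story", ["gay", "lgbtq"], ""))  # True
```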

Age Restrictions Targeting Black-Created Content

A similar complaint appeared much more recently. Popular Black YouTuber CoryxKenshin posted a video outlining his experience being targeted by YouTube’s algorithm for posting content that his peers uploaded without problems. 

The dispute is about the recently released indie horror game The Mortuary Assistant. It quickly gained popularity among gaming YouTubers, and many of them uploaded playthroughs, CoryxKenshin included.

However, unlike other creators’ uploads, CoryxKenshin’s video was promptly flagged and restricted, without YouTube telling him why.

He appealed the restriction, but the appeal was denied. Not letting up, he eventually found out that the restriction had been applied due to a scene at the end of the playthrough, which included suicide-related imagery.

The problem? Playthroughs by other popular, white YouTubers included the exact same scene without being flagged.

CoryxKenshin then reached out to YouTube, once again appealing the restriction - this time citing Markiplier’s unrestricted playthrough as evidence.

Swiftly, the restriction on his video was removed. 

In his video on racism and favoritism, CoryxKenshin also details how much of his content gets restricted just when it is trending. An example he gives is an old video of his which began trending just after he returned from a hiatus. 

If the algorithm operated without bias, he argues, that piece of content should have been flagged long ago. As it is, it gives the impression that his channel is being targeted on purpose. 

BIPOC Under-representation in Kids’ Content

Another recent piece of evidence of racial bias in YouTube’s algorithm comes from a study by Common Sense Media, a review website for kids’ entertainment. In partnership with the University of Michigan, they reviewed the content that YouTube’s algorithm pushes to younger audiences.

Since the platform features user-generated content, the authors argue, it has a much better chance of representing the diverse reality we live in - especially in comparison to Hollywood studios, which still struggle with BIPOC representation.

Yet the study’s results showed that the majority (62%) of content popular with kids under 9 featured no BIPOC characters at all. Those videos that did include diverse characters were significantly more likely to feature negative elements, such as bad language and violence. Furthermore, 10% of videos achieving viral popularity among tweens and teens featured racial stereotypes. 

That is not to say that positive content by BIPOC creators does not exist - YouTube simply fails to promote it through its algorithm.

While the study’s authors don’t outright accuse the algorithm of promoting biased content, they do issue an urgent call for more transparency and a conscious effort on YouTube’s part to promote positive BIPOC content for kids.

Algorithmic Amplification and Section 230 

YouTube says that it has already made great strides and that it is working to implement programming initiatives designed to promote diversity and inclusion. Nevertheless, the effects are not yet apparent to creators.

Part of the issue is also that YouTube and other social media platforms like it currently enjoy sweeping legal protections, thanks to Section 230 of the Communications Decency Act. This law, passed in 1996, stipulates that online companies are not liable for transmitting materials uploaded by others. 

Those of us who remember the internet back in 1996 may well find it bizarre that a law from the pre-Google era of message boards, dial-up connections, and MSN regulates today’s giant social media corporations. Many politicians agree and have launched initiatives to modify the law - so far, unsuccessfully.

In 2020 and 2021, Section 230 was used to defeat the lawsuits by LGBTQ and BIPOC creators alleging bias.

Now, however, the US Supreme Court has agreed to hear a challenge to Section 230. One key argument is that social media giants like YouTube forfeit the law’s protection as soon as their algorithms amplify certain content, signal-boosting its message.

Final Thoughts

Whether Section 230 will be modified on the basis of algorithmic amplification - and whether that will help diverse creators fight bias - remains to be seen.

For the moment, the best they can do is stay aware of the algorithm’s opaque workings and its apparent bias against minority creators. And to talk about it.

By highlighting their experiences with these discrepancies as a collective, creators can contribute to change and provide ammunition for the lawyers challenging the legal framework that permits them to happen.