It's not practical or sustainable for one corporate entity to govern the conversations of a global audience.
That's why the focus is shifting away from mega-social networks back to more manageable, human-friendly independent communities.
What is an independent community? It's a space for peer-to-peer conversation online, owned and run by an individual or a small organization. Think of it as the community version of an independent bookstore (where Facebook = Amazon).
There are many reasons why global social community governance is not sustainable. (I won't argue with anyone who says that Facebook isn't actually a community, but they claim to be one, so it's fair game.)
Topics and images that are totally acceptable in one country (or region) are punishable by imprisonment in another country.
You can toggle languages in most of the big social networks, but simply translating words into another language does not account for the nuance or cultural values expressed in those words.
A one-size-fits-all approach to content moderation does not work.
In an independent community, the owner typically provides posting guidelines that are appropriate to the specific audience of that community. Instead of having a global police force, you have a "beat cop," who knows the neighborhood and its residents.
We've already seen numerous incidents of political takedown requests, from all sides of the political spectrum and from around the world. We've seen governments push content decisions on US-based social tech companies.
When communities are built within the structure of a "borderless" social network, these decisions can have unintended consequences on the ground, where there ARE borders. And why should a team of employees at a US-based company be tasked with being the world's thought police?
Privately held companies have every right to control what is shared in spaces they provide online. Where things get shady is when content is selectively taken down, or rules are enforced unevenly or non-transparently.
Within a smaller, independent community, there is an explicit understanding that the site owner can set content guidelines and then delete content that violates those guidelines.
The disgruntled member's recourse is to simply complain and/or move to a different community. It will not trigger an international incident if Joe's song is deleted from the Sea Shanty Songwriters Community because it was Death Metal.
Big tech's posting guidelines are largely unwritten, frequently shifting, and enforced with little recourse.
One way big tech has tried to deal with the firehose of content is by resorting to AI, which is imperfect and can be biased as well. (In fact, the EU is currently exploring ways to curb the use of AI to make decisions about our personal lives.)
AI tends to cause a lot of "bycatch," which is the unintentional netting of innocent fish/content with a big net.
Cue the "overtly sexual cow."
The typical indie community is a manageable size and can scale moderation along with member growth, allowing humans to determine just how sexy the cow is allowed to be.
How would you go about reviewing 2,880 reported pieces of content in a 24-hour period? That's the expectation for each Facebook content moderation contractor reviewing reported content.
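That quota is easier to grasp as seconds per decision. A quick back-of-envelope calculation (using only the 2,880-per-day figure above; the eight-hour-shift variant is my own illustrative assumption):

```python
# Back-of-envelope pace for one moderator expected to review
# 2,880 reported items in a 24-hour period.
items_per_day = 2880
seconds_per_day = 24 * 60 * 60  # 86,400 seconds

seconds_per_item = seconds_per_day / items_per_day
print(seconds_per_item)  # 30.0 -> one decision every 30 seconds, around the clock

# If the same quota had to fit into a single eight-hour shift instead,
# the pace would triple:
shift_seconds = 8 * 60 * 60
print(shift_seconds / items_per_day)  # 10.0 seconds per decision
```

Thirty seconds per item, nonstop, with no allowance for ambiguity, context, or breaks.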
The sheer volume of content is overwhelming the teams tasked with doing the reviews, and it is taking a human toll.
When you decentralize the content into purpose-built communities, you also alleviate the command-and-control pressure to vet the most violent and extreme content by a single team of moderators.
No community is a utopia, but the personality scuffles and bad content that the average independent community moderator deals with are a cakewalk compared to the tsunami of evil facing the contract content moderation workers in the social network economy.
Facebook has had to segregate EU member data to comply with the privacy rules of GDPR. With a so-called "borderless" community that spans the globe, you need to account for hundreds of unique (and some conflicting) regulations regarding data privacy, copyright, and child protection.
Complying with every tech regulation in every country in real-time is not feasible.
In the world of independent communities, compliance is much simpler. The vast majority of groups don't meet the threshold (revenue or amount of data) to be required to comply. They have ownership based in a specific country and therefore can deal with that country's regulations if necessary. And many indie community tools provide simple compliance mechanisms for consent, privacy enforcement, or age checks.
How do I know that the post I just shared on a social network will be seen by anyone? In the world of big tech, I have no reliable way to ensure that my content is seen by the intended audience without algorithmic throttling.
Creating an algorithm that effectively serves up the best, most interesting, most engaging content on a global scale is not sustainable, and not appealing to the consumer anyway. (Facebook is finally adding an option to see a chronological feed, so perhaps they've reached this conclusion too.)
The simple beauty of an indie group is that I can share something and feel confident that everyone who is interested will see it. I have a direct, uninterrupted connection with the other members of the community.
Lots of ink has been spilled over the ways big tech vacuums up and then monetizes the personal data and behavioral data of its billions of users.
The entire premise of big social networks is that the user gets "free" access to a service that connects them with other people, in exchange for being targeted with ads.
But the everyday practicalities of harvesting data for targeting and then serving up those ads are staggering. How do you provide enough targeting data for ads to be relevant, without allowing, say, racial profiling? How do you inform users in detail about where their data is going (much of it cross-platform and cross-border)?
Independent communities can offer value propositions that can withstand daylight. Sponsored groups, premium memberships, and paid-only content are easy transactions for a small community. No global tracking, no clickbait, and no secret targeting.
If I'm in a fishing forum, I actually love to see an ad for the newest tackle box. Heck, I might be more likely to click on it too.
Global, borderless social networks have a hard time with anonymity too.
There are many legitimate reasons why a community member might prefer to use a pseudonym. Journalism, unpopular political speech, discussing sensitive medical challenges: all of these are reasons to use anonymous accounts.
But big tech has decided to prioritize security concerns above the need for anonymity. Within the context of a huge, worldwide conversation, it's easier to force identity verification than to deal with the potential content moderation challenges of removing speech accountability. This can have a chilling effect on participation by those groups who are legitimately afraid to reveal their name and friend circles.
All of these reasons make it impractical, over the long term, to govern the conversations of a global audience within a single, massive social space.
We are in the midst of a tectonic shift back toward the niche communities and independent groups that built the Internet in the first place.
And I'm here for it.