The reason modern social media sucks at handling abuse, compared to some online communities of the past (BBSes, early web forums, some IRC channels), is that the people running it are trying to found the next unicorn startup, not manage a community they actually want to be part of.
So it comes down to the Google approach to scale versus Paul Graham’s advice.
Google works methodically to avoid custom tweaks to its algorithm, partly to avoid accusations of preferential treatment, but also because of its massive scale.
On the other hand, Paul Graham has made a good case that startups should do things that don’t scale in the beginning. Partly because it’s cheaper: see what works by hand before you try writing code to do it. But also because if it doesn’t scale, other people will be afraid to waste resources on it. So your elbow grease becomes a competitive advantage.
Twitter acts like a gigantic worldwide IRC server. That’s even how a friend of mine sold me on joining back in 2007. So why aren’t there tons of channel moderators???
Because they’re trying to do it like Google.
I love Simon Sinek’s points about how groups want leaders who will protect them. What we need with a service like Twitter isn’t “handling abuse complaints.” It’s being a lot more proactive. We need mods who aggressively seek out unacceptable behaviour and protect the community from it and from the abusive individuals behind it.
And maybe over time, some of that moderation can be automated. Maybe primarily as tools to make moderators more effective.
But it has to start with people.
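To make the “tools that make moderators more effective” idea concrete, here is a rough sketch of what such a tool could look like: it checks new posts against a moderator-curated watchlist and puts matches in a queue for a human to review, never taking action on its own. Everything in it (the Post class, the watchlist terms, the flag_for_review helper) is hypothetical illustration, not anything Twitter actually runs.

```python
# Hypothetical sketch: surface likely problems to human moderators,
# never auto-punish. Names and terms are illustrative only.
from dataclasses import dataclass


@dataclass
class Post:
    author: str
    text: str


def flag_for_review(posts: list[Post], watchlist: set[str]) -> list[Post]:
    """Return posts containing any moderator-watched term.

    No automated action is taken; a human decides what happens next.
    """
    flagged = []
    for post in posts:
        words = {w.strip(".,!?").lower() for w in post.text.split()}
        if words & watchlist:
            flagged.append(post)
    return flagged


if __name__ == "__main__":
    # Stand-in terms; a real list would be curated by the mods themselves.
    watchlist = {"harass", "doxx"}
    posts = [
        Post("alice", "Great thread, thanks for sharing!"),
        Post("troll42", "Let's doxx this person."),
    ]
    for post in flag_for_review(posts, watchlist):
        print(f"Review queue: @{post.author}: {post.text}")
```

The point of the design is that the automation only narrows the haystack; the judgment call stays with a person.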