
Dealing With Spammy Bots - How Humanode Uses Sybil-Resistance to Bash the Bad Bots in Discord

by Shahmeer KhanFebruary 23rd, 2023

Too Long; Didn't Read

Spammy Discord bots are a growing problem, causing damage to web3 projects and communities. These bots can be used to spread false information, manipulate the prices of cryptocurrencies, and even steal personal information. In some cases, these attacks can lead to significant financial losses, as well as damage to the reputation of the project.

In the world of web3, online communities are a vital part of building and sustaining projects.


With the rise of decentralized platforms, the need for community management has increased significantly. Discord, the popular chat application, has become a favorite among web3 communities. However, the rise of spammy Discord bots has become a growing problem, causing damage to web3 projects and communities. These bots can be used to spread false information, manipulate the prices of cryptocurrencies, and even steal personal information. In some cases, these attacks can lead to significant financial losses, as well as damage to the reputation of the project. In this article, we'll explore the ways that spam bots can hurt web3 projects and lead to failures, and we'll also discuss the steps Humanode took to prevent bots from causing issues on its Discord community server.


The problem with spam bots is that they can disrupt communication channels within communities, making it difficult for members to interact with one another. These bots are often created by bad actors looking to promote fake ICOs, manipulate cryptocurrency prices, get more airdrop spots, or steal user information. As a result, community members can lose trust in the project, leading to a decrease in engagement, and even abandonment of the project. Additionally, in some cases, the financial losses resulting from spammy Discord bots can be significant, causing a project to fail. Therefore, community managers should take proactive measures to combat the rise of spam bots and protect their communities.


In addition to causing communication disruptions and financial losses, spam bots can also manipulate community airdrops and voting processes. For example, a bad actor can use bots to create multiple accounts and claim more airdrops, depriving genuine members of the rewards they deserve. Similarly, bots can be used to sway voting outcomes, leading to unfair decisions that do not represent the will of the community. These manipulations can cause frustration and disillusionment among members, leading to a breakdown of trust and reduced engagement.


There are visionary community managers all around web3 who keep an eye on bot activity and take measures to remove and ban bots, but it's a hefty task, and no human can realistically keep banning bots by hand around the clock.


Being part of the community management team, I can relate to the pain of dealing with bots on your server. Humanode's community managers faced the same problem. Even though Humanode is a project that uses biometric-based Sybil resistance to ensure that each human is unique and has only one account, we faced the challenge of spammy Discord bots that were becoming a real pain for the community. These bots were disrupting communication channels and causing frustration among members. The situation worsened when we held contests and noticed an unusual number of votes on some entries.


To resolve the issue, we temporarily stopped voting and chose winners based on the internal team's votes. We knew this wasn't ideal and lacked transparency. The issue persisted during public token offerings, as bots joined the community in large waves almost every other day.


We knew we had to come up with a way to deal with multi-accounts and spam bots, so we started developing a solution specifically for web3 social communities, especially Discord and Telegram.


After weeks of brainstorming, the team finally decided to use our own biometric-based Sybil-resistant technology to build a solution for the Discord server. A user would need to undergo a crypto-biometric scan to confirm that they are a real human and receive a verified role. The idea was to make it harder for bad actors to create multiple accounts, and to allow only verified humans to participate in community activities such as voting, airdrops, and other decision-making processes. With this in mind, the tech team immediately started working on implementing the solution.


It wasn't an easy task, as we had to integrate the biometric-based Sybil-resistant technology with the existing infrastructure. However, the team was committed to building a stronger, more secure community for Humanode. After several weeks of development and testing, the technology was finally ready for deployment. We are currently beta testing it on our Discord server, and so far, it's going well.


The team is excited to see the results of its hard work. We put in a lot of effort to make the technology as robust as possible so it could effectively prevent Discord bots from spamming voting, airdrops, whitelisting, and other activities. With the biometric-based Sybil-resistant technology in place, the community managers could finally breathe a sigh of relief.


To our delight, the new technology is working as intended. It prevents bad actors from creating multiple accounts and using spam bots to disrupt the community, and the community managers are finally able to do their job without interruptions. We named it the "Humanode Bot Basher." Cool, isn't it?


With the Humanode Bot Basher in place, community members feel more secure and can trust the integrity of the community. The internal team can now hold contests and community votes with confidence, knowing that the results will be fair and transparent.


The bot itself isn't hard to integrate: Discord admins can add the solution to their servers in a few simple steps and assign specific roles to those who verify that they are real human beings. All a Discord server learns from this process is that there is an actual, unique user behind the account. Why? Because unless you are really two-faced, you only have one face, and when your account is tied to the cryptographic keys generated from the 3D mapping of your face scan, it is not possible to tie your face to another account.
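The one-face-one-account guarantee described above can be sketched as a simple registry keyed by the face-derived cryptographic key. This is a toy illustration, not Humanode's actual implementation; all names here are hypothetical:

```python
class SybilRegistry:
    """Toy registry enforcing one biometric key per Discord account."""

    def __init__(self):
        # Maps a face-derived key to the one account it is bound to.
        self._key_to_account = {}

    def bind(self, biometric_key: str, account_id: str) -> bool:
        """Bind a face-derived key to a Discord account.

        Returns True on success; False if this key is already tied
        to a different account (a would-be multi-account attempt).
        """
        owner = self._key_to_account.get(biometric_key)
        if owner is not None and owner != account_id:
            return False  # same face, second account: rejected
        self._key_to_account[biometric_key] = account_id
        return True

    def is_verified(self, account_id: str) -> bool:
        """Check whether an account has a bound biometric key."""
        return account_id in self._key_to_account.values()
```

In a real deployment the verified accounts would then receive the server's designated role, gating access to votes, airdrops, and whitelists.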


A user chooses one account that they want to bio-authenticate. Currently, the price for bio-authenticating an account is $1 US per year. Once a user bio-authenticates their account, it is recognized as a "Bio-Authenticated Account," allowing servers to assign a role that lets the user join channels or activities limited to such accounts. The good news is that you only need to do this once a year: every Discord server that uses the Humanode Bot Basher will recognize your account as authenticated. Oh, and in case you didn't know, all profits from this service are shared equally among all Humanode validators.
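The yearly renewal described above could be modeled as a simple expiry check. A minimal sketch, assuming a 365-day validity term; the function and constant names are illustrative, not part of any real API:

```python
from datetime import datetime, timedelta

# Assumed term matching the $1-per-year bio-authentication fee.
AUTH_VALIDITY = timedelta(days=365)

def is_auth_current(authenticated_at: datetime, now: datetime) -> bool:
    """Return True while the yearly bio-authentication is still valid."""
    return now - authenticated_at < AUTH_VALIDITY
```

A server could run this check when assigning the bio-authenticated role, prompting the user to renew once the year is up.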


The biometric-based Sybil-resistant technology helped Humanode build a stronger, more secure community, and made it a much better place for web3 enthusiasts to come together and share their passion for the future of blockchain technology.


The solution the team at Humanode developed to tackle the problem of Discord bots could be useful for other web3 communities facing similar challenges. The biometric-based Sybil-resistance technology we implemented can be integrated into any Discord server to provide a higher level of security and prevent multi-account spamming. The lessons the Humanode team learned could also be shared with other community managers, helping them develop and implement their own solutions for bashing the bots.


By sharing knowledge and collaborating, web3 communities can work together to build stronger, more secure communities and protect their projects from reputational and financial damage caused by spammy bots.