In the world of web3, online communities are a vital part of building and sustaining projects.
With the rise of decentralized platforms, the need for community management has increased significantly. Discord, the popular chat application, has become a favorite among web3 communities. However, the rise of spam bots and multi-accounts has become a serious problem for these communities.
The problem with spam bots is that they can disrupt communication channels within communities, making it difficult for members to interact with one another. These bots are often created by bad actors looking to promote fake ICOs, manipulate cryptocurrency prices, grab extra airdrop spots, or steal user information. As a result, community members can lose trust in the project, leading to a decrease in engagement and even abandonment of the project. Additionally, in some cases, these bots lure members into scams, causing direct financial losses.
In addition to causing communication disruptions and financial losses, spam bots can also manipulate community airdrops and voting processes. For example, a bad actor can use bots to create multiple accounts and claim more airdrops, depriving genuine members of the rewards they deserve. Similarly, bots can be used to sway voting outcomes, leading to unfair decisions that do not represent the will of the community. These manipulations can cause frustration and disillusionment among members, leading to a breakdown of trust and reduced engagement.
We see visionary community managers all around web3 who keep an eye on bot activity and take measures to remove and ban the bots, but it is a hefty task, and no human can realistically keep banning bots manually around the clock.
Being a part of the community management team, I can relate to the pain of dealing with bots on your server. Humanode community managers faced the same problem. Even though Humanode is a project built around biometric Sybil resistance, our Discord server was not immune: bots and multi-accounts kept flooding in and skewing community votes and giveaways.
To resolve the issue, we temporarily stopped voting and chose winners based on the internal team's votes. We knew this wasn't ideal and lacked transparency. The issue persisted during public token offerings, as bots joined the community in large waves almost every other day.
The team knew we had to come up with a solution to deal with multi-accounts and spam bots, so we started developing one specifically for web3 social communities, especially for Discord and Telegram.
After weeks of brainstorming, the team finally decided to use our own biometric-based Sybil-resistant technology to develop a solution for the Discord server. This meant that a user would need to undergo a crypto-biometric scan to confirm that they are a real human and receive a verified role. The idea was to make it harder for bad actors to create multiple accounts and to allow only verified humans to participate in community activities such as voting, airdrops, and other decision-making. With this in mind, the tech team immediately started working on implementing the solution.
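To make the idea concrete, here is a minimal sketch of the Sybil-resistance constraint described above: one biometric identity can be linked to at most one account, so a second account tied to the same face-derived key simply never gets verified. The names and the in-memory registry are assumptions for illustration only, not Humanode's actual implementation.

```python
# Illustrative only: each biometric-derived key maps to at most one account,
# so extra accounts from the same human are rejected at verification time.
class SybilRegistry:
    def __init__(self) -> None:
        # biometric_key -> Discord account id
        self._linked: dict[str, int] = {}

    def link(self, biometric_key: str, account_id: int) -> bool:
        """Link a biometric identity to one account; reject duplicates."""
        existing = self._linked.get(biometric_key)
        if existing is not None and existing != account_id:
            return False  # this human has already verified a different account
        self._linked[biometric_key] = account_id
        return True


registry = SybilRegistry()
assert registry.link("face-key-abc", 111)      # first account verifies fine
assert not registry.link("face-key-abc", 222)  # second account is rejected
```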
It wasn't an easy task, as they had to integrate the biometric-based Sybil-resistant technology with the existing infrastructure. However, the team was committed to building a stronger, more secure community for Humanode. After several weeks of development and testing, the technology was finally ready for deployment. We are currently beta testing it on our Discord server, and so far it is going well.
The team is excited to see the results of their hard work. They put a lot of effort into ensuring that the technology was foolproof and could effectively prevent Discord bots from spamming voting, airdrops, whitelisting, and other activities. With the biometric-based Sybil-resistant technology in place, the community managers could finally breathe a sigh of relief.
To our delight, the new technology is working perfectly. It prevents bad actors from creating multiple accounts and using spam bots to disrupt the community. The community managers are finally able to manage their community without any interruptions. We named it “Humanode Bot Basher.” Cool, isn’t it?
With the Bot Basher up and running, here is how it works in practice.
The bot itself isn’t hard to integrate. Discord admins can add the solution to their servers in a few simple steps and assign specific roles to those who verify that they are real human beings. All a Discord server learns from this process is that there is an actual user behind the account, and that it is a unique account. Why? Because unless you are really two-faced, you only have one face, and when your account is tied to the cryptographic keys generated from the 3D mapping of your face scan, it is not possible to tie your face to another account.
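For a rough picture of the role-gating side on Discord, here is a minimal sketch using the discord.py library. The is_bio_authenticated stub, the !verify command, and the role name are assumptions made for illustration; the actual Bot Basher handles verification through its own crypto-biometric flow.

```python
# Sketch of granting a verified-human role after bio-authentication.
# The verification check below is a stub, not Humanode's real API.
import discord
from discord.ext import commands

intents = discord.Intents.default()
intents.members = True  # required to manage member roles

bot = commands.Bot(command_prefix="!", intents=intents)

VERIFIED_ROLE_NAME = "Verified Human"  # assumed role name; admins pick their own


async def is_bio_authenticated(user_id: int) -> bool:
    """Placeholder: in reality the Bot Basher answers this after the face scan."""
    return False  # replace with the real verification result


@bot.command()
async def verify(ctx: commands.Context):
    """Grant the verified role if the account has been bio-authenticated."""
    if await is_bio_authenticated(ctx.author.id):
        role = discord.utils.get(ctx.guild.roles, name=VERIFIED_ROLE_NAME)
        if role is not None:
            await ctx.author.add_roles(role)
            await ctx.send("You now have the verified human role.")
    else:
        await ctx.send("Please complete bio-authentication first.")

# bot.run("YOUR_BOT_TOKEN")
```

From the admin's side, the only real decision is which role gets granted to verified humans and which channels or activities are limited to that role.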
A user can choose one account that they want to bio-authenticate. Currently, the price set for bio-authenticating an account is $1 US per year. Once a user bio-authenticates their account, that account will be recognized as a “Bio-Authenticated Account”, allowing servers to assign a role that lets the user join channels or activities limited to such accounts. The good news is that you only need to do this once a year: every Discord server that utilizes the Humanode Bot Basher will recognize your account as bio-authenticated. Oh, and just in case you don’t know, all profits from this service are shared equally among all Humanode Validators.
The biometric-based Sybil-resistant technology helped Humanode build a stronger, more secure community, and made it a much better place for web3 enthusiasts to come together and share their passion for the future of blockchain technology.
The solution that the team at Humanode developed to tackle the problem of Discord bots could be useful for other web3 communities facing similar challenges. The biometric-based Sybil-resistance technology we implemented can be easily integrated into any Discord server to provide a higher level of security and prevent multi-accounts from spamming. The lessons learned by the team at Humanode can also be shared with other community managers, helping them develop and implement their own solutions for bashing the bots.
By sharing knowledge and collaborating, web3 communities can work together to build stronger, more secure communities and protect their projects from reputational and financial damage caused by spammy bots.