Moderating Mastodon and the Fediverse
January 10, 2023 | UGC
Social media is used worldwide as a means to communicate and connect people across the globe. More than 70% of the US population has at least one social media account. Applications like Facebook and Twitter have taken the world by storm, helping everyone feel more connected. While these centralized social media outlets are popular, a different type of network is slowly gaining in popularity: decentralized social networks. With the recent change in Twitter’s ownership, people are flocking to other social media applications that offer a similar experience. The most popular of these is Mastodon. While this platform is comparable to Twitter in some ways, it is fundamentally different in that it is a decentralized platform. This model (known as the Fediverse) has been around for some time, but now, with Twitter’s users leaving in droves and heading to these alternatives, moderation in the Fediverse is being more closely scrutinized.
What differentiates platforms like Mastodon from centralized platforms and what new moderation challenges are emerging in the Fediverse? Read on to find out!
Why are users leaving Twitter?
Since early 2022, Elon Musk (cofounder of PayPal and CEO of Tesla) had been publicly weighing whether or not to buy Twitter. Musk never agreed with Twitter’s content moderation policies and wanted to change the platform to become more “free speech friendly.”
Back in October 2022, Musk purchased Twitter for $44 billion. While those who agreed with him, his policies, and his political leanings celebrated, those who disagreed began searching for similar platforms to use. In the first week after Musk took over, 877,000 Twitter accounts were deactivated, more than double the number of weekly deactivations Twitter has historically seen. However, with Musk’s recent decision to step down as Twitter’s CEO (a decision made by users via a Twitter poll Musk posted), it remains to be seen whether users will return to the platform as quickly as they left.
There are several other platforms that Twitter users have fled to in search of “the new Twitter.” Some of the most popular applications are Mastodon, Hive Social, Post, and Counter Social, with Mastodon being the most popular. While Mastodon was once a small application with 300,000 monthly active users, it grew to 2.5 million from October to November 2022.
This influx of new users would mean moderation difficulties for any platform, but Mastodon poses a unique moderation problem, as it is a decentralized application, unlike sites like Twitter and Facebook. With so many users now part of the Fediverse, these new content moderation challenges are taking center stage. But first, what does it mean to be a decentralized platform?
Centralized vs decentralized social networks
While not nearly as popular as centralized networks like Facebook and Twitter, decentralized social media networks like Diaspora and Steemit have been around for some time and have accumulated hundreds of thousands of users.
Social media platforms like Facebook and Twitter are considered centralized because all user data is maintained under a single company’s control, and anyone can join and see information from any other user. There is one central home that all users are a part of, which can be dangerous for users when there is a data breach. Content moderation for these sites is also centralized: there is one set of rules that every user must follow, and users who break those rules are banned. Because these networks are owned by large companies, they have the resources to create and execute robust content moderation strategies.
Unlike these platforms, Mastodon is decentralized, meaning it is made up of many self-regulated servers and users own their own data, making them less susceptible to data breaches. You can join a server for any interest you have and interact directly with people on that server about your shared interests. These servers are run by self-appointed administrators who are not professional content moderators but users who volunteer to moderate the server. Each server has its own set of guidelines that users must follow instead of one set of rules for everyone across the platform. This user-run moderation approach, with rules that differ between servers, means that content moderation looks completely different on centralized and decentralized platforms.
Content moderation challenges on decentralized networks
Content moderation for centralized networks is far from simple. While there may be a basic set of rules that every user must follow, moderation is still a nuanced practice. We can train AI to catch content that might go against community standards, but in many cases a human moderator still needs to step in and decide whether something violates those standards. Not to mention that users come up with new ways every day to evade moderation.
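As an illustration, a hybrid pipeline like the one described above might route each post by a machine-learning toxicity score, automatically acting only on confident cases and escalating the gray area to a person. The thresholds and function below are hypothetical, a minimal sketch rather than any platform’s actual implementation:

```python
# Sketch of a hybrid AI-plus-human moderation queue.
# The classifier score, thresholds, and routing labels are
# hypothetical, not any real platform's implementation.

AUTO_REMOVE = 0.95   # confident violations are removed automatically
HUMAN_REVIEW = 0.60  # ambiguous cases are escalated to a person

def triage(post: str, toxicity_score: float) -> str:
    """Route a post based on a model's toxicity score (0.0 to 1.0)."""
    if toxicity_score >= AUTO_REMOVE:
        return "remove"
    if toxicity_score >= HUMAN_REVIEW:
        return "human_review"  # a moderator makes the final call
    return "allow"

print(triage("example post", 0.97))  # -> remove
print(triage("example post", 0.70))  # -> human_review
print(triage("example post", 0.10))  # -> allow
```

The middle band is the expensive part: everything between the two thresholds lands in a human review queue, which is exactly the resource that large centralized companies can fund and small volunteer-run servers often cannot.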
For decentralized networks, those rules become even grayer, as user-based moderation can be more ambiguous. Mastodon does have universal rules against harmful content like hate speech, and while it can urge the admins of each server to abide by those rules, it cannot require that a server do so. Without sitewide enforcement, the admins of any server can allow offensive content to be posted or, conversely, ban content, or even users they don’t like, of their own volition.
Decentralized networks pose an even bigger moderation challenge because they are composed of many different servers with different rules. Interestingly, for Mastodon itself this makes things far simpler: the company doesn’t have to hire moderators or worry about coming under fire for its moderation practices. For users themselves, however, this can mean a lack of safety and security.
For instance, on centralized networks, marginalized users can be assured a certain amount of security. Most platforms have standards that don’t allow hate speech. Even if this moderation strategy isn’t executed perfectly, there is some security in the fact that there are rules against attacking marginalized communities. In the Fediverse, different servers have different standards for content moderation. While many servers may not allow hate speech, plenty of servers could still allow different groups of people to be attacked. Mastodon has already seen several white supremacist groups arise, potentially making marginalized groups feel unsafe.
One of the ways users have been combating this is by blocking individual servers that promote hate speech. If enough servers and users refuse to interact with an offensive server, it becomes isolated, so any damage it could cause through lack of proper moderation is contained. Users of the far-right server Gab experienced this first-hand when they joined Mastodon and found that outside servers refused to interact with them, keeping any potential harm contained. This is one way Mastodon users can exercise control over content moderation even without a governing body.
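This server-blocking mechanism (often called defederation) can be sketched as a simple domain blocklist check applied to incoming federated posts. The domain names and function here are illustrative assumptions, not Mastodon’s actual code:

```python
# Sketch of server-level blocking ("defederation"): an instance keeps
# a list of domains it refuses to federate with. The domain names
# below are illustrative, not real servers.

BLOCKED_SERVERS = {"gab.example", "spam.example"}

def accept_federated_post(author_handle: str) -> bool:
    """Accept a post only if the author's home server isn't blocked.

    Fediverse handles look like user@server.example; the part after
    the "@" identifies the author's home server.
    """
    _, _, server = author_handle.partition("@")
    return server not in BLOCKED_SERVERS

print(accept_federated_post("alice@mastodon.example"))  # -> True
print(accept_federated_post("bob@gab.example"))         # -> False
```

Because every server maintains its own blocklist, isolation only works when it is collective: a server that most of the network blocks effectively loses its audience, which is what happened to Gab.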
It remains too early to tell if Twitter will maintain the majority of its users or if enough people will leave Twitter to render it obsolete. With Mastodon as the primary platform that ex-Twitter users have begun using, the Fediverse and decentralized social media platforms are now becoming more popular and in turn being more closely scrutinized.
While decentralized platforms still serve to connect people, they do so on different servers based on mutual interests, making it easier for users to protect their data and carefully choose who they wish to interact with. Allowing users more autonomy may make content moderation easier for Mastodon as a company, but users are still facing challenges related to moderation.
When each server has its own guidelines and rules, some servers may choose not to moderate harmful content like hate speech, leaving other users vulnerable to such content. There are currently ways for servers to self-police, but as the Fediverse and decentralized networks get more popular, there may need to be some significant changes in their overall content moderation approaches.