
Content Moderation in Virtual Reality: The Impacts and Implications for Businesses

March 23, 2023 | VR and Metaverse Moderation

“I still remember the first time I played a multiplayer game in virtual reality: ping pong with a kid around 15 years old, somewhere on the other side of the world. It felt like we were in the same room together, talking. Suddenly he came across the ping pong table to show me how to work something, and it felt like he was standing next to me – an invasion of my personal space. It really hit me: this technology is so powerful, so incredible, so real.” As a content moderation expert in the business of spotting opportunities for abusive behavior online, WebPurify co-founder Josh Buxbaum saw in that moment both the huge potential of this technology and the new risks it will bring for users and brands. It was the moment he recognized the need for content moderation in virtual reality.

VR has applications that will change whole industries far beyond gaming: tours of hitherto inaccessible space stations and far-flung corners of the world, or mass casualty scenes where surgeons and emergency medical professionals can train. Perhaps most importantly, the metaverse is enabling children and adults from every corner of the planet to learn and experience things that until now would have been unavailable to them.

The opportunity could change the world as we know it – around 400 million people already use the metaverse. But 80% of those users are younger than 16, perhaps because gaming, for now, dominates the landscape. From innocuous settings like ping pong tables to ‘mature worlds’ for users aged 18+, this groundbreaking technology lets you physically engage and collaborate with fellow gamers from anywhere in the world – and the platform is constantly evolving. A Gartner report estimates that between work, shopping, social media, and entertainment, 25% of people worldwide will spend an hour or more in the metaverse daily. Once you put on your first VR headset, you will quickly understand why.

For decades, user-generated content has come in the familiar forms of text, images and video – all of which presented challenges for moderating at speed. Thankfully, fast, customizable solutions followed, and today’s moderation tools are accurate at scale and largely up to the task. The metaverse, however, has introduced new challenges for brands that want to keep users safe.

“In the metaverse, it’s a completely different type of engagement. The rules are different,” Josh explains. “We’re now moderating behavior, interactions, gestures. Moderating content in virtual reality isn’t a project where you can just sit at your desk. Our team is in headsets, interacting in the games.”

How online abuse in VR affects mental health

When you create a platform – any platform – bad actors will immediately try to find ways to exploit it. And because the metaverse presents a whole new way of interacting online, its immersive experience tricks users’ brains into feeling interactions more viscerally than the words and pixels of other mediums. So when users are subjected to bullying, sexual harassment, racist language or forced attention in this space, it feels more real.

Josh and his team saw this firsthand when training their teams to moderate in the metaverse. “When someone abuses one of our moderators, we’re not seeing it in a meme or even seeing it in a video: it’s someone talking directly to us,” he says. “It’s very real and interactive because you’re face to face. It’s more personal, and the impact of a personal attack is more severe if you’re not trained properly.”

And most users, of course, are not trained to manage interactions like this, which is why metaverse developers are working with WebPurify to build systems that weed out the bad actors while preserving the incredible user experience the metaverse provides.

The impacts of digital abuse are well known. An exhaustive study of online abuse victims by Amazon Web Services found that abuse has profound and lasting effects on people’s mental health. Most respondents reported feeling angry or anxious, while two-thirds suffered from depression, had difficulty sleeping, felt isolated, feared for their safety or were unable to concentrate. Half said it made them concerned about leaving the house and want to withdraw from the world. A quarter felt unable to work. For children, the impacts of online abuse can manifest in anxiety, self-harm, eating disorders and even suicidal thoughts.

Metaverse developers feel a deep-seated duty of care to their platform users, and this, too, is WebPurify’s mission.

Implications for your business

While the risks for users are clear, the risks for brands in the metaverse are also real. When new games go viral, the bad actors will quickly make themselves known – and how businesses react is integral to success.

For brands that allow this behavior to go unregulated, the community they cultivated so carefully can quickly spin out of control. “When you have an unmoderated community, the word gets out that this is the place where there are no rules,” Josh explains. “It becomes a free-for-all. It will quickly devolve into a poor user experience for anyone who just wants to enjoy themselves.”

These brands lose their primary user base, along with the associated advertising revenue. Giving bad actors free rein in your community both devalues your brand and has a direct impact on your bottom line. But what’s the solution?


Content moderation in virtual reality: how it works

WebPurify realized that if you’re going to talk the talk, you need to walk the walk. When it set up the world’s first VR Moderation Studio for metaverse moderation, it began by purchasing headsets. Its moderators took turns wearing them while their experience was broadcast onto a screen for the rest of the team to observe. The goal is for each moderator to blend in as a normal gamer on the platform.

In reality – in virtual reality – the key to effective content moderation is to just be yourself.  “We’re not referees drawing attention to ourselves,” says Yekkanti Ravi, WebPurify’s Training Manager. “Our team simply plays the games and is part of the community. But while we’re part of the community, we’re also paying attention.” 

WebPurify’s VR moderation team is mostly women, because experience tells them that it is women who draw out the bad actors on these platforms. When it comes to spotting violations, WebPurify’s moderators look for minors who are too young to be there and for obvious breaches of community guidelines: hate speech, threats, sexual advances and offensive gestures. Minors can often be identified by their voice, with a moderator’s follow-up questions exposing their age. When community guidelines are breached, reports are sent to the game developer, as sketched below.
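To make that reporting step concrete, here is a minimal sketch in TypeScript of what a moderator’s violation report might contain. The type, field names, and `buildReport` helper are hypothetical illustrations for this article, not WebPurify’s actual schema or any game developer’s real API.

```typescript
// Hypothetical shape of a moderator's violation report.
// Field names and categories are illustrative assumptions, not a real schema.
type ViolationType =
  | "underage_user"
  | "hate_speech"
  | "threat"
  | "sexual_advance"
  | "offensive_gesture";

interface ViolationReport {
  reportId: string;          // unique ID for this report
  roomId: string;            // the VR room or session where it occurred
  offenderHandle: string;    // in-game handle of the offending avatar
  violation: ViolationType;  // category from the community guidelines
  observedAt: Date;          // when the moderator witnessed it
  moderatorNotes: string;    // free-text context for the developer
}

// Illustrative helper: package a report for the game developer's queue.
function buildReport(
  roomId: string,
  offenderHandle: string,
  violation: ViolationType,
  moderatorNotes: string,
): ViolationReport {
  return {
    reportId: crypto.randomUUID(),
    roomId,
    offenderHandle,
    violation,
    observedAt: new Date(),
    moderatorNotes,
  };
}
```

However the schema is shaped in practice, the point is that each report ties a specific offender and violation category to a specific room and time, so the developer can act on it.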

The mental health of WebPurify’s moderators is paramount: no one plays for more than 30 minutes without a break. To make a natural exit, they’ll craft an excuse that their battery is low or they need to go to work. The team works in shifts, providing coverage 24/7.

Despite the risks, gaming for a living is, unsurprisingly, a coveted position within the business. “The team really love their jobs,” says Josh. “Interestingly, our moderators end up also serving as ambassadors for the community. While they’re watching for abuse, they’re also orienting players on how to use the game or setting up new rooms when others get too full. In this sense, our team is actually boosting the user experience by helping first-time users through their learning curve.”

“I think it’s because we’re kind and model community members,” Ravi adds. “Many people often schedule their gameplay for when we’re online.”

For more information on WebPurify’s content moderation services in virtual reality, see our guide on how to moderate metaverse experiences.


The future of moderation in the metaverse

As metaverse technology advances, content moderation in virtual reality will evolve with it. Josh believes more games will introduce a ‘God Mode’, allowing his team to be in more places at once. If someone is having a negative interaction, they will be able to quickly report it, alerting WebPurify’s moderators, who can then jump to that room and either mute the offender or remove them from the game.
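As a rough sketch of how that report-and-respond loop could work, here is a short TypeScript example. The `ModerationClient` interface, its methods, and the `handleReport` function are assumptions invented for illustration; no platform described here exposes such an API.

```typescript
// Hypothetical report-and-respond loop for a 'God Mode' style tool.
// All names are illustrative assumptions, not a real platform API.
type ModerationAction = "mute" | "remove";

interface UserReport {
  roomId: string;          // room where the negative interaction happened
  reportedHandle: string;  // handle of the user being reported
  reason: string;          // the reporter's description of the incident
}

interface ModerationClient {
  joinRoom(roomId: string): Promise<void>;       // "pop in" to observe
  mute(handle: string): Promise<void>;           // silence the offender
  removeFromGame(handle: string): Promise<void>; // eject the offender
}

// When a user files a report, a moderator jumps to the room,
// observes, and applies the appropriate action.
async function handleReport(
  client: ModerationClient,
  report: UserReport,
  decide: (r: UserReport) => ModerationAction | null,
): Promise<void> {
  await client.joinRoom(report.roomId);  // moderator enters the room
  const action = decide(report);         // human judgment after observing
  if (action === "mute") {
    await client.mute(report.reportedHandle);
  } else if (action === "remove") {
    await client.removeFromGame(report.reportedHandle);
  }
  // If no violation is confirmed, the moderator simply leaves.
}
```

The key design point is that the report only summons a human: the decision to mute or remove still rests with a trained moderator who has observed the room firsthand.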

“At scale, we can’t be in every one of these games, but when something is reported, we can pop in and see what’s happening,” says Josh. “As VR gets bigger, the challenges of moderating it will scale. It’s a matter of getting prepared now.”

Stay ahead in Trust & Safety! Subscribe for expert insights, top moderation strategies, and the latest best practices.