
How to Moderate Metaverse Experiences

November 15, 2022 | VR and Metaverse Moderation

WebPurify is excited to announce our newest offering to help protect internet users: the VR Moderation Studio.

We have proudly offered text, image, and video moderation services for over sixteen years. With virtual reality and the metaverse becoming more commonplace and gaining users by the day, there is a growing need for moderation in this space as well. WebPurify is prepared to help our customers with the new moderation challenges this medium demands.

But why is VR and metaverse moderation important, and how does WebPurify do it? Let’s dive in.

What is the metaverse?

Virtual reality (VR) is a simulated environment experienced through a headset worn over the eyes. Depending on the particular application, other hardware can be used, but the headset is the primary tool that creates the immersive experience. The virtual world mirrors reality: the person wearing the headset feels as though they are actually inside the simulation, able to walk around and interact within a 360-degree virtual world just as they would in the real one. VR serves various purposes across industries.

From surgical training in healthcare to combat training for the military, the ability to simulate situations virtually is invaluable. Beyond practical applications, VR has also been used for immersive gaming experiences that allow players to feel as though they are actually in the game. This is where the metaverse comes in.

The term “metaverse” was coined three decades ago to describe the use of the internet as a virtual world where people interact through avatars. However, it only gained widespread popularity when Facebook rebranded as Meta in 2021 and announced that it would soon feature new virtual reality worlds, the “metaverse,” where people could connect in all-new ways as avatars in a virtual world. While similar virtual spaces existed before this, Meta’s new focus has helped virtual reality become more mainstream. It is no longer limited to games: people now host comedy shows, movie nights, and meditation sessions, and companies are even creating ways for potential customers to shop, all in VR spaces.

As these new ways to interact with others expand, new challenges arise for content moderation.

Metaverse and content moderation

While virtual reality has been around for some time, until recently it was not common for someone to own a VR headset. Serious gamers or tech enthusiasts with deep pockets might have owned their own hardware, but for most people, the prohibitive cost and complicated setup meant VR was mainly a form of entertainment found at arcades. Now, with significantly more affordable all-in-one headsets like the Oculus Quest 2, the metaverse experience is within reach for many households.

Just as the need for moderation arose when more people began to use the internet, moderation will become increasingly necessary as more people engage in metaverse interactions. Children are already joining virtual worlds, including those designed exclusively for adults. Whenever children enter digital spaces meant for adults, the risks increase exponentially: from exposure to pornographic or violent material to interactions with child predators, children are extremely vulnerable in these environments.

Unsurprisingly, content moderation is not always cut-and-dried in the emerging metaverse. While Facebook and Instagram have rules about the types of text, image, and video content that can be uploaded to their platforms, VR and the metaverse pose new moderation challenges. Both setting guidelines and enforcing them can be especially difficult because of the unique ways users can interact in these realistic virtual environments.

One moderator has documented various interactions they’ve seen in virtual spaces, as well as the difficulty of discerning whether a user’s behavior is malicious. New users may struggle to understand the VR controls and act erratically as a result, while others will intentionally behave oddly or, worse, break the rules of the virtual world, whether by simulating sexual behaviors or by climbing the virtual boulders next to a stage at a venue. Just as in reality, what might be allowed in some settings, such as a concert, might not be condoned in a meditation class. Moderators need to be prepared to enforce the bespoke rules of different types of virtual events.

In this new medium, content moderation is more nuanced and complicated than ever. At WebPurify, we’re applying what we’ve learned over the past 16 years as our services have evolved to address new types of user-generated content challenges.

To that end, we have created an all-new, VR-specific service to address the moderation complexities created by virtual worlds: the WebPurify VR Moderation Studio.

With WebPurify’s new VR Moderation Studio, we offer:

  • Live moderator teams equipped with headsets who are available 24/7 and trained in: 
    • Spotting abusive behaviors in VR environments 
    • De-escalating bad behaviors
    • Escalating illegal behaviors to the client or authorities
  • Services beyond moderation such as:
    • Training and mental health support for moderators (targeting VR-specific stressors)
    • In-game ambassadors to teach players and help them acclimate to new virtual spaces
    • Bug detection and new feature recommendations
    • Regular roundtable discussions to surface common pain points players encounter in their games and ways to address them

The ability to make players feel comfortable, feel at home, feel safe…that’s really our mission. – Josh Buxbaum, WebPurify Co-founder/COO

Our co-founder and Head of Client Services, Josh Buxbaum, recently took part in the “Growing Communities in VR with Integrity and Trust” panel at Meta Connect 2022, where he spoke about the new challenges that moderation platforms face with regard to VR and metaverse moderation. Check it out to learn more about how WebPurify plans to tackle those challenges.

Image: Josh Buxbaum, WebPurify Co-founder/COO, speaking on the “Growing Communities in VR with Integrity and Trust” panel at Meta Connect 2022.

Protecting women in VR and the metaverse

As in many other online spaces, women tend to be specifically targeted in the metaverse. Early in testing, a beta user of Meta’s VR game Horizon Worlds alleged that her avatar was groped in the game’s plaza and that other avatars nearby encouraged the sexual harassment. While Meta has a feature called “Safe Zone” that allows players to keep others from coming too close, an investigation found that the user had not engaged the feature, but Meta still acknowledged the severity of the incident.

The more real [VR] gets, the more real it feels, and the more real it feels, now we’ve got a whole new moderation risk…it’s challenging from a moderation standpoint, [and] from a tech standpoint, but we’re figuring it out. – Josh Buxbaum

WebPurify recognizes the dangers that women face online, and we are dedicated to making VR and metaverse spaces safe for them. We have trained a majority-female team to better identify and moderate harassing behavior and speech in virtual environments, and players who break the rules are reported immediately. We cannot make VR and metaverse spaces comfortable for everyone without focusing on women directly.

Conclusion

Moderation has never been an exact science. Given all of the nuances in how people interact digitally, it is crucial to have AI and humans working together to keep virtual spaces safe for everyone. Text, image, and video moderation are challenging enough, but VR and the metaverse add a whole new layer of complexity. As virtual worlds gain popularity, the need for proper moderation will only increase, and WebPurify is ready to rise to the challenge and help customers keep their virtual worlds safe.

But when is the right time to start considering how to moderate a virtual world?

I think having the conversation early is what we’ve found [works best]…it tends to be challenging once a game is fully developed to then have it be an afterthought, ‘How do we moderate this community? How does it fit in?’…at WebPurify, we love taking that consultative approach very early because then everything is set up the way it should be. – Josh Buxbaum

It is never too early in game development to start thinking about moderation. Putting procedures in place so that your players feel comfortable and can easily acclimate to the virtual space you’re creating will only help in the long run.

Reach out to us today to see how we can help you and your customers with our VR Moderation Studio.