
What are content moderators and what is the best way to train them?

September 20, 2022 | Image Moderation, Video Moderation, UGC

CONTENT WARNING: This post discusses some of the difficulties that content moderators face, including viewing abusive and disturbing images.

Content moderation can be a taxing job. Content moderators often view harmful content daily in order to keep the internet safe for users. Companies like TikTok have come under fire for using illegal images containing child sexual abuse to train their content moderators. Do content moderators need to view content like this during their training to be effective at their jobs? What are the best and worst strategies for content moderation training and how can WebPurify help?

Let’s take a peek!

What is a Content Moderator?

While social media is now commonplace (72% of US adults used it in 2021), the platforms themselves are still young: the first ones launched in the early days of the 21st century. Once social media made it simple for anyone with internet access to connect with people around the world, questions arose about free speech and what should be allowed on these sites. Unfortunately, hard-and-fast rules for content moderation proved difficult to come up with.

In the infancy of content moderation, employees at various social media platforms set out to decide what content should and should not be allowed on their sites. Of course, given all of the nuance involved, they often had to amend the rules they were creating to allow for exceptions. Should videos of prisoners being beaten be taken down, or are they necessary to expose the brutality that some prisoners face? Rules cannot cover every case, so someone needs to review content on a case-by-case basis.

The question then became who would be conducting moderation and enforcing the predetermined rules. Enter the role of the content moderator.

What does a content moderator do?

Whenever someone reports something inappropriate on social media, or the platform’s AI tools flag borderline content, it is escalated to a content moderator. Whether it’s a post promoting a scam or a video of a gruesome murder, it is often sent to a moderator for analysis. Content moderators are tasked with reviewing user-submitted text, images, audio, and video to determine whether it upholds the community standards of the website. By removing harmful content, they protect the company’s reputation and keep users of social media platforms, dating apps, e-commerce sites, and other services that allow User Generated Content (UGC) from leaving because of offensive content.

Community standards for social media sites can be vast and nuanced, covering numerous categories and more complex subcategories. For example, violence in a combat sport like boxing may be allowed, while violence in a non-violent sport, like baseball, may not be permitted; a video of two baseball teams fighting after a player was hit by a pitch would qualify as violence. And while much of what content moderators remove is merely spammy or off-topic, there is always a risk that they will be exposed to far more severe content.
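To make this concrete, here is a minimal sketch (in Python) of how a policy of defaults and exceptions might be encoded, with anything the rules don’t cover escalated to a human reviewer. The categories, rule structure, and function names are purely illustrative, not any platform’s or WebPurify’s actual policy engine.

```python
# Hypothetical illustration of nuanced community standards as default
# decisions plus exceptions, with unmatched cases escalated to a human.
from dataclasses import dataclass, field

@dataclass
class PolicyRule:
    category: str                  # e.g. "violence"
    subcategory: str               # e.g. "combat_sports"
    allowed: bool                  # default decision for this subcategory
    exceptions: list = field(default_factory=list)  # contexts that flip the default

RULES = [
    # Violence in combat sports is allowed; violence in non-combat sports is not,
    # except in a news-reporting context (illustrative exception only).
    PolicyRule("violence", "combat_sports", allowed=True),
    PolicyRule("violence", "non_combat_sports", allowed=False,
               exceptions=["news_reporting"]),
]

def evaluate(category: str, subcategory: str, context: str = "") -> str:
    """Return 'allow', 'remove', or 'escalate' for a piece of flagged content."""
    for rule in RULES:
        if rule.category == category and rule.subcategory == subcategory:
            allowed = rule.allowed
            if context in rule.exceptions:
                allowed = not allowed          # the exception flips the default
            return "allow" if allowed else "remove"
    return "escalate"                          # no rule covers this case

print(evaluate("violence", "non_combat_sports"))                    # remove
print(evaluate("violence", "non_combat_sports", "news_reporting"))  # allow
print(evaluate("violence", "street_fight"))                         # escalate
```

Even a toy example like this shows why human moderators remain essential: the interesting cases are exactly the ones no rule anticipates.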

Content moderator training

How does one train to become a user-generated content moderator without being exposed to upsetting content? Are there approaches that are effective without requiring moderators to view offensive material?

How content moderator training can be problematic

Recently, TikTok came under fire for using graphic images of child sexual abuse to train its content moderators. Senators from various states are calling on TikTok to answer allegations from past employees that it illegally stores and uses explicit material of children in its content moderator training.

Retaining these images carries legal implications, as does exposing moderators to such content. In addition, every time this content is viewed, the child is re-victimized. Parents of these children have no idea that the material is being saved and viewed regularly as part of moderator training. Of course, TikTok’s objective was simply to ensure that its moderators were properly trained to identify and remove such content in the future, but its approach, however well-intentioned, was unacceptable.

How to train a content moderator the right way

Obviously, it is not ideal to train content moderators by exposing them to extremely graphic and upsetting content. So, how can you effectively hire and train your content moderators without inflicting unnecessary harm? Here are some steps you can take:

1. Be upfront during the hiring process

Let potential and new employees know at every step of the hiring process what they can expect from the job. Make sure they are clear on the expectations of the role so that they can choose whether or not they want to moderate abusive content.

2. Be proactive about the mental well-being of content moderators

Instead of waiting for employees to come forward with problems, regularly check in with them and encourage them to discuss their struggles with the job openly. Encourage regular breaks throughout the day, and regular vacation time, to minimize prolonged exposure and give them the chance to mentally “reset.” Also, partner with mental health professionals who are familiar with the particular stresses of content moderation and make their support available to your team.

3. Collect moderator feedback and give them the option to “opt out” of certain content

Give moderators the opportunity to flag any content they find particularly upsetting. That way, supervisors know which types of content affect each moderator and can assign them to appropriate projects. Additionally, allow moderators to “opt out” of categories of content they find distressing.

4. Modify training materials to be less harmful

Unfortunately, viewing explicit or violent images is a necessary and effective part of proper training for a content moderator. Presenting images in grayscale, using blurred versions, or turning videos into a storyboard of still frames can make training materials less severe for trainees. As for illegal content, like that discussed above, WebPurify’s co-founder Josh Buxbaum commented, “While a picture is worth a thousand words, in the case of this kind of content, we use the thousand words. Having trainers describe content in extensive detail can help with topics that simply can’t be conveyed with imagery.”
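If you prepare training materials programmatically, softening images is straightforward. Here is a minimal sketch using the Pillow library; the file paths and blur radius are illustrative, and this is just one way to apply the grayscale-and-blur idea described above.

```python
# Minimal sketch: soften a training image before trainees see it.
# Paths and parameters are illustrative only.
from PIL import Image, ImageFilter

def soften_for_training(src_path: str, dst_path: str, blur_radius: int = 8) -> None:
    """Convert an image to grayscale and blur it to reduce its visual impact."""
    img = Image.open(src_path)
    img = img.convert("L")                                    # grayscale
    img = img.filter(ImageFilter.GaussianBlur(blur_radius))   # soften detail
    img.save(dst_path)

soften_for_training("training_example.jpg", "training_example_softened.jpg")
```

The same idea extends to video: extracting a handful of still frames (for example with a tool like ffmpeg) and running them through the function above turns a clip into a less harrowing storyboard.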

5. Remind moderators how important their jobs are

While the job can be challenging, plenty of content moderators find it deeply rewarding. Keep reminding employees that content moderation matters, especially in cases of graphic material such as child sexual abuse, where moderators can play a key role in putting dangerous predators behind bars. Sharing real-life examples of this with your team can be impactful; some of the best-performing moderators are those who find meaning in their jobs.

How WebPurify can help

At WebPurify, we can take the challenges of training and overseeing an in-house moderation team off your hands. We train our moderators in the most sensitive and least harmful way possible while ensuring they are fully prepared for their work. Our live moderation teams operate 24/7 from our own office, carefully moderating your platform’s UGC so that your users and in-house teams are not exposed to disturbing submissions. We use a highly effective approach that combines our AI and live moderation teams: WebPurify’s AI can flag NSFW content before it is even posted to your site, while anything questionable can be reviewed by our moderators within minutes.
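As a rough illustration of what such an AI-first, human-in-the-loop flow can look like, here is a minimal sketch in Python. The function names, thresholds, and return values are hypothetical placeholders, not WebPurify’s actual API.

```python
# Hypothetical sketch of an AI-first moderation flow with human review
# for borderline cases. Thresholds and helpers are illustrative only.

REJECT_THRESHOLD = 0.9   # confidently unsafe: block before it is posted
REVIEW_THRESHOLD = 0.4   # uncertain: route to a live moderator

def ai_nsfw_score(image_bytes: bytes) -> float:
    """Placeholder for an automated NSFW classifier returning a 0-1 score."""
    raise NotImplementedError

def send_to_live_review(image_bytes: bytes) -> None:
    """Placeholder for queueing borderline content for human moderators."""
    raise NotImplementedError

def moderate_upload(image_bytes: bytes) -> str:
    score = ai_nsfw_score(image_bytes)
    if score >= REJECT_THRESHOLD:
        return "rejected"           # never reaches users or in-house teams
    if score >= REVIEW_THRESHOLD:
        send_to_live_review(image_bytes)
        return "pending_review"     # held until a moderator decides
    return "approved"               # posted immediately
```

The design point is the middle band: automation handles the clear-cut cases at scale, while the genuinely ambiguous content, the kind no ruleset fully anticipates, goes to trained people.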

Conclusion

Content moderation is a difficult but incredibly important job. AI cannot flag all harmful content, so human moderators are an essential part of the workflow. Training them properly while protecting them from the impact of the content they view is a delicate balance. Approaches such as describing harmful images instead of displaying them, using blurred or grayscale images, and offering employees access to mental health professionals can go a long way toward making the content moderation role empowering and rewarding.