
How to be a content moderator

April 8, 2024 | Careers

If you’re seeking to break into the tech industry in a role that makes a significant impact, you should consider content moderation. As a content moderator, you’ll have a direct hand in crafting safer, more respectful digital environments – work that’s as fulfilling as it is important.

Content moderation is also your chance to be at the forefront of digital culture, turning challenges into opportunities for growth and learning, all while contributing to a more positive and inclusive online community. If you’re passionate about technology and driven by purpose, content moderation is a career path where you can shine.

Ravi Yekkanti has over a decade of experience recruiting and training content moderators at WebPurify, who collectively moderate 3.5 million text submissions, a million images, and tens of thousands of videos every single day.


WebPurify’s Training Director Ravi Yekkanti

He explains that the minimum qualification for the job is having a university degree, but strong communication skills and being well-versed in internet culture are also useful. “Moderation goes well beyond differentiating between an image that does or doesn’t have nudity, or a sentence with or without a profane word,” he explains. “Where it gets interesting, and where WebPurify’s highly trained moderation team sets itself apart, is parsing the more nuanced stuff.

“Context, the geography of the audience in question, and the specific enforcement criteria of a community or brand all matter. What’s more, moderators need to fold all of this into their decision-making quickly, and that takes an impressive skill set.”

Still, these are abilities content moderators develop over time, with training and experience. The low initial barrier to entry makes content moderation a good option for new graduates, or for anyone seeking to change career paths.

As Ravi explains, “Content moderation is a skill that’s very easy to learn. It’s trainable, so long as you show up with an open mind, have good time management abilities, and a willingness to adapt to new brand criteria and new projects.”


Why it’s important

Even so, just because it’s easy to learn doesn’t mean content moderation is an easy job. What’s unique to content moderation, stresses Ravi, is that you’re making an important difference not just in people’s day-to-day lives, but arguably in society as a whole. Without moderation work behind the scenes, the online communities and services we use regularly can quickly become toxic, inaccurate, or even prone to illegal activity. In other words, the stakes are quite high.

Furthermore, moderation can admittedly be taxing. Reviewing thousands of submissions daily is no small feat, and fatigue can absolutely set in without proper break times, employee wellbeing resources and support (all things WebPurify uncompromisingly prioritizes). When the going does get tough, it also helps, we find, for moderators to regularly remind themselves how instrumental their role is, and of the many positive downstream effects stemming from their work. As a moderator, you’re making the world a better place.

And don’t worry, you won’t be thrown in at the deep end. Any respectable moderation vendor will provide comprehensive training that gradually levels up, preparing the team for the work ahead.

Understanding the context

Unlike many companies, WebPurify allocates moderators to work on specific projects, for a specific client; we don’t spread personnel thin across several platforms. If and when an engagement ends, moderators are transitioned to another client, with fresh training in between. How long your training takes depends largely on the project in play: in practice, it can range from one to two hours for simpler projects to one or two weeks for more complex ones.

Much of this training centers on understanding the platform or website you’ll be helping to moderate, digesting its community guidelines, and so on. But it’s not purely nuts and bolts: the training will also help you understand the human context behind the rules you’re helping to enforce and any idiosyncrasies within a particular audience.

“For example, if we’re talking about policies regarding hate speech,” explains Ravi, “we might encounter an interview of an older lady who was in a concentration camp, and who talks about her horrible experiences there. AI might flag mentions of this as hate speech, but our human team understands there’s a historical context at play, and that in fact, this tragic event is being condemned, not elevated.”

Learning to emotionally distance

Training also involves enabling recruits to emotionally distance themselves from potentially disturbing content, to protect their own mental health, Ravi adds.

“Speaking from experience, there’s a series of emotions that you go through when you first get exposed to sensitive content,” he explains. “So one of the main things that we teach new content moderators is to – in a healthy way – compartmentalize that emotional perspective and objectively do the job.

“For instance, with practice you begin to see a body in an image as opposed to a person. You see a body part and you make a judgment: is that body part supposed to be there or not? When you think like this, you reframe things and make it manageable when looking at sensitive stuff.”

He adds that you’ll also be taught techniques to help you deal with sensitive content, such as moving through it quickly without compromising accuracy of review, or turning off the sound when appropriate.

Building up slowly

Ravi explains that you won’t be expected to work at 100% speed from day one. Once you’ve completed your training, you’ll normally undergo a “ramp-up phase” where your workload gradually increases from 25% to full productivity over the first month.

So what does this all look like in practice? Typically, the core responsibility involves being assigned to a simpler use case, reviewing numerous pieces of user-generated content and making yes/no decisions: should this be allowed or not?

“If you’re processing 100 tickets or looking through 100 images in an hour, what you’re actually doing is taking 100 decisions,” Ravi says. “This repetitive decision-making ultimately becomes a ‘habit’ of discernment until it’s nearly second nature.”

Essential skills for content moderation

If all of this sounds appealing, you might be thinking: what are the key skills needed by someone looking to start a career in content moderation?

Alex Popken, WebPurify’s VP of Trust & Safety, points out that one doesn’t go to college to become a content moderator – it’s not exactly a course track or a choice of major. Content moderation has a way of finding you, which is exactly what happened in her case. She started her career in finance, before Trust & Safety even existed as a concept, moderating employee emails before they reached clients – a role that eventually led her to Twitter in 2013, where she broke new ground as the company’s first dedicated moderator for its growing advertising business.

“Keen attention to detail and the ability to spot a needle in a haystack are the essential skills for any content moderator,” Alex explains. “The reason why this is important is because most often, content is benign – you’re looking for that outlier, the less than 1%.

“Critical thinking is also essential. As much as content moderation would be simpler if black-and-white, it rarely is – user-generated content is nuanced because people are nuanced, community guidelines rarely solve for every use case, and the whole thing requires strong judgment. Analytical skills are important too, particularly when assessing the riskiness of users who attempt to game platforms.”

Alex points out that for an entry-level content moderator, skills matter more than experience – things like attention to detail, critical thinking, and analytical prowess. Among more tenured Trust & Safety professionals, we tend to see a lot of crossover with law, law enforcement, intelligence, academia, government and military backgrounds, which translate really well to this work for a variety of reasons.

Career prospects

While working as a content moderator begins as an entry-level position, that doesn’t mean you have to stay there forever.

Ravi explains that committed content moderators can find ample opportunities for career growth within the industry. He himself exemplifies this trajectory, having advanced from entry-level moderator to trainer and then manager.

“Anyone can follow a similar course,” he adds. “If you are dedicated enough, if you are paying attention enough, you can make a career in content moderation.”

At the same time, Ravi acknowledges that content moderation isn’t for everyone. It demands a specific mindset to handle potentially disturbing content with poise. His advice? “Content moderation is not for all. But if you’re willing to learn, it has a lot of perks that so many people don’t realize.”

Alex adds that content moderation is just one facet of the broader Trust & Safety ecosystem, within which there are a whole host of career paths. From data scientists to product managers to policy writers, the roles that comprise this field are vast, and the skill sets too.

Profound rewards

For those who can maintain the necessary objectivity, embracing a content moderation role can be extraordinarily rewarding. Ravi shares a powerful story of how moderating a concerning review ended up saving a kidnapped child’s life. “That changed my life,” he reflects. “I realized that this work can be very important and very much has real-world effects.

“Overall, our work at WebPurify has assisted in the arrest of more than a thousand suspected child predators,” he adds. “That means at least a thousand kids’ lives that were changed for the better, or even saved. That’s a statistic we’re especially proud of.”

In summary, pursuing a career in content moderation requires resilience, sound judgment, and a willingness to take on the responsibility of protecting online communities.

While straightforward, the work carries profound significance that often gets overlooked. And with the proper training and mindset, it offers an accessible entry point into a tech-adjacent field with potential for skills growth and career advancement.