
The intricacies of CSAM moderation: 3 key insights

September 7, 2023 | UGC

The nature of user-generated content (UGC) and other online content is vast and ever-changing, often reflecting the complexities – and sometimes the darker corners – of the human experience. One of the most troubling aspects we confront as content moderators is Child Sexual Abuse Material (CSAM), a grim reality that requires unwavering commitment to mitigate and report. Moderating for CSAM is the most difficult part of our business, as it challenges both the technological capabilities of moderation systems and the emotional resilience of human moderators who are tasked with intercepting and flagging this content.

Against this backdrop, WebPurify is proud to release a new ebook that profiles our specialized CSAM moderation team – a group of dedicated professionals who are highly trained to work on the front lines of this critical issue. Inside you’ll find insights into our methodology, team composition, and the support systems in place to ensure their wellbeing. It’s a candid look at what it truly takes to tackle CSAM effectively and humanely.

Last year alone, WebPurify’s efforts in CSAM moderation led to the arrests of more than 500 child predators, underscoring the real-world impact of our work. This is not just a job to us; it’s a mission to make the internet a safer space for everyone, especially for children.

For those interested in understanding the complexities and ethical responsibilities associated with CSAM moderation, we invite you to download our free ebook.

In the meantime, below are three key insights about hunting down this type of harmful content, drawn from our nearly two decades in the business.

Meet WebPurify's moderation team, responsible for the detection of Child Sexual Abuse Material

1. The importance of a layered approach: AI and human expertise

As artificial intelligence revolutionizes industry after industry and makes moderation easier on humans, it’s easy to wonder: can’t AI handle content moderation by itself? The short answer is no, especially when it comes to sensitive and high-stakes material like CSAM. At WebPurify, we employ a multi-tiered strategy that incorporates both AI and human expertise to ensure CSAM is identified with the utmost accuracy.

While AI provides the first line of defense, it’s the human moderators who offer the essential second layer of review. This combination harnesses the speed of technology and the nuance of human judgment to create a system that’s both efficient and incredibly accurate.

“In CSAM moderation, even a 99% accuracy rate isn’t good enough. Our blend of AI and human oversight aims for nothing less than 100%,” says Josh Buxbaum, co-founder of WebPurify.

2. Emotional resilience and the mental wellbeing of moderators

Behind every content moderation system, there’s a team of human moderators carrying the emotional weight of the work. It’s an aspect of content moderation that often gets overshadowed by technological solutions. WebPurify places great emphasis on the mental wellbeing of our moderators, incorporating features like black-and-white image display and blurring into our internal tools to mitigate psychological strain.

What’s more, our moderators are kept updated on the real-world impact of their work, which can be a powerful motivator in a job where one can often feel isolated.

For more on WebPurify’s approach, see our ebook on Best Practices for Mental Wellness in Content Moderation.

“Our human moderators are our last line of defense and our most valuable asset. Their mental well-being is not just an ethical obligation, it’s a strategic necessity,” Josh says.

3. Specialized training and team composition

The job of a CSAM moderator isn’t for everyone. It requires not just training, but also a certain level of emotional resilience, ethical commitment, and dedication to the mission. WebPurify is very selective when choosing moderators for this challenging role. By recruiting from within our already experienced team, we ensure that only the most qualified and emotionally resilient individuals are put on the front lines of CSAM moderation. This selectivity, coupled with extensive training and the guidance of Subject Matter Experts (SMEs), ensures that moderators are not just capable but also well-supported.

“The right training can turn a good moderator into a great one. But the right team can turn a mission into a movement,” Josh says.