WebPurify Profanity Filter, Image Moderation, and Video Moderation Services

Faces Behind The Filters

Meet WebPurify’s moderation team responsible for the detection of Child Sexual Abuse Material (CSAM)


Find out who and what it takes to wage war on some of the internet’s most challenging content with strategy and staying power.

WebPurify’s team identified over a million pieces of child sexual abuse content, leading to more than 500 known arrests in 2022.

Over the past 17 years, WebPurify has come to understand the unique challenges of CSAM moderation. To identify this content accurately and remove it quickly, the company has developed an organizational and technological approach that supports its CSAM moderators – and protecting their mental wellbeing is a key part of it.

Here, we shine a spotlight on WebPurify’s expert CSAM team: who they are, how they work – and their advice on how your business can be best prepared.

“Faces Behind the Filters” is now available for everyone to read – no email sign-up required, no strings attached. Simply click here to access the eBook.
