
The role of collaboration in combatting child exploitation online

July 2, 2024 | UGC

If there’s any cause that should unite social media platforms, government, civil society and the public, it’s child safety. Recently, our VP of Trust & Safety, Alexandra Popken, spoke to digital safety advocate and child protection expert Caroline Humer to find out why it’s so important for these groups to work together to protect children in an increasingly perilous online world.

Originally from Switzerland and now based in the US, Caroline started her career investigating train crashes for the British Transport Police. She went on to work for the US-based National Center for Missing & Exploited Children (NCMEC) and its sister organization, the International Centre for Missing & Exploited Children (ICMEC), before setting up her own consultancy in 2021 advising companies on how to build frameworks or mechanisms to protect the most vulnerable populations online and offline.

“Like a lot of people who end up working in child protection, it wasn’t something I was seeking out,” she explains. “I didn’t know this industry existed when I graduated 25 years ago, so it was something that fell into my lap, so to speak. While the child protection community has been around for quite some time, the Trust & Safety industry is relatively new. We’re still working out how the child protection community fits into Trust & Safety – trying to figure out what makes sense, what works, and where the experiences we’ve all had fit in.”

Caroline Humer on the challenges platforms face in detecting and reporting CSAM, and the need for effective tools and policies

The impact of legislation

In April 2003, the US government enacted the Prosecutorial Remedies and Other Tools to End the Exploitation of Children Today (PROTECT) Act requiring platforms to report any child sexual abuse material (CSAM) they encountered to NCMEC’s CyberTipline. Platforms didn’t have to search for it, but if they knew it existed on their sites, or if a user reported it to them, they had to pass it on.

Even though proactive detection was not required, many platforms began voluntarily using their own or third-party tools to search for CSAM, and reports have been rising significantly ever since.

Over the course of 2023, reports to NCMEC rose more than 12% compared with the previous year, surpassing 36.2 million in total. The majority related to the circulation of CSAM, but other trends indicated by the data were the continued rise in reports of financial sextortion and the use of generative AI in child sexual exploitation.

Collaboration is key

There’s a huge amount of data in 36 million reports, and Caroline believes the only way the industry can get anywhere close to analyzing it effectively, and ultimately bringing that number down, is by working together and sharing best practices.

“We’ve got so many players in the child protection space: non-profits, law enforcement, and task forces. The biggest challenge is that there isn’t enough collaboration. We see each other as competitors, and we don’t want to give away our trade secrets, but we need to be able to share. We need to be able to say, ‘this is the mechanism we use, and it works 90% of the time.’

“We need to understand the trends and what’s coming next. If we don’t analyze the data as much as we analyze the offenses, then the offenders will take advantage of us not knowing what’s going to happen.”

Leading the way

As an example of what’s possible when organizations join forces, in 2023, the Tech Coalition launched Lantern – the first child safety cross-platform signal sharing program for companies to strengthen how they enforce their child safety policies.

“Lantern is an amazing project that brings together a massive amount of platforms who say, ‘Let’s share data, but not compromise our own privacy or our own integrity,’” says Caroline. “The Tech Coalition figured out a way to do it – despite all the laws and legislation. Do we need to know exactly how they’re doing it? No, as long as it’s effective. It’s a great program that shows collaboration between platforms is doable and can be successful. Let’s do more of it, instead of saying, ‘Oh no, we can’t because we’re competitors.’”

Utilizing the latest tech

Caroline believes AI could transform content moderation, potentially shielding content moderators from the very worst aspects of the role and enabling them to do their jobs more effectively – but human intervention will still be necessary to achieve the best results.

“AI is never going to be 100% correct, but we can get better at CSAM detection by having more of a structure in place, or more tools that allow us to make the content more manageable before it gets to the human,” says Caroline. “We need to find that balance.”
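To make the idea of “making content more manageable before it gets to the human” concrete, here is a minimal, hypothetical sketch of an automated triage step: known-hash matches are handled without exposing a reviewer, confidently benign items are filtered out, and only ambiguous or high-risk items reach a person. The function names and thresholds are illustrative assumptions, not any specific platform’s or vendor’s implementation.

```python
# Hypothetical sketch of AI-assisted triage ahead of human review.
# Thresholds and names are illustrative only.

from dataclasses import dataclass
from enum import Enum, auto


class Route(Enum):
    AUTO_REPORT = auto()   # matched a known-CSAM hash list: report without human exposure
    HUMAN_REVIEW = auto()  # uncertain or high classifier score: a person must decide
    NO_ACTION = auto()     # confidently benign


@dataclass
class TriageResult:
    route: Route
    reason: str


def triage(hash_match: bool, model_score: float,
           high: float = 0.9, low: float = 0.2) -> TriageResult:
    """Route an item using a hash-match flag and a classifier score in [0, 1].

    Real systems tune these thresholds against precision/recall needs
    and legal reporting obligations; the values here are placeholders.
    """
    if hash_match:
        return TriageResult(Route.AUTO_REPORT, "matched known-CSAM hash list")
    if model_score >= high:
        return TriageResult(Route.HUMAN_REVIEW, "high model score; confirm before reporting")
    if model_score <= low:
        return TriageResult(Route.NO_ACTION, "confidently benign")
    return TriageResult(Route.HUMAN_REVIEW, "ambiguous score; needs human judgment")


if __name__ == "__main__":
    print(triage(hash_match=False, model_score=0.55))
```

The point of the sketch is the balance Caroline describes: automation absorbs the clear-cut cases at both ends, while human judgment stays in the loop for everything the model cannot decide with confidence.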

Despite everything she has been exposed to, including the seemingly endless rise in reports of child exploitation and the additional threats posed by AI-generated CSAM, Caroline is positive about the future.

“I am hopeful,” she says. Regarding AI-generated CSAM, Caroline notes, “I do believe AI will get better. In the simplest terms, it’s the same as an Excel spreadsheet. If you put good data in, you’re going to get good data out. If we don’t do that, it’s going to get worse, but there’s plenty of people in the AI industry who are ensuring it’s going to get better rather than worse.”

As to whether CSAM detection will improve over time, Caroline is also hopeful. “We’ve got a massive amount of technology out there that works fantastically well,” she adds. “If we harness all of that effectively, we can make a huge impact. I would love to see everybody coming together and using every tool that’s available, regardless of where it’s coming from, so we can build a comprehensive and holistic response to protecting kids online.”

Next time: The challenges platforms face in detecting and reporting CSAM, and the need for effective tools and policies.
