We’re just going to say it loud and proud: WebPurify was a pioneer of online moderation. When we launched the company in 2006, we were the only player in the game, and businesses were just beginning to realize they needed profanity filtering and image moderation services. Since then, the need for profanity filtering and image and video moderation has grown exponentially, alongside the explosion of user-generated content on the web and brands offering ever more interaction with their audiences. With this changing landscape has come an evolution in the jobs catering to this need. The team at WebPurify has watched a new set of roles emerge: Content Manager, Content Moderation Manager, Director of Content, Social Media Manager and similar titles.
Here, WebPurify’s Director of Sales and Client Services, Josh Buxbaum, talks about the essential role of content managers and how WebPurify helps them tackle content moderation problems.
When WebPurify first launched, who was the typical direct client contact?
“Initially, we mostly interacted with one of the investors in the business, or with the founder who recognized that content management was an important piece to take into consideration. Where agencies were concerned, we worked with the creative director or a senior project manager.”
How did clients find WebPurify way back when?
“They found us the same way they find us today: through online searches and referrals. We quickly became a referral business.”
Why do you think the role of content manager emerged?
“This has become a mandatory role within many companies due to the explosion of user-generated content. Some brands have had this person in place for a long time; younger companies have added the role as the popularity of UGC made its necessity clear. The role used to be focused on creating content, but it has since evolved to managing new, incoming content as well.”
What are they responsible for?
“To be successful as a brand and capitalize on listening to your audience, whether that involves reviews, images or campaigns, a content manager is essential. They manage the content piece of the company, a good portion of which is often UGC. They also wear the safety hat.
Some clients even go so far as to have a role called Head of Safety. That’s something we hadn’t seen in the past. We started seeing this position with some of our sites that are targeting a younger audience in particular. In these cases, safety is important enough that Head of Safety is a full-time job. That’s literally all that someone does.”
How does WebPurify help content managers?
“We provide insight on unique ways to tackle a problem they may be facing. Oftentimes that solution is a hybrid of AI solutions and our live teams. Content moderation has always been a complicated field when it comes to criteria: picking your battles with users and finding the line between letting them express themselves and protecting your brand. This is where we really step in as a consultant to help brands and their directors of content navigate the content moderation process. We ask the questions they may not have considered, to help create the right moderation criteria and keep their brand and users safe from various angles.”
How is it beneficial for the team at WebPurify to collaborate directly with content managers vs. a CEO, for instance?
“We love working with CEOs initially, but their view is much broader, as they oversee many aspects of the business. CTOs come to us initially as well, but their focus is more technical; they are drawn to WebPurify’s easy-to-integrate APIs. Typically, those roles then introduce us to their content manager or someone in a similar position, since that person is intimately familiar with the UGC their platform receives and their focus is right in line with ours. It becomes a true partnership with the content managers. They’re in charge of content; we’re in charge of keeping it safe.
Sometimes it does go the other way though. The creative role contacts us first to help with a particular problem. Then we hear from developers on the integration side. Sometimes creative says to the tech guys, ‘How can we solve this? What’s the best approach?’ Then, they find us.”
Do all brands now know that they need content moderation?
“We love it when a company has the forethought to contact us before their launch so that we can help from the get-go. In general, sites geared to kids, interactive ad agencies and dating sites are pretty familiar with the content moderation routine now. And that’s because they created an internal role to handle it, and if that’s your job, you get caught up pretty quickly on what you need to do to keep your client or brand safe.
Unfortunately, startups often still reach out to us as an afterthought. They haven’t considered the depth of the content moderation piece, so it’s a blind spot for them. For instance, they may be excited about launching their app as quickly as possible, and then, uh oh, they’re overwhelmed by a massive user response and realize their team needs urgent help monitoring content for profanity and more.”
How do you think WebPurify’s support of the content manager will continue to evolve?
“As our AI solutions get smarter, WebPurify is helping brands identify what can be solved with humans and what can be solved with machines. Technology is moving so fast and solving UGC risks requires a strong partner who remains current on these ever-evolving solutions and how to seamlessly blend them with a human team. They need someone who is really in tune with these advances, literally to the day. And that’s us.”