Which content moderation service is best for my business?

March 22, 2024 | Content Moderation Case Studies

Whether you’re a burgeoning startup, a medium-sized enterprise, or a large corporation, your brand’s specific needs for content moderation can vary significantly based on your industry, market, and audience size. Prospective clients often ask us, ‘Which content moderation service is best for my business?’

Recognizing that a one-size-fits-all approach falls short, WebPurify adopts a consultative strategy to guide new or prospective clients towards the content moderation solutions that best fit their current requirements, while also anticipating the evolution of those needs over time.

It can be daunting trying to figure out which blend of moderation tools, including AI and/or human moderators, is right for your platform, especially when considering the unique challenges and objectives of your brand. We believe our commitment to understanding these nuances is one of the many things that sets us apart in our industry.

We like to start our journey with a conversation that probes your current needs, delving into the specifics of your brand’s sector, scale, and target market. From there, we ask questions about your future plans and resources, among other considerations, in order to tailor a service that aligns perfectly with your goals.

From startups looking to make their initial mark in the App or Play Stores to established e-commerce giants aiming to maintain their reputation, our adaptive and consultative approach ensures that every brand finds its ideal content moderation partner.

WebPurify’s Vice President of Sales, Bartell Cope, leads these conversations with our new clients, and in this blog we’ll explore the questions and key considerations he tries to understand in order to help you better discern which content moderation service is best for you.

Do you know what to look for in a content moderation service?

1. Start Simple and Scale

“As much as we always want people to have the most robust solution and comprehensive trust and safety approach, we recognize there’s a right time for everything,” Bartell says. “We don’t oversell. If you’re just launching and unsure of your content volumes or exactly what types of violative content you’re most likely to face, you should stick with an entry-level option.”

Bartell suggests that brands start by covering the basics. For example, before concerning themselves with bespoke human moderation teams or enforcing custom criteria, most brands are better advised to stick with checking for the “big stuff” – things like hate symbols, nudity, and profanity. Beginning with an entry-level option allows you to cover fundamental moderation needs without overwhelming your budget or resources.

As your understanding of your moderation requirements grows, along with your appreciation of the versatility but also the limitations of moderation tools, you can scale your services to include more advanced features. Likewise, as your business scales and you have a better idea of your content volumes month over month, you’ll be in a better position to commit to a volume minimum or a long-term moderation contract, both of which come with lower price points.

2. Understand Your Needs

Content complexity and volume are critical factors in choosing a moderation service. Determine whether your primary focus is on text, images, videos, or audio, and consider the specific nature of your content, whether that’s user comments, chat messages, product reviews and listings, or livestream vs. pre-recorded video.

“Platforms need to think about their key criteria,” Bartell explains. “Do you already have humans moderating content but need AI to scale? If so, you can use our AI in tandem with your human team. Likewise, if you have AI but need humans, we can do that too. While we can certainly be your comprehensive solution, we don’t insist on it if that doesn’t make sense. We can be one-half of the whole.”

Understanding these characteristics of your use case helps you select a service that’s adept at handling your specific content type and volumes efficiently, and able to meet your needs where they stand at the moment.

3. Turnaround Time Matters

“The responsiveness of your moderation service can significantly impact user experience,” Bartell points out. “When content is delayed going live, you’re going to see user frustration if not outright attrition pretty quickly.”

Clients also need to think about the expectations of their users in terms of the turnaround time for moderation. Photo contests, for instance, are a bit more relaxed with moderation turnaround times. You can approve these in 30 minutes in most cases without users becoming unhappy.

Likewise, dating apps have a more relaxed timeline for approval. Social media users, though, demand near-instant validation. Similarly, on e-commerce sites that offer product customization, immediacy is key; you don’t want to interrupt the customer journey, or the customer will abandon their purchase before completion.

These user expectations figure into whether you can rely on AI alone or, in the case of more nuanced moderation criteria, need to lean much more heavily on human moderators. You need to choose a moderation service that offers the agility to meet your users’ expectations. Turnaround time is paramount.

4. Budget and Personnel

Evaluate your financial and human resources to determine if building an in-house moderation team or refining an existing system is actually feasible. If resources are limited, leveraging WebPurify’s services can provide a cost-effective and efficient solution.

You should also consider the level of integration complexity you’re prepared to manage, and ensure you have the technical expertise to implement it seamlessly. Integrating our profanity filter is exceptionally easy; anyone can do it. But if you are integrating a variety of our solutions, each tackling different requirements or (not uncommonly) enforcing slightly different sets of rules for different segments of your audience, the initial integration will still be seamless, but you’ll probably need someone on your team with experience doing this.
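
To give a feel for what the simplest level of integration looks like, here is a minimal sketch of a text profanity check over a REST API. The endpoint, method name, parameters, and response fields below are assumptions modeled on a typical REST profanity-check call, not a copy of WebPurify’s documented interface; consult the official API documentation before integrating.

```python
# Illustrative sketch only: endpoint, method name, and response shape are
# assumptions, not WebPurify's documented API.
import requests

API_KEY = "your-api-key"  # hypothetical placeholder
ENDPOINT = "https://api1.webpurify.com/services/rest/"  # assumed base URL

def contains_profanity(text: str) -> bool:
    """Return True if the moderation service reports profanity in `text`."""
    resp = requests.get(ENDPOINT, params={
        "method": "webpurify.live.check",  # assumed method name
        "api_key": API_KEY,
        "text": text,
        "format": "json",
    }, timeout=10)
    resp.raise_for_status()
    data = resp.json()
    # Assumed response shape: a "found" count of profane terms detected.
    return int(data.get("rsp", {}).get("found", 0)) > 0

if __name__ == "__main__":
    print(contains_profanity("hello world"))
```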

5. Pricing Models

What pricing model works for you? What are your volumes, and does it make sense to sign a contract for a better price per image, or should you use WebPurify on a pay-as-you-go basis for more flexibility?

“We work with 1 in 7 Fortune 100 companies, but we also work with lots of medium-size companies and startups,” Bartell says. “We know that choosing the right pricing model is crucial to align with your business’s financial realities and future.”

WebPurify doesn’t lock you in. We grow with you. Think about what engagement model you want to use with us. Whether your content volume is consistent or fluctuates seasonally will influence whether a fixed contract or a pay-as-you-go model is more economical. WebPurify offers flexibility to adapt to your business’s evolving needs.

6. Avoid Over-customization

Avoid any unnecessary work and manage your expectations. “Custom solutions can offer tailored results but at a higher cost,” Bartell says.

Before opting for heavy customization, assess whether adjusting your moderation criteria could address your needs more economically. Prioritize the most impactful moderation goals to streamline your process and reduce unnecessary expenditures. In other words: don’t obsess over niche cases.

However, we recognize that some clients, particularly enterprise businesses, often require customization and we can always accommodate this.

7. Audio Moderation

While we offer a very good audio moderation service and of course want clients to use it, cost-conscious customers need to keep their relative priorities in mind. Audio content presents unique moderation challenges and often requires more resources due to the nature of the content and additional processing time, in the case of both AI and human review.

Many customers initially approach WebPurify planning to moderate all content types, including audio, but eventually deprioritize it. The fact of the matter is that most content violations online are textual or visual, and it’s actually quite rare for inappropriate audio not to be accompanied by inappropriate visuals. Evaluate the necessity of audio moderation for your platform and weigh it against the potential increase in costs and turnaround times.

8. Company Life Cycle

“Consider where you are in the life cycle of your company,” Bartell advises. “Are you building your platform or is it already going strong? Are you pre-empting something (proactive) or are you closing a moderation gap and responding to something slipping through (reactive)?

“Let’s say you’ve already built your app and you have some immediate problems with violative content you need to rectify. I’ll tell these clients, ‘Let’s get some AI in place to stop the bleeding first, then have a detailed conversation about more nuanced approaches later.’”

Alternatively, if you’re coming to WebPurify pre-development or in the midst of building out a platform, this presents a prime opportunity to take things slower and wireframe your UX alongside your moderation logic. Here Bartell suggests not just integrating our AI or retaining our human moderation services, but also taking advantage of our Trust and Safety consultancy.

Your moderation needs will evolve with your company’s growth. WebPurify’s adaptable services cater to companies at any stage, ensuring you have the right support as your needs change.

9. Subjectivity vs. Objectivity

One of the biggest questions for new clients is whether their content moderation challenges require the discernment of human moderators or can be effectively managed with AI. While AI excels at identifying clear-cut violations, human moderators are better suited to content that requires contextual understanding or nuanced judgment.

So the question you need to ask is, ‘Is my content subjective or objective?’ We find that, with images and video in particular, humans are more needed than many prospective customers first realize. And the growing pervasiveness of generative AI is making this even more the case.

An important lesson for UGC platforms is to concede when you can’t do something with AI alone. At WebPurify, we’ve found that our willingness to acknowledge this, own where AI still falls short – ours or any other model on the market – and embrace human moderation as a complementary solution has set us apart within the industry.

10. Consultancy Services

As touched on earlier, WebPurify offers standard-setting consulting services. Do you have a vision for your trust and safety strategy? Whether you’re defining said strategy or looking to enhance your existing approach, our consultancy services provide expert guidance, bringing a collective 45+ years of industry experience to the table.

“From threat analysis to workflow creation, red teaming to legal compliance, and even helping you hire and train your own human moderators (not just onboarding ours), our team can help you refine your moderation strategy in a way that’s futureproofed and designed to scale,” Bartell adds.

11. Moderation Tools

Do you simply need skilled, veteran moderators, ready to dedicate themselves to your project, or do you also need a tool in which they’ll organize the content they’re reviewing, and take actions accordingly? Do you have your own tool, ready for a team to plug in? WebPurify offers flexibility in integrating with your existing moderation tools or providing a comprehensive solution with our proprietary platform. This adaptability ensures that you can choose the approach that best fits your operational workflow and needs given your current setup.

12. Content Access

Platforms need to consider how WebPurify accesses the content being moderated. WebPurify’s approach to content moderation emphasizes privacy and security, and it follows that we require clients’ content to be hosted – not passed along as actual files (.pngs or .mp4s, for example). By accessing hosted content via URLs, we avoid storing client data at all, which strengthens data security and compliance.

In short, to work with WebPurify, you’ll need to ensure your content is accessible in the cloud. If it’s not (which is rare but does happen), some architecture will need to be reworked before partnering.
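
In practice, the hosted-content pattern described above means the client submits a URL to an asset it already hosts (on S3 or a CDN, for example) rather than uploading the file itself. The sketch below is illustrative only; the endpoint, method name, and response fields are assumptions rather than WebPurify’s documented API.

```python
# Illustrative sketch of submitting a hosted image URL (not the file bytes)
# for moderation. Endpoint and parameter names are assumptions.
import requests

API_KEY = "your-api-key"  # hypothetical placeholder
IMG_ENDPOINT = "https://im-api1.webpurify.com/services/rest/"  # assumed

def submit_image_for_review(image_url: str) -> str:
    """Submit a publicly reachable (or signed) image URL for moderation."""
    resp = requests.get(IMG_ENDPOINT, params={
        "method": "webpurify.live.imgcheck",  # assumed method name
        "api_key": API_KEY,
        "imgurl": image_url,                  # the hosted URL, never the raw file
        "format": "json",
    }, timeout=10)
    resp.raise_for_status()
    # Assumed response: an image ID you can poll or receive a callback for.
    return resp.json()["rsp"]["imgid"]

# A short-lived signed URL keeps the asset private while still letting the
# moderation service fetch it, so the file itself is never stored by the vendor.
# submit_image_for_review("https://cdn.example.com/uploads/abc123.png?sig=...")
```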

13. On-Premises vs. SaaS

Occasionally clients want to actually purchase a license to our product and own a version of it, rather than just subscribe to access, which is the more typical SaaS model. Financial services firms, for instance, are sometimes partial to on-premises solutions for security reasons. The decision between an on-premises solution and our SaaS offering depends on your security requirements and operational preferences.

“While on-premises solutions offer some incremental benefits from an ‘absolute control’ standpoint, it’s important to consider that they entail a large one-time upfront cost and are more cumbersome to update compared with WebPurify’s SaaS model, which is updated continuously.

“Plus, remember WebPurify lives in AWS (Amazon Web Services), which is already encrypted and boasts enviable uptimes. There’s marginal security gained, if any, by going on-prem,” Bartell explains. Still, for some brands it’s an accommodation we’re happy to make if it’s simply a matter of internal processes and their business protocols.

14. Human Moderation Options

WebPurify provides versatile human moderation services, from teams reviewing many clients’ content against a standardized set of criteria to custom monitoring enforcing bespoke rules. This flexibility allows you to choose a service level that matches your moderation goals, budget, and existing resources.

When choosing between dedicated custom teams and the more standardized “TurnKey Live” moderation, we recommend brands take stock not just of their budget, but also their desired SLA, the complexity of their criteria, and whether they’ll be leveraging AI, too. For example, if you’re already running content through AI as a “first pass” and are mainly worried about nudity, it probably makes sense to escalate any middling “possible nudity” scores to a TurnKey Live team for a second check.

On the other hand, if your first priority is flagging images of irresponsible drinking, you’ll want to use AI to flag images of alcohol but will then need to escalate those to a dedicated custom team, since “irresponsible drinking” is subjective and something that needs a custom team trained on your particulars to enforce.
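
To make the “AI first pass, escalate the middle” workflow above concrete, here is a minimal sketch of the routing logic. The score thresholds, category names, and queue names are hypothetical; tune them to your own criteria and to the confidence scores your AI actually returns.

```python
# Hypothetical routing of an image based on AI confidence scores (0.0-1.0).
def route_image(ai_scores: dict) -> str:
    """Return an action or escalation queue for one piece of content."""
    nudity = ai_scores.get("nudity", 0.0)
    alcohol = ai_scores.get("alcohol", 0.0)

    if nudity >= 0.90:
        return "reject"                  # clear-cut violation: AI alone is enough
    if 0.40 <= nudity < 0.90:
        return "escalate:turnkey_live"   # ambiguous score: standardized human check
    if alcohol >= 0.50:
        # "Irresponsible drinking" is subjective, so a custom-trained team decides.
        return "escalate:custom_team"
    return "approve"

if __name__ == "__main__":
    print(route_image({"nudity": 0.62, "alcohol": 0.10}))  # -> escalate:turnkey_live
```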

15. Adapting to Change

You should also think about whether your moderation needs will shift in response to new trends, societal changes, or emerging challenges. For instance, will election season bring an influx of political comments to your app that need nuanced review? Or are you a platform with straightforward and mostly static needs, such as screening out profanity in an e-learning game for elementary school-aged kids?

Select a moderation service that can adapt quickly to whatever changes you anticipate, ensuring your platform remains a safe and positive environment for users.

16. Special Requirements

WebPurify accommodates diverse moderation needs, allowing you to apply different rules for various groups of your user base or product segments. Do you need content moderation for livestreams? Are your needs restricted to periods of time like the holidays or current events? We have lots of clients who aren’t year-round customers.

Likewise, you might want different moderation rules for different parts of your product, or you may want to subject tiers of your users to different sets of criteria – an 18-and-under section, more stringent and frequent checks on uploads by new and unvetted accounts, or a premium version of a product. You can do that with us, and this capability ensures that your moderation strategy can be as nuanced and targeted as your platform requires.
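
As a simple illustration of segment-specific criteria, the sketch below maps hypothetical audience tiers to different rule sets. The segment names, rules, and user attributes are invented for illustration; the point is only that moderation criteria can vary per tier rather than being one global policy.

```python
# Hypothetical per-segment rule sets, as described above.
from dataclasses import dataclass
from typing import Optional

MODERATION_RULES = {
    "under_18":     {"profanity": "block", "alcohol": "block", "nudity": "block"},
    "new_accounts": {"profanity": "block", "alcohol": "flag",  "nudity": "block",
                     "review_every_upload": True},
    "standard":     {"profanity": "flag",  "alcohol": "allow", "nudity": "block"},
}

@dataclass
class User:
    age: Optional[int]
    account_age_days: int

def rules_for(user: User) -> dict:
    """Pick the rule set that applies to a given user."""
    if user.age is not None and user.age < 18:
        return MODERATION_RULES["under_18"]
    if user.account_age_days < 30:          # new, unvetted account
        return MODERATION_RULES["new_accounts"]
    return MODERATION_RULES["standard"]

if __name__ == "__main__":
    print(rules_for(User(age=16, account_age_days=400)))  # -> under_18 rules
```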