WebPurify offers two image moderation services: a standard and a custom solution. Many websites and apps on a budget have found great success with WebPurify’s standard image moderation offering. You receive quality service from our trained in-house professionals, who check images within five minutes of posting, with no big contract attached and at a low cost to you. The standard image moderation criteria were designed to catch the “bad stuff” that most brands won’t allow on their sites, such as nudity, violence, hate, drugs, and offensive gestures. If you need to modify the standard criteria to suit your brand, you likely need our custom service, where we train and dedicate a team specifically to your project.
We call custom image moderation “the business of defining gray” because an image that is completely appropriate for one client might be completely inappropriate for another, or a single image may contain varying degrees of acceptability. The shades of gray between WebPurify’s standard image moderation criteria and a client’s tolerance for certain types of content can be shaped by cultural differences, political sensitivities, or simply what counts as relevant content for their brand.
The following are a handful of scenarios we have encountered where custom image moderation solutions were warranted:
We developed our standard criteria around North American social norms, and in the U.S. it’s typically not appropriate to feature nudity, such as topless photos. WebPurify’s standard moderation solution rejects nudity. However, in some South American and European countries where cultural norms are less conservative, some types of nudity might be considered suitable content. While some clients simply use our standard image and video services and accept the conservative nature of our moderation, that approach just doesn’t jibe with others. We work closely with our clients to determine custom criteria and identify where they draw the line based on their brand’s audience and culture.
As times change, so does our collective perspective on what is appropriate and inappropriate to show in images. Politically progressive clients might deem certain images, such as a marijuana leaf or a mother breastfeeding, perfectly acceptable today, whereas the same images may have been too shocking in the past. WebPurify’s standard moderation criteria stringently remove any photos of nudity or drugs. Since some clients may want to allow such images, we can help them manage to what degree, perhaps accepting an image of a marijuana leaf but rejecting a photo of someone holding a joint.
WebPurify often has clients whose apps, websites, or campaigns target a very specific niche. For the purpose of this example, we’ll call the niche “cat owners” and the app “a place where cat owners share cat photos and chat about their cats.” The client’s rule may be that each image must have a cat in it. But what happens when a user uploads a photo of a lion taken on an African safari? Or what if it’s a shot of cat apparel, a drawing of a cat, a cat on a cat food can or, yikes, a stuffed cat? Now the image moderation has become more complicated. It’s never just about “cat” or “no cat.”
These are all examples of the “shades of gray” WebPurify navigates daily. In our client consultations, we drill down on the minutiae as much as possible to provide a custom image moderation service that is brand-content appropriate for each customer.
Have a question about custom image moderation? We’d love to hear from you.