Take a seat in the moderator’s chair and consider the four photos above. A client has given simple instructions for moderation: images must contain a dog. Which photos should be approved and which should be filtered out?
Obviously, the corgi toast is a work of art, but does it meet the criterion of “dog”? Technically, it’s food. What about the stuffed toy dog in a cardigan? We’re sure you caught the wolf, but shouldn’t you check with the client just in case?
If “dog” or “no dog” is a more nuanced distinction than it originally appears, further questions around acceptable user-generated content only get trickier.
Profanity is often on our clients’ lists of unwanted content. What about profanity in other languages? What about racist comments? Content that is disparaging of your brand? You’re likely to want to moderate images and videos containing violence and nudity. What if a user posts an image with drug paraphernalia? Copyrighted images posted without the owner’s permission? Should you filter spam, and how are you drawing that line carefully enough to protect your advertisers? What if your product is widgets and some enterprising users start posting photos of gadgets to all your social media pages?
Your considerations for moderation guidelines are already growing (and trust us, they’ll continue to evolve).
Some of the above criteria can be handled with automated technology, but some require a live team of human beings. How many people do you need on the moderation team? What tools are they using? Do you have natural spikes in traffic, or a big upcoming ad campaign that will increase submissions? What’s the plan to scale for those increases? Do you require round-the-clock moderation? How quickly does the content need to be reviewed and posted (or filtered out)?
The criteria and implementation plan got complicated quickly, and those questions cover only the basics. There are more blind spots ahead, and navigating them requires experience and sophisticated solutions.
While user-generated content is an undisputed powerhouse among marketing tools, it’s also a leap of faith to relinquish some control of content to users. Throw in the infinite memory of the internet, and proper moderation becomes crucial.
WebPurify begins a partnership by working closely with our clients to determine custom criteria and identify guidelines based on their brand’s audience and culture. Once the project is live, we use AI to remove the obviously inappropriate content, while our team continues to drill down on minutiae, finally escalating any gray areas to the client. Between the “creativity” of users and continuous advances in technology, your moderation criteria should be ever-evolving.
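As a rough illustration of that tiered flow, a moderation pipeline can be thought of as a routing function: the model auto-removes clear violations, auto-approves clearly safe content, and sends everything in between to the human team, who may in turn escalate gray areas to the client. The function below is a minimal, hypothetical sketch; the names and confidence thresholds are illustrative assumptions, not WebPurify’s actual system or API.

```python
def route_submission(violation_confidence: float) -> str:
    """Route a user submission by the AI model's confidence that it
    violates the client's moderation guidelines.

    violation_confidence: 0.0 (clearly acceptable) to 1.0 (clearly violating).
    Thresholds are placeholder values chosen for illustration only.
    """
    if violation_confidence >= 0.95:
        # Obviously inappropriate: removed automatically by the AI layer.
        return "auto-reject"
    if violation_confidence <= 0.05:
        # Obviously fine: posted without human review.
        return "auto-approve"
    # Everything in between goes to the live moderation team, who can
    # escalate genuine gray areas (e.g. the corgi toast) to the client.
    return "human-review"
```

In practice the thresholds themselves would be tuned per client and per content type, which is one reason the criteria keep evolving.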