If you’re running a content marketing campaign and considering using a crowdsourced solution to moderate user-uploaded images (UGC) on your website, we strongly suggest that you reconsider.
There are several important reasons why crowdsourcing for image moderation is a dangerous idea.
But before we address them, let’s briefly answer the question: What is Crowdsourcing?
Flashback to 2006: Crowdsourcing is a hot new concept that many claim will revolutionize the business world.
Simply stated, it’s where a company taps into the efforts of large online groups of people to accomplish various tasks. In other words, labor is sourced from a crowd.
Jeff Howe, who popularized the term in his Wired article, The Rise of Crowdsourcing, even goes as far as to suggest that crowdsourcing is “driving the future of business.”
But is crowdsourcing really living up to the hype?
Does Crowdsourcing Work?
Yes and no. There are certainly many crowdsourcing success stories.
Perhaps the most famous example is Wikipedia, the world’s largest encyclopedia, with over 26 million articles created and maintained entirely by volunteers. Anyone can anonymously edit a Wikipedia article — therein lies both its power and its pitfalls. But we’ll get to that shortly.
Microsoft is probably the most well-known corporate enterprise to successfully make use of crowdsourcing. For the beta test of Office 2010, Microsoft allowed 9 million people to download and test the software — and received over 2 million comments.
Planet Hunters, a citizen-science game built on data from NASA’s Kepler mission, found a way to harness the power of the crowd by enabling amateur astronomers to detect new planets. Volunteers have flagged 69 potential planets over more than 3 million rounds of Planet Hunters.
The Dangers of Crowdsourcing
While crowdsourcing has yielded powerful results for online businesses and organizations, it has also proven destructive. While some examples are rather humorous, others have had dangerous consequences.
The most recent and poignant example was the investigation of the tragic Boston bombing. The FBI crowdsourced the identification of the two men responsible, and reddit users began their own investigation in a subreddit called “findbostonbombers”. After several false identifications by the group led to innocent people and their families being harassed and threatened, the founder of the subreddit declared the effort a “disaster” and “doomed from the start,” adding that he was “naive” to think it could work. He closed his reddit account shortly after the failed effort.
But why did this crowdsourcing effort fail so miserably? The reason is actually quite simple.
Anonymous, Untrained and Unaccountable.
Generally speaking, crowdsourcing fails because the volunteers who participate are anonymous, untrained and unaccountable.
Anonymous – The identities of crowdsourced laborers are rarely known, and they often hide behind an ambiguous username.
Untrained – Crowdsourcing workers are rarely educated about or skilled in the subject matter that they’re recruited for. This leads to low-quality, unreliable and inconsistent results.
Unaccountable – Because the workers are anonymous, they have very little accountability. As such, it’s impossible to trust them with mission-critical tasks or confidential information.
To understand why this is so problematic for image moderation, let’s take a look at the steps involved.
How Does Image Moderation Work?
Generally speaking, image moderation services work like this:
You submit the URL of an image to the moderation service.
Someone reviews, or moderates, the image.
The result of the moderation is returned to a callback URL you provide.
Based on the results, your website either approves or discards the image.
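The four steps above can be sketched in code. This is a minimal simulation, not a real integration: the function names, the URLs and the stand-in review step are all hypothetical, since each moderation service defines its own API for submitting images and posting results back to your callback URL.

```python
# Hypothetical sketch of the submit -> review -> callback -> approve/discard flow.
# No real network calls are made; the human review step is simulated.

def moderate_image(image_url):
    """Step 2 (simulated): a reviewer approves or rejects the image.
    Here we simply reject any URL containing 'nsfw' for illustration."""
    return "rejected" if "nsfw" in image_url else "approved"

def submit_for_moderation(image_url, callback):
    """Step 1: submit the image URL to the moderation service.
    Step 3: the service later invokes your callback with the result."""
    result = moderate_image(image_url)
    callback(image_url, result)

approved_gallery = []  # images your website will actually display

def handle_result(image_url, result):
    """Step 4: based on the result, approve the image or discard it."""
    if result == "approved":
        approved_gallery.append(image_url)

submit_for_moderation("https://example.com/uploads/cat.jpg", handle_result)
submit_for_moderation("https://example.com/uploads/nsfw-01.jpg", handle_result)
print(approved_gallery)
```

In a real deployment, step 3 is asynchronous: the service POSTs the result to a callback URL you host, so your site must store each submission's state until the verdict arrives.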
Why is Crowdsourcing Image Moderation So Dangerous?
With crowdsourcing, there are no experts involved in the moderation (step 2). Instead, everyday people are assigned the task of approving or rejecting the images uploaded by your users. These moderators are inevitably anonymous, untrained and unaccountable.
As such, a business has very little confidence that these moderators:
will moderate according to the proper criteria
won’t steal the images or distribute them online
won’t allow false positives or negatives due to ignorance, laziness or malice
Let’s take a look at how each of these can hurt your business and your users.
Moderation Criteria Can Vary Greatly
What is an acceptable image to your company? Is that perception the same for someone outside of your organization who is completely unfamiliar with your standards?
Because standards can vary greatly from one moderator to the next, there is simply no way to be certain. Here are a few examples, but this list is by no means exhaustive:
Sexually-explicit material – For a highly-conservative church organization, pictures of women in bikinis may not be acceptable for their website. However, an untrained moderator may not realize this and approve the image simply because it contains no nudity.
Profanity – Moderators, without proper training, may not recognize certain slang words as being profane or offensive.
Competitor brands – When hiring moderators to ensure that competitor brands stay off your website, an untrained moderator may be completely unfamiliar with your competitors.
Images are Vulnerable to Theft and Unauthorized Distribution
Because moderators are anonymous, they’re unaccountable. Because they have no accountability, there’s nothing stopping them from stealing your images. Once they steal your images, they can send those images to other people, upload them to other websites and share them via social media.
Unfortunately, once your images are out in the open, there’s no way to stop their distribution. This represents a major violation of your users’ privacy, and the damage can be permanent and irreparable. It can severely hurt your company’s reputation, and could possibly lead to legal consequences.
In the case of image moderation, an ounce of prevention is clearly worth a pound of cure.
Quality Image Moderation is Impossible to Maintain
When image moderators are untrained and unaccountable, there’s very little to prevent them from approving images that clearly shouldn’t be approved, or rejecting benign images that should.
There are several reasons this situation can happen. Here are a few possibilities:
Ignorance – As we mentioned above, the untrained moderator may not have a firm grasp of your moderation criteria.
Laziness – They could also simply be approving or rejecting images out of laziness, to quickly clear the image moderation tasks from their work queue.
Malice – If the person tasked with moderation is opposed to an organization’s religion or dislikes a company, they may intentionally moderate the images incorrectly out of spite or protest.
In interviewing our clients about their previous crowdsourcing experiences, many of them report that false negatives and false positives happen with a frustrating and regrettable frequency.
So How Should You Moderate Images? Trained & Supervised Image Moderators Are Key.
If it’s important to you and your company that your images aren’t stolen, that strict moderation criteria are maintained and that false positives & negatives are virtually eliminated, crowdsourced image moderation is a dead-end.
You then only have one solid, secure and reliable option: a professional image moderation service staffed by highly-trained, supervised and strictly-monitored image moderators.
Unlike crowdsourcing, a professional image moderation service can ensure with extremely high levels of confidence that:
Your organization’s custom moderation criteria are strictly adhered to.
Moderators cannot steal or distribute your images since they’re using company computers and their internet access is restricted and monitored.
False positives and negatives are eliminated since the moderators are supervised and their livelihood depends on it.
To protect the privacy of your users, to ensure that children aren’t exposed to harmful material, and to protect your company’s brand and reputation, we strongly recommend that you avoid crowdsourcing for image moderation. The risks are far too high. Instead, we suggest you evaluate a professional image moderation service like WebPurify.