Best Practices for Trust and Safety Officers: An Interview with the TSPA’s Co-Founders

November 4, 2020 | Image Moderation, Profanity Filter, Video Moderation, UGC

The trend towards doing business online, and the need to address the safety and security issues it brings, has led to the introduction of a role that until recently was behind the scenes: the trust & safety officer. This individual is typically tasked with helping to design and implement policies…

How Brands are Leveraging UGC During the COVID Crisis

October 12, 2020 | UGC

By keeping people indoors, the COVID-19 pandemic has sent businesses rushing back to the drawing board, especially regarding how they reach their customers. A sharp decline in shopping at physical retail spaces, coupled with a surge in online shopping, has created an opportunity for industries that have typically seen lower online engagement from their customers….

Creating Your Community Guidelines – Examples & Best Practices

August 17, 2020 | UGC

All new online communities, from dating apps to gaming to social networks, are tasked with creating and maintaining comprehensive community guidelines. You have a responsibility not only to keep users safe, but also to make sure you are clear about the rules, so when you take content down it’s justified and easily explainable. Creating your community guidelines from…

Facebook’s Independent Oversight Board – Recent Developments

July 24, 2020 | Image Moderation, UGC

Having been in the Content Moderation space for over 14 years, we are always interested in the unique approaches companies take towards the often complicated and nuanced art of UGC moderation, carefully balancing keeping their site safe with allowing their community to openly share their content and views. While we advocate for an efficient…

Social Media Leaves Content Moderation to AI Amidst Pandemic

May 23, 2020 | UGC

On March 16, Facebook sent home thousands of content moderators in the midst of a global pandemic. Other social media giants like YouTube and Twitter took the same cautionary measures. In the absence of a large portion of their human teams, these companies turned almost exclusively to their Artificial Intelligence solutions to enforce their…

TikTok Addresses Content Moderation Concerns Amid Steady Growth

March 10, 2020 | Video Moderation

We’ve discussed how social media giants Facebook and Twitter, platforms that have existed since the early 2000s, have made recent policy changes to address the concerns of the U.S. government as well as of anyone pushing for a safer online community. TikTok, an incredibly popular app among teens that lets users create short videos…

How Unregulated Internet Exposure Affects Digitally Savvy Children

February 9, 2020 | UGC

In June of last year, we spoke with child psychologist and advisor to WebPurify Dr. Adam Pletter about how cyberbullying affects children and what parents can do if their kids are being harassed online. But cyberbullying isn’t the only concern when it comes to digitally savvy kids. Recently, we reached out to Dr. Pletter,…

Using a Jury for Content Moderation – Pros and Cons

December 11, 2019 | UGC

In a post last month, we discussed Facebook’s end-of-year goal of instituting an independent oversight board to adjudicate what content will be removed from the site. Here we’ll discuss the recent details that Facebook has provided about its “content jury,” as well as content juries in general, a method that has caught the eye of…

Facebook Updates Policy for Suicide Prevention

October 18, 2019 | Image Moderation, UGC

Over the last year, social media giants have consistently adjusted their content moderation policies. Recently, we discussed the changes that Twitter’s and Instagram’s policies underwent. Just last month, Facebook announced a change in its practices. Around the same time, Mark Zuckerberg spoke about his involvement in determining policy as well…

Twitter and Instagram Update Policies to Combat Bullying

August 28, 2019 | Image Moderation, UGC

In previous posts, we’ve discussed the content moderation policies of large social media platforms and the factors that are catalyzing their changes. Recently, Twitter and Instagram announced updates to how they monitor content in an effort to combat bullying on their respective sites. Both companies are working toward the same goal, but each is taking…