Having been in the content moderation space for over 14 years, we are always interested in the unique approaches companies take to the often complicated and nuanced art of UGC moderation, carefully balancing keeping their sites safe against allowing their communities to openly share content and views. While we advocate for an efficient combination of our AI services and our highly trained in-house human teams, we believe that intricate policy decisions should always be made by people.
In December 2019 we wrote about companies experimenting with a “jury approach” to content moderation: using juries to decide policy disagreements that arose when a company removed content it felt violated its community guidelines but the user felt otherwise. Last summer, Facebook announced its plan to create an Independent Oversight Board. Although the board is not yet active, further details have emerged in recent months: Facebook has shared the names of some of the people who will serve on the board, as well as how the appeals process will ostensibly work.
While some are wondering what the holdup is, others are concerned that the board will prove ineffective against COVID misinformation. Before addressing those problems, let’s go over the details that Facebook has shared about its Oversight Board.
Who Are the Board Members?
Facebook established the board’s structure and bylaws after consulting with experts around the world. The selection process began with the company choosing four co-chairs, who then had a hand in selecting the next 16 board members. Once all the selections have been made, 40 members will serve on Facebook’s Oversight Board for three-year terms.
Though only half filled, the board already includes members who, combined, have lived in over 27 countries and speak at least 29 languages. Nobel Prize winners, law professors, former heads of state, and former judges are just some of the members included in the initial selections made by the co-chairs.
How Does the Appeals Process Work?
It stands to reason that not every appeal will reach the board. To determine which cases are heard, a rotating five-member board within the board – referred to as a Case Selection Committee – will find cases that they consider most urgent, looking at factors like the number of users impacted or issues that raise concerns about Facebook’s role in public discourse.
To judge each case, the five-member panel will collect the following information from the so-called plaintiff or defendant and from Facebook alike:
• A statement by the person who submitted the case (and/or who posted the original content);
• A case history (from Facebook);
• A policy rationale (from Facebook);
• Clarifying information (from Facebook) if requested by the board; and
• All additional outside information, if requested by any member of the panel.
The board members will then weigh these findings and their circumstances against Facebook’s content policies.
A decision will then be drafted for the entire board to review (how much power the other members have over the panel’s decision is unknown). If the board approves the final decision for release, it will be published on the board’s website and translated into the board’s official languages. All persons involved in the appeal will also be notified.
In addition to Facebook’s Oversight Board not being operational, the information that seems to matter most is being kept close to the vest. The latest developments still leave a lot unanswered – perhaps most crucially, where the money is coming from. The board is to be funded by an independent trust, but details about that trust have yet to be made public. Until they are disclosed, the public won’t know just how independent Facebook’s Oversight Board really is from the company.
And yet another crucial question remains: Will the Oversight Board’s rulings have an effect on Facebook’s actual policies? After a verdict has been reached, there’s no guarantee that it will impact policy. And even if the company factors in the board’s rulings, the same problems that Facebook and other social media giants are facing remain – chiefly the volume of misinformation posted and shared by users combined with the shortage of human moderators.
When Will It Actually Start?
Facebook points to COVID as the proverbial wrench in the works. A representative of the company’s communications department stated, “There are a number of operations procedures they need to work out and they don’t even have equipment (i.e., laptops etc.) to do the work due to COVID delays.”
That a lack of laptops could be an obstacle for a massive social media company gives one pause. All the same, Facebook maintains that it is “working hard to set the board up to begin operating later this year,” suggesting the Oversight Board won’t be operational until after the election.