Facebook’s long-hyped oversight board, billed as a “supreme court” for content decisions, will finally launch in October ahead of the US presidential election on 3 November.
The board will convene 20 high-profile thinkers from around the world (including academics, former judges, specialists in law and human rights, and former Guardian editor Alan Rusbridger) who will be empowered to make binding decisions on content, capable of overruling those made by the most senior Facebook executives.
Users will submit appeals about posts that have been removed, in the hopes of getting the board to overturn Facebook’s decision.
Given that Facebook and Instagram, which the board will also cover, collectively host approximately four billion monthly active users, it is unclear how the board will prioritise the various appeals it receives. Facebook has said that complaints raised with the board will be resolved within 90 days.
The concept of the oversight board has faced criticism from commentators from the outset. Evelyn Douek, a lecturer at Harvard Law School who specialises in online speech regulation, has critiqued its narrow remit on the basis that “when the [oversight board] first begins operations, its jurisdiction is limited to referrals from Facebook and ‘content that has been removed for violations of content policies’ from Facebook or Instagram” rather than content that has been left up. The latter is increasingly at the centre of controversy about Facebook’s content moderation policies. The board has now said it will eventually examine “leave up” decisions too.
Facebook has faced increasing pressure over its content moderation policies and how effectively it enforces them. Critics point to the large quantity of illegal content that remains live on the site, as well as misinformation that some claim contributes directly to real-world harm.
Others have written the oversight board off as largely a PR exercise. This has led a group of industry experts to form a rival body calling itself the “Real Facebook Oversight Board”. Its members include the journalist Carole Cadwalladr, who uncovered the Cambridge Analytica scandal, and the leaders of the Stop Hate campaign, which recently organised an advertiser boycott of Facebook. The alternative board is reportedly being funded by the billionaire philanthropist Pierre Omidyar.
CNBC journalist Sam Shead reported on Twitter that the rival group told him: “The Oversight Board had no plans to announce the launch until they got hold of our concept note a week or so ago. The entire PR offensive was to try and overshadow our plans.”
How best to address speech online is a hotly contested topic. Facebook is often accused by Trump supporters of anti-right-wing bias; they allege that the platform removes a disproportionately large volume of right-wing content.
Progressive liberals, meanwhile, are increasingly pushing the company (and other social media platforms) to toughen their stance on content moderation and widen their dragnet to pull in misinformation and other content that could conceivably be considered “harmful”. For example, the company provoked outrage over its decision to leave up President Trump’s post containing the racist dog whistle “when the looting starts, the shooting starts”.
Still others criticise the company’s content removal decisions from a left-wing perspective. For example, the site recently decided to shut down anti-fascist organising pages alongside those of right-wing militias, prompting an outcry over the implication that these groups are equivalent.
The platform was also criticised on the left for a recent decision to shut down the page of an event taking place at San Francisco State University (SFSU) that featured Leila Khaled, a member of the Popular Front for the Liberation of Palestine (PFLP). The action was taken because the US government designates PFLP a terrorist organisation, but was derided as unfair by some prominent leftists.