One Facebook post included incendiary attacks against France from a former Malaysian prime minister. Another quoted Joseph Goebbels. Another promoted COVID-19 conspiracy theories.
All were taken down — but now an independent group will decide if that was the right decision.
The cases are some of the first to be ruled on by an outside group with powers to decide which content should stay up on Facebook, according to an announcement Tuesday.
Facebook’s Oversight Board — formed earlier this year and made up of 20 independent law professors, former politicians and human rights experts — said it had chosen the six cases because the posts may have breached the company’s policies on hate speech, dangerous organizations and other areas. The move comes as social media companies face growing scrutiny over how they police what their billions of users worldwide post online.
As part of the announcement, small groups of independent experts will now review each of the posts, which have already been taken down from Facebook, to determine if they broke the company’s rules. Final decisions will likely come in January, and represent test cases for how such independent oversight can work when Facebook is struggling to combat a wave of misinformation and hate speech across its platforms.
“It’s no understatement that disinformation on Facebook is of great concern to the board,” Helle Thorning-Schmidt, the group’s co-chair and former Danish prime minister, told reporters in October. “I expect this to be an issue that will feature in cases that come before the board.”
The group said it had received more than 20,000 referrals of posts that had been removed since it began its work in October, but that it had chosen its initial cases based on either their wider societal impact or their importance to Facebook’s rules on how the company moderated content online.
That includes a post from an unnamed user who uploaded screenshots of tweets from Mahathir Mohamad, Malaysia’s former leader, who had said Muslims had the right to murder French people because of that country’s previous colonial history. The original Twitter posts were deleted, but were shared widely across social media, including Facebook.
Another case relates to a post that promoted COVID-19 conspiracy theories to a large Facebook group in France. It was deleted for violating Facebook’s rules around incitement. A separate inquiry will focus on an anti-Azerbaijani post — linked to that country’s recent war with Armenia — while a fourth case is tied to the use of a quote by Nazi official Joseph Goebbels, which an anonymous Facebook user said was published to criticize the current U.S. administration. The company removed the material for breaking its rules on dangerous individuals and organizations.
None of the cases to be reviewed deals with the recent U.S. presidential election, in which politicians, including Donald Trump, and average citizens published reams of false material, some of which Facebook either removed or flagged as incorrect. While the company had the right to submit such cases for review by the independent group, Facebook declined to comment on whether it had sent any U.S. election-related posts for review.
Not everyone has welcomed the greater scrutiny of how Facebook polices online content.
The so-called Real Oversight Board — a rival group of campaigners, legal experts and Facebook critics that is not approved by the company — said it had opened its own, parallel investigations into Facebook content that appeared to break the company’s content moderation rules.
That includes the profile of Trump’s former chief strategist Steve Bannon, who posted a call to behead Anthony Fauci, the U.S.’s leading public health official, and Christopher Wray, the director of the Federal Bureau of Investigation. Facebook removed the post but did not delete Bannon’s page.
“Facebook’s business model undermines democracy, public health and privacy,” said Roger McNamee, an early Facebook investor and co-founder of the Real Oversight Board. “The Facebook Oversight Board is a toothless body, with too many loopholes to address the massive harms on the site.”
This article is part of POLITICO’s premium Tech policy coverage: Pro Technology.