A million appeals for justice, and 14 cases overturned -- Facebook Oversight Board off to a slow start
A million appeals for justice, and 14 reversals. That's the scorecard from the Facebook Oversight Board's first annual report, released this week. The creative project has plenty going for it, and I think future oversight boards can benefit greatly from the experience of this experiment, which Facebook parent Meta launched in 2020. Still, it's hard to see how this effort is making a big impact on the problems dogging Facebook and Instagram right now.
A few months ago, I interviewed Duke University law student Alexys Ogorek about her ongoing research into the Oversight Board for our podcast, "Defending Democracy from Big Tech." Her conclusion: There are plenty of interesting ideas in the organization, but in practice, it's not accomplishing much. Only a tiny fraction of cases are considered, she found, and decisions take many months. That's not very practical for people who feel their innocent comment about a political candidate was wrongly removed a month before an election. You can hear our discussion on Apple Podcasts, or by clicking play below. The Oversight Board's annual report confirms most of Ogorek's findings, and there are plenty of interesting nuggets in it. I've cobbled them together below.
Facebook removes user posts all the time -- perhaps it's happened to you -- with little or no explanation. After years of public frustration with this practice, the firm launched an innovative project called the Facebook Oversight Board. It's billed as an independent, outside entity that can make binding decisions -- mainly, it is tasked with telling Facebook to restore posts it has removed incorrectly. Most of the time, these takedown decisions are made by automated tools designed to detect hate speech, harassment, violence, or nudity. In a typical scenario, a user posts a comment that the tools judge to contain racial slurs, language encouraging violence, adult content, or medical misinformation, and the post is removed. Users who disagree can file an appeal, which might be judged by a person at Facebook. If that appeal fails, users now have the option to appeal to this outside Oversight Board.
This is a good idea. We should all be uncomfortable that a large corporation like Facebook gets to decide what stays and what goes in the digital public square. Yes, the First Amendment doesn't apply to Facebook in most of these cases, but Meta is such a powerful entity that when it acts as judge and jury, it offends our notions of free speech. So the experiment is worthwhile, and like Ogorek, I've tried to look at it with an open mind.
One big problem revealed in the report is the tiny, tiny fraction of cases the board can take up, combined with the 83 days it took to decide cases. About 1.1 million people appealed to the board from October 2020 to December 2021, and only 20 cases were completed. Of those, the board overturned Facebook's decision 14 times. To be fair, the board says it tried to choose cases with wider impact that could set precedent. Still, the numbers show the board's process, to put it politely, doesn't scale.
"I am struggling with this due to a cognitive disconnect. They had 1.1 million requests but only examined 20 cases. In those 20 cases they found that Meta was wrong 70% of the time. So, is it likely that over 700,000 mistakes by Meta have gone unexamined," said Duke professor David Hoffman. "The small number of decisions when compared to the demand indicates to me that the (board) is at best a sampling mechanism to see how Meta is doing, and based on this sample it appears that Meta’s efforts at enforcing their own policies are a dismal failure. It all begs the question, what additional structure is necessary so that all 1.1 million claims can be analyzed and resolved."
Reading through the cases the board did pick, one can gain sympathy for the complexity of the task at hand. I've pasted a chart above to show a sample of cases that rose to the top of the heap. But here's one example of the competing interests that require nuanced decisions: in one case, a video of political protesters in Colombia included homophobic slurs in some chants. Facebook initially removed the video; the board restored it because it was newsworthy. In another case, an image involving a woman's breast was removed for violating nudity rules, but the image was connected to health care advocacy. It was also restored.
Other items in the report I found interesting: the board openly criticized Facebook's lack of transparency in many situations. It urged the firm to explain initial takedown decisions, and noted that moderators "are not required to record their reasoning for individual content decisions."
There are other critical comments:
"It is concerning that in just under 4 out of 10 shortlisted cases Meta found its decision to have been incorrect. This high error rate raises wider questions both about the accuracy of Meta’s content moderation and the appeals process Meta applies before cases reach the board."
"The board continues to have significant concerns, including around Meta’s transparency and provision of information related to certain cases and policy recommendations."
"We have raised concerns that some of Meta’s content rules are too vague, too broad, or unclear, prompting recommendations to clarify rules or make secretive internal guidance on interpretation of those rules public."
"We made one recommendation to Meta more times than any other, repeating it in six decisions: when you remove people’s content, tell them which specific rule they broke." Facebook has partly addressed this suggestion.
The board also briefly took up the issues raised by Facebook whistleblower Frances Haugen. Among her revelations, she exposed the company's practice of "whitelisting" certain celebrities, exempting them from most content moderation rules. The board mentions this issue, and its demands for more information from Facebook about it, but only in passing. Combine that with other references to secret or unknown internal moderation policies that Facebook maintains, and it's easy to see that the Oversight Board has a very difficult job to do. One wonders if its work might end one day with members resigning in frustration. Until then, it's still worth learning whatever lessons this experiment might teach. There are plenty of good ideas being tested.