The Facebook “Oversight” Board
In April 2018, Facebook CEO Mark Zuckerberg, facing intense public pressure to do something about the proliferation of false or misleading information appearing on the platform, said that he
“could imagine some sort of structure, almost like a Supreme Court, that is made up of independent folks who don’t work for Facebook, who ultimately make the final judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world.”
Later that year, he announced that Facebook would create an “Independent Governance and Oversight” committee by the close of 2019 “to advise on content policy and listen to user appeals on content decisions.”
That body—now called the “Facebook Oversight Board”—recently began operation. In December of last year, the Board revealed the details of the first six cases it would be deciding, and it has recently issued its decisions in those cases. The Board has also agreed, at Facebook’s request, to decide whether ex-President Trump’s Facebook and Instagram accounts were improperly terminated in the aftermath of the January 6 riot at the Capitol.
This experiment raises some complicated issues about governance and decision-making, and I will try to be as concise as I can. I have two points to make. First, the Board is a sham: it is not what it purports to be, and it cannot and will not exercise anything that can remotely be described as "oversight" over Facebook's content management system. Second, even if it were not a sham, even if the Board were actually empowered to provide real oversight and direction to that system, that would not strike me as any improvement over the current state of affairs. Many fundamental policy choices embedded in that system affect the fortunes of a large proportion of the individuals, corporations, and governments on the planet, and I fail to see why a hand-picked Council of the Wise is the proper repository of the power to make those choices.
Facebook’s Content Management System
To begin with, here is a simplified outline of Facebook's existing content management system. Facebook's speech rules* identify a number of categories of speech that are not permitted on the platform (e.g., Incitement of Violence, Adult Nudity and Sexual Activity, Bullying and Harassment, Hate Speech, Terrorist Propaganda, Violent and Graphic Content, Cruel and Insensitive Speech), along with definitions of, and some explanatory commentary about, each category.
* Facebook calls these rules its "Community Standards." I dislike the term and prefer to call them Facebook's "Terms of Service" (ToS). "Community Standards" connotes, in the law and in ordinary speech, that the rules derive in some fashion from the community (however that community might be defined). Facebook's speech rules do not, however, derive from the Facebook community; they are, like the ordinary website Terms of Service with which all Internet users are familiar, imposed on the Facebook community unilaterally by the site operator.
Facebook, as a private entity, has every right to impose on users whatever speech rules, derived from whatever source, it chooses (subject, of course, to the usual and generally applicable rules and regulations regarding corporate conduct). It is also free to call those speech rules anything it chooses. But we don't have to follow suit when doing so is misleading.
Facebook relies increasingly on ex ante enforcement of these rules, removing (or placing warnings on) material deemed to violate the ToS before that material is transmitted and displayed across the platform. As you would expect, given the astonishing volume of content involved (tens of billions of messages are posted every day on Facebook and Instagram), ex ante enforcement is, and must be, entirely automated and "algorithmic."
Facebook also enforces these rules ex post, removing (or placing warnings on) posts after they have been disseminated across the platform. Thousands of Facebook “content moderators” patrol the system, proactively seeking out posted content that violates the ToS. In addition, Facebook users can (and do, millions of times every month) “flag” content that they believe violates the ToS; this user-flagged content goes into a queue that is reviewed by a Facebook content moderator, who decides whether or not the content violates the ToS and should be removed from the system.
Since 2018, Facebook has provided an internal appeals process for users whose content has been removed from the platform (as well as for users who flagged content that Facebook did not remove). On appeal, a content moderator makes a final decision as to whether the content violated the ToS (and should remain off the system) or not (and should be restored).
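To make the moving parts concrete, here is a minimal sketch, in Python, of the pipeline as just described: an ex ante automated screen, ex post user flagging into a review queue, and an internal appeal. Every name and structure here is invented for illustration; none of it reflects Facebook's actual internals.

```python
# A minimal sketch of the enforcement pipeline described above.
# All names and structures are hypothetical, invented for illustration.
from collections import deque
from dataclasses import dataclass

@dataclass
class Post:
    post_id: int
    text: str
    removed: bool = False

class ModerationPipeline:
    def __init__(self, classifier):
        self.classifier = classifier   # assumed: callable(text) -> True if ToS violation
        self.review_queue = deque()    # user-flagged posts awaiting a moderator

    def submit(self, post):
        """Ex ante: screen the post before it is displayed on the platform."""
        if self.classifier(post.text):
            post.removed = True
            return False               # blocked before transmission
        return True                    # published

    def flag(self, post):
        """Ex post: a user flags already-published content for human review."""
        self.review_queue.append(post)

    def review(self, moderator):
        """A content moderator works through the queue of flagged posts."""
        while self.review_queue:
            post = self.review_queue.popleft()
            if moderator(post):        # moderator judges it a ToS violation
                post.removed = True

    def appeal(self, post, moderator):
        """Internal appeal: a moderator makes the final call; if the
        removal is not upheld, the content is restored."""
        if not moderator(post):
            post.removed = False
```

A "moderator" here is just a callable returning True for a violation; the point is only the ordering of the stages—nothing more is claimed about how the real system works.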
The scale at which this system operates is astonishing. In Q2 and Q3 of 2019, Facebook removed 4.3 billion individual pieces of content, about 300 every second of every day. Users appealed 40 million of those removals (about 1% of the total); 10 million of those appeals succeeded in reversing the removal and restoring the content involved.
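Those figures are internally consistent; a quick back-of-the-envelope computation (assuming roughly 183 days in two quarters) reproduces them:

```python
# Quick check of the figures above, assuming two quarters of roughly 183 days.
removals = 4.3e9                       # pieces of content removed in Q2-Q3 2019
seconds = 183 * 24 * 60 * 60           # ~15.8 million seconds
print(round(removals / seconds))       # ~272 removals per second ("about 300")

appeals = 40e6
print(f"{appeals / removals:.1%}")     # ~0.9% of removals appealed ("about 1%")

restored = 10e6
print(f"{restored / appeals:.0%}")     # 25% of appeals led to restoration
```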
The Board’s “Oversight” Role
The Board currently consists of 19 members; it is "likely," in the words of the Board's Charter, to expand to 40 members. Facebook initially chose four Board Co-Chairs, and the remaining members were chosen by the co-chairs in consultation with Facebook. Going forward, additional members will be nominated by a Board Committee and approved by the Board. Board operations are controlled by a Trust, set up and funded by (but otherwise independent of) Facebook.
The Board is authorized to review individual cases brought by users challenging a Facebook “enforcement action”—either Facebook’s removal of content, or its failure to remove user-flagged content. Cases can be heard only after they have completed Facebook’s internal appeals process. Facebook also may propose cases to the Board for review. The Board has complete discretion to decide which cases it will review (except that in “exceptional circumstances,” Facebook may require the Board to give a case “expedited review” for decision within 30 days).
The Charter sets forth