Facebook’s oversight board in action

Six months have passed since Facebook’s oversight board started working as the social media company’s “Supreme Court,” and the jury is still out on its effectiveness. What is clear is that the board is not upholding all of Facebook’s decisions on content removal; on the contrary, it is overturning most of the tech company’s actions. Still, the question of the body’s legitimacy remains. It is unclear whether an external organization, much less a private company, should decide public matters such as freedom of speech. As Facebook’s founder has acknowledged in the past, it is an issue for governments to settle. But until they do, social media giants resort to these external juries to gain legitimacy.


Facebook’s oversight board was launched in October 2020, with an initial investment of $130 million. The tech giant had endured a few difficult years following the Cambridge Analytica scandal, and in November 2018, Mark Zuckerberg decided Facebook needed something else to regain its legitimacy, as users were becoming wary of the company’s editorial sense, or lack thereof. He approved the creation of an external body that would review Facebook’s content decisions and release its own verdicts, which would then be binding for the tech company.

Concerns over the board were soon raised. First, the body’s scope is broad, while the cases it reviews are few. It cannot examine every single action taken by Facebook’s content curators, who, trying to follow the guidelines set by the tech company, deal with hundreds of cases every day. As reported by The New Yorker, the training those curators receive is lacking, and the policies are lax. As a result, the social media giant tends to leave up content that discriminates or promotes hate, while it takes down pictures that are legitimate but do not follow the guidelines. The board exists to right those wrongs, looking at the banned content and deciding whether it should have been taken down in the first place. As such, it picks cases that can serve as examples for the future. A second concern is whether the board’s existence is enough to rein Facebook in or whether it is just a cosmetic, calculated change on the tech company’s part. And in the background, the question of who has the right to moderate public speech remains.

To date, the board has released ten decisions, six of which have overturned the company’s prior actions. For instance, Facebook had deleted photographs posted on Instagram by a Brazilian woman showing her breasts, as her nipples were visible and thus violated the company’s guidelines on “Adult Nudity and Sexual Activity.” However, the pictures were part of an international breast cancer awareness campaign, and the board decided to restore the content, finding that removing it would hinder women’s human rights. “[…] Facebook’s Community Standard on Adult Nudity and Sexual Activity expressly allows nudity when the user seeks to ‘raise awareness about a cause or educational or medical reasons’ and specifically permits uncovered female nipples to advance ‘breast cancer awareness,’” noted the board’s case decision.

Another case tried to set an example of what counts as misinformation. The tech company had deleted a post published by a French user, who noted that the French agency regulating health products, the Agence Nationale de Sécurité du Médicament, was promoting the use of remdesivir while refusing to authorize hydroxychloroquine combined with azithromycin to fight off COVID-19. Facebook removed the post, saying it “contributes to the risk of imminent… physical harm.” But after reviewing the case, the board found that decision flawed, concluding “that Facebook had not demonstrated the post would rise to the level of imminent harm, as required by its own rule in the Community Standards.” The board also concluded that the post’s removal ran against freedom of expression and urged Facebook to define its standards better. “The Board also found Facebook’s misinformation and imminent harm rule, which this post is said to have violated, to be inappropriately vague and inconsistent with international human rights standards,” reads the case report.

Those two cases already illustrate that the board’s work is useful, as it tries to define what should and should not be published online. Even the Trump case furthers this point. The board reviewed the platform’s decision to restrict former President Donald Trump’s access to his Facebook and Instagram accounts in January of this year. Although the board upheld Facebook’s course of action, it noted that the firm had to review the decision, as it was neither fair nor appropriate “to impose the indeterminate and standardless penalty of indefinite suspension.” This “Supreme Court” also encouraged the network to establish proportionate policies to ensure this would not happen in the future.

The concern over the board’s editorial independence is somewhat allayed by its decisions so far: the body is exercising its own judgment as it overturns the company’s actions. However, concerns remain over the number of cases it can review and over how Facebook will fine-tune its guidelines in response.

A solution to these issues would be better, stricter legislation. At the moment, Facebook and the other networks enjoy the protections of Section 230 of the Communications Decency Act (CDA), under which platforms are not liable for the content posted on them. “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider,” reads Section 230. But as private companies have become more powerful, they have taken on the ethical responsibility of regulating speech by removing or maintaining certain content. Facebook and Twitter have made editorial decisions, but the legislation reflects neither that shift nor the times we live in. It is becoming harder to differentiate between a social media platform and a publisher, between a neutral network and a content-rich site. But policies have not changed.

The health of public discourse should not be left in the hands of private firms, even if they had well-intentioned, even excellent, mechanisms to curate content. It is a matter for government, but government must catch up. Until then, private companies remain in charge.
