Facebook’s “Supreme Court” will change very little

Back in September, Facebook announced the creation of what is now known as Facebook’s Supreme Court—an oversight board that will review appeals of the tech company’s content policy decisions. Around forty independent experts will form the committee later this year. For Facebook, the board is the perfect scapegoat for its accountability problems—Mark Zuckerberg can easily say his company is not managing content; an independent body is. Still, the newly created body will only interpret Facebook’s content policies and instruct the company to remove or allow content after reviewing appeals from users. Its scope is very narrow. And that’s why the more significant issue remains unresolved: with both Facebook and the board overseeing content decisions, there’s still a centralized editorial authority presiding over online public debate.


Facebook has drawn heavy criticism over the last four years, both from the data privacy perspective—the Cambridge Analytica scandal, for example—and from the misinformation angle. Although the company signed the European Union’s hate speech code in 2016—a non-binding code of conduct encouraging social media companies to take action against hate speech and discrimination online—doing so did little to ease the pressure. The result? The platform has slowly bled users, who have grown warier of how Facebook uses their private data and how fake news spreads. According to a Pew Research report, 26% of American Facebook users deleted the app from their phones in 2018, with one in ten users deleting their account entirely. In response, Facebook has launched several initiatives to manage its content better, prompting the eternal question: Is Facebook a news company?

In this context, the Menlo Park-based company took two real steps in 2019 (besides signing the EU code of conduct three years earlier). First, Facebook banned several extremist figures from the platform, including Infowars founder Alex Jones (read Facebook’s ban on extremist figures is not enough to stop fake news; we need more). Second, the tech giant rolled out Facebook News, a new feature for news organizations that lets users read the main headlines of the day and access those same stories through the app for free (read Facebook, a veteran media outlet). How this second attempt to manage information will fare remains to be seen.

Still, neither step has soothed critics who demand antitrust measures against the tech giant or believe Facebook should be treated like any other publication (that is, held liable for its content). But maybe this 2020 oversight board will help Facebook steer away from criticism. Our opinion? We doubt it will.

While the board will be independent of the tech company, it will still be funded by it—Facebook will invest $130 million in the body. And while it will be an independent body reviewing the company’s content policy, there will still be ‘someone’ making editorial decisions. Nevertheless, Facebook’s CEO Mark Zuckerberg insisted on the body’s independence and importance in a recently published letter, saying, “The board’s decision will be binding, even if I or anyone at Facebook disagrees with it.”

He also explained the reasoning behind the need for a board to review Facebook’s content policies. “We are responsible for enforcing our policies every day and we make millions of content decisions every week. But ultimately, I don’t believe private companies like ours should be making so many important decisions about speech on our own,” he wrote, pointing to the backlash the company has received for deciding which content to ban and which to leave online. He added: “That’s why I’ve called for governments to set clearer standards around harmful content. It’s also why we’re now giving people a way to appeal our content decisions by establishing the independent Oversight Board.” When the board first launches, users will only be able to appeal decisions to remove their own content. For example, if the tech company has deleted your pictures for being inappropriate, you’ll be able to appeal that decision and get your photos back. Eventually, as reported by The Verge, users will also be able to appeal decisions to leave up content they believe should be removed.

Yes, the board is useful: this time an independent entity will be reviewing content, which means it will be somewhat impartial. But then again, Facebook’s curators and algorithms are also “impartial”—although platforms preserve the biases of their creators (but that’s for another article). Above all, however, the board is a way for Facebook to avoid further scrutiny, deflect its responsibilities and, if another scandal ensues, be able to say, “Hey, we are not responsible for user-created content. There are independent experts deciding what to remove and what to allow.”

But the question remains, and it brings in the social media decentralization debate: Should any organization or body control speech and determine what can and cannot be said in the public arena (in this case, Facebook)? And if so, how should it work? Should it be done by a centralized entity, such as Facebook or its independent board? What’s clear is that, as long as the power of decision over content remains with a single body, Facebook is just shifting the blame. Indeed, this is not a step towards decentralizing the web. Decentralization advocates maintain that Internet platforms should neither keep users’ private data nor act as referees within social media sites (read Social media decentralization reaches 2020).

A final note: as with any board, especially one that makes content decisions, results will come slowly, perhaps too slowly. For now, all we can do is wait and see. Facebook’s Supreme Court is slated to launch in mid-2020. Once it does, we will be able to see how it works, and we’ll have a better understanding of whether it is just a smokescreen or a real improvement in Facebook’s handling of content. In conclusion, 2020 will be an exciting year for social media companies—with the good and the bad that may bring.

About Josep Valor

Josep Valor-Sabatier is professor of information systems and information technology and holder of the Indra Chair of Digital Strategy. He received his Ph.D. in Operations Research from MIT, and his Sc.D. in Medical Engineering from the Harvard-MIT Division of Health Sciences and Technology. Josep Valor teaches extensively at the senior executive level on Management Information Systems, Media Management, Management of Technology, and Strategy.

About Carmen Arroyo Nieto
