Albert Hirschman Centre on Democracy
15 December 2021

Debating the Facebook Oversight Board: human rights protection and digital platform governance

The event, organised by UN Special Rapporteur Irene Khan, explored how independent scrutiny of content moderation can be fostered in a multi-stakeholder model.

Content moderation on social media platforms raises complex human rights challenges. The Facebook Oversight Board is a novel experiment in self-regulation at a time when social media platforms are coming under intense public criticism and renewed government scrutiny.

This experiment could be the first step towards a content moderation system that goes beyond platforms such as Facebook or Instagram. Yet doubts remain about whether such an initiative can exert meaningful influence. On 6 December, experts came together to discuss the lessons that can be drawn from the first year of the Facebook Oversight Board (FOB). The panel discussion was convened by Irene Khan, UN Special Rapporteur on Freedom of Expression and Distinguished Fellow and Research Associate at the Albert Hirschman Centre on Democracy. It was co-sponsored by the Office of the High Commissioner for Human Rights, the AHCD and the GISA Technology and Security Initiative.

Irene Khan opened the debate by acknowledging that social media platforms have expanded opportunities for freedom of expression. However, they have also amplified disinformation, misinformation, hate speech and incitement to violence, including online gender-based violence, and have failed to respond with sufficient urgency, commitment or resources. Only when confronted with intense criticism, increasing governmental scrutiny and falling public trust did Facebook (Meta), the platform with the largest global reach, set up the Oversight Board to review key content moderation decisions.

She noted that many concerns and questions remained about the Board, the company and the sector: the effectiveness of the Board as an appeal mechanism; the risk that the Board, given its quasi-judicial role, could become a parallel source of platform jurisprudence; and whether self-regulation is an adequate response in light of Facebook’s business model, from which many of the problems emanate. She asked the panelists to reflect on the Board’s impact in light of these expectations, fears and concerns.

Maina Kiai, a Member of the FOB and Director of Human Rights Watch’s Alliances and Partnerships Program, acknowledged that the Oversight Board is still an experiment. He was nonetheless optimistic that it offers a real avenue for change, citing the progress made during its first year. Marietje Schaake, International Policy Director at Stanford University’s Cyber Policy Center and member of the Real Facebook Oversight Board, disagreed, raising concerns about the asymmetry of power in this arrangement and emphasizing the importance of independence in research, funding and oversight.

All participants agreed that oversight and independence are crucial to the Board’s ultimate effectiveness. Maina Kiai was quick to point out that it is perhaps too soon to judge the effects of the FOB. Peggy Hicks, Director of Thematic Engagement at the Office of the United Nations High Commissioner for Human Rights, agreed and argued that the decisions of the Oversight Board, particularly when they take account of the UN Guiding Principles on Business and Human Rights, could provide a roadmap to guide the company. She felt that the Board’s decisions, and the company’s responses to them so far, had been encouraging.

Nico Krisch, Professor of International Law at the Graduate Institute, reminded the audience that the FOB is one among a myriad of norm-creating actors that have emerged in the private sector in recent years, and that while the FOB cannot address the broader structural elements of Facebook’s problems, it may have a positive “spiraling human rights impact” on the company in several ways.

Countering the optimistic view of the FOB expressed by other panelists, Marietje Schaake drew attention to the company’s continued lack of transparency and reiterated the need to go beyond self-regulation. She emphasized that Facebook’s attempts at improving content moderation are too little, too late, and that regulation and multi-stakeholder engagement are needed to enforce the transparency and accountability of the platforms.

All participants agreed on the need for more fundamental changes and underlined the importance of multi-stakeholder processes with civil society representation. Such processes should, according to Marietje Schaake, assign a specific role to each actor, including governments, in order to maintain transparency and fairness.

The issue of transparency was taken up by all speakers. Some, like Maina Kiai, argued that both external and internal mechanisms should be put in place to ensure effective oversight of platform content moderation. Others, like Peggy Hicks and Nico Krisch, advocated for the corporations themselves to make their algorithms and processes available to the public.

Irene Khan thanked the participants for a lively and candid discussion and concluded by noting that there is no single “silver bullet” solution. She called for external, independent, multi-stakeholder oversight mechanisms alongside company initiatives like the Oversight Board, and urged civil society and users, as rights-holders, to continue to put pressure on Facebook and other companies to make digital platforms a safe and rights-respecting space for all.


The UN Special Rapporteur on freedom of opinion and expression has covered the challenges of content moderation and human rights in her reports on Disinformation and Freedom of Expression (A/HRC/47/25) to the Human Rights Council in June 2021 and on Gender Justice and Freedom of Expression (A/76/258) to the UN General Assembly in October 2021.