Journalists and media leaders were among more than 2,000 people consulted around the world for Facebook’s soon-to-be-formed Oversight Board for Content Decisions, a group of 40 experts who will review Facebook’s most challenging decisions to allow or remove content from Facebook or Instagram. Exercising independent judgment, the Board will have the power to reverse Facebook’s decisions to remove content, as well as recommend changes to Community Standards.
“The idea is to create a separation of powers,” CEO Mark Zuckerberg recently said, “so that while Facebook is responsible for enforcing our policies, we aren’t in the position to make so many decisions about speech on our own. This Board will be tasked with upholding the principle of free expression while ensuring we keep our community safe.”
Read on for five things to know about how the Board will work. Early this fall, Facebook will release a final charter outlining the Board’s operations and structure. By the end of the year, initial Board members will be selected and begin preparations to review their first cases.
Experts around the world, including journalists and news organizations, advised Facebook on how to structure the Board.
After Mark Zuckerberg wrote a note last November saying that Facebook would create a new way for people to appeal content decisions through an independent body, the company spent months gathering input from stakeholders around the world. More than 650 people from 88 countries took part in 22 roundtables and six in-depth workshops, and another 1,200 submitted their thoughts in a public consultation process. Participants included vocal critics of Facebook, voices from a spectrum of regions, viewpoints, and cultures, and experts whose work focuses on freedom of expression, democracy, journalism, technology, rule of law, child safety, civil rights, human rights protection, and many other disciplines in both the private and public sectors. A report published in June summarized their thoughts and contributions.
The Board will focus on the toughest cases with significant real-world impact.
That means content that ignited significant public debate, or affected a large number of people, among other consequential cases. The Board’s work will complement a team of 15,000 content reviewers who already assess more than 2 million pieces of content every day, 24/7, in over 50 languages.
The Board will choose its own cases, considering both cases referred to it by Facebook and users’ appeals.
It will hear cases in which Facebook decided to leave up or remove content from Facebook or Instagram under its Community Standards. The Board will select the cases it focuses on, and Facebook will be able to refer cases to it for consideration. Users may also be able to appeal to the Board after going through Facebook’s existing appeals process.
To avoid conflicts of interest, current or former Facebook employees and government officials won’t be able to serve as Board members, among other disqualifications.
Candidates for Board membership should embody certain principles, including showing commitment to the Board as an institution. Facebook will seek Board Members who are (1) experienced at deliberating thoughtfully and collegially, as an open-minded contributor on a team; (2) skilled at making and explaining decisions based on a set of policies; and (3) familiar with matters relating to digital content and governance, including free expression, civic discourse, equality, safety, privacy, and technology.
Facebook plans to select the first few Board members, who will then, together with Facebook, select additional members by the end of the year. A committee of the Board will continue selecting members next year. The selection process will include rigorous vetting, interviews, and reference checks, and members’ qualifications will be released to the public.
A potential Board member will be disqualified if, for example, he or she is a current or former Facebook employee (or spouse or domestic partner of one) or a current government official, whether in political office or tied to a government-owned or -controlled entity.
Considerations for global diversity on the Board include gender; geography; varying cultural and political backgrounds; and a range of professional experience in, for example, free speech, human rights, civil society, publishing/media/journalism, academia, anthropology, business, content moderation, ethics, government, the judiciary, law enforcement, law, national security, online and public safety, privacy, psychology, and statistics.
The Board will release public explanations of its decisions, and Facebook will publicly confirm it has implemented those decisions.
Within two weeks of closing each case, Board members will issue an explanation of their decision, and that decision will be final. They can also recommend changes to Community Standards for Facebook to review, and Facebook will publicly report back to the Board on whether it will take up those recommendations.
“This is a major experiment in governance,” Zuckerberg said in a statement, “and if it’s successful, this Board could become an important part of how online expression and communities work going forward.”
Learn more about the Facebook Oversight Board for Content Decisions here.