Facebook has appointed 20 members to an independent board that will serve as a kind of appeals court for users who disagree with the social network’s decisions to remove their content for violating policies against things like hateful content, bullying, or harmful misinformation.
The board, which will begin making its first decisions later this year, is made up of former politicians, current and former journalists, academics, and human rights experts worldwide.
The group, which will eventually grow to 40 members, is led by four co-chairs: Helle Thorning-Schmidt, the first woman to serve as prime minister of Denmark; Jamal Greene, a Columbia University law professor; Michael McConnell, a former U.S. federal circuit judge and now Stanford law professor; and Catalina Botero-Marino, former Special Rapporteur for Freedom of Expression of the Organization of American States and now dean of law at the Universidad de los Andes in Colombia.
“Each of our members has chosen to participate in the board because they believe no single company can solve the most challenging online content decisions on its own, and that a new model of platform governance is needed,” Thomas Hughes, director of the oversight board administration, who was appointed in January, said on a call with the media on Wednesday.
Facebook CEO Mark Zuckerberg announced plans for the independent board in 2018 to help the company deal with content moderation decisions that critics had complained were arbitrary, misguided, or a result of political bias. The board is designed to take heat off of Facebook for its decisions and, in theory, give an outside body the final say over removing posts that, for example, may use false information to incite political violence. Zuckerberg has said he is committed to fully abiding by the board’s “binding” rulings.
The board members, who are supposed to operate as an independent unit of Facebook, are technically employed by the Oversight Board. And while the entity is separate from Facebook, it is ultimately financially dependent on the company, which committed $130 million to its funding over six years.
Earlier this year, Facebook released new details about how the oversight board is expected to operate. Now that half of the members are in place, the co-chairs provided more insight about their plans.
The new board members
Board members were selected to represent a broad set of regions and currently hail from Asia Pacific, Central and South Asia, Europe, Latin America, Middle East and North Africa, Sub-Saharan Africa, the U.S., and Canada. They also come from a diverse set of ideological backgrounds, a move that is intended to avoid accusations of political bias, according to the board’s leadership.
“Naturally we do not agree on everything,” said Botero-Marino. “But it’s precisely through discussions among people who think differently that it’s possible to reach decisions that take all points of view seriously.”
The members have expertise in digital rights, religious freedom, content moderation, digital copyright, internet censorship, and civil rights. Among them are a former judge of the European Court of Human Rights, the editor in chief of a major newspaper in Indonesia, a civil rights and social justice advocate, and a prolific Facebook user with millions of followers.
The board expects to add the next 20 members later this year and into next year, said Hughes.
Only a “tiny fraction” of cases will be reviewed
The board expects to get “hundreds of millions” of requests for review, but it will likely handle only a “tiny fraction” of them, said McConnell.
“The sheer volume of decisions that are going to face the oversight board are going to make it impossible for the board to decide every case,” McConnell said. “We’re going to have to select just a few.”
The board expects to create a selection committee, which will choose which cases will be reviewed. The cases that are likely to get priority are those that affect a large number of users, have a major impact on public discourse, or raise significant questions about Facebook policy. This means that though not every case will be reviewed, a decision by the board could ultimately affect similar posts that were submitted for review.
Review and enforcement may be bumpy
Once a specific case is selected for review, it will be passed to a panel of five board members, one of whom is expected to have ties to, or knowledge of, the region where the post originated. After debate, the panel will issue a decision, which is supposed to be disclosed online and in an annual report. Previously released documents show the board has 90 days to make a ruling, and Facebook has seven days to implement it.
But McConnell said this process is somewhat of an “experiment,” and that users will have to be patient about the timeline.
“There’s a lot we don’t know,” he said. “We will make mistakes. All I can promise is we will do our best to learn from our mistakes.”
When it comes to enforcement, the board expects its transparency reports, bylaws, and commitments from Facebook to be enough to ensure that decisions are implemented.
“Facebook would have a very high reputational cost if it were not to carry out the decisions of the body that it itself has created precisely to resolve the thorniest problems,” a translator said on behalf of Botero-Marino, who spoke in Spanish.