Facebook on Tuesday revealed how it plans to handle disputes over its content decisions, detailing how its purportedly independent “Oversight Board” will operate.
Facebook users who object to a content decision, whether made by the company or by the algorithms it uses to approve or remove content, and who have exhausted Facebook’s internal appeals process will be able to request a review by the board, according to a board charter released Tuesday.
The board will select which complaints to address, focusing on “cases that have the greatest potential to guide future decisions and policies,” the charter said. Decisions will be made in accordance with “Facebook’s content policies and values.”
Board decisions will be binding on Facebook and will set precedents for the board’s future rulings; board members’ names will be made public, the company said in the charter.
“It’s a really great first step and it’s certainly better than the absence of doing anything, as has been happening,” said Kate Klonick, an affiliate fellow at Yale Law School’s Information Society Project who was given access to the board-creation process. “I don’t know how great it’s going to be in the long term. It might never take off. It might never gain legitimacy. People might not buy into it. It might be stymied by issues we can’t see right now.”
Facebook has been beset by controversies over content moderation, from claims by Republican U.S. senators and President Donald Trump that it’s biased against conservatives to allegations that groups have used its platform to spread hate and promote genocide. On Wednesday, the company, along with Google and Twitter, is scheduled to testify before a Senate committee in a hearing titled “Mass Violence, Extremism, and Digital Responsibility.”
Members of the new Oversight Board can decide whether content is allowed or removed, and can uphold or reverse related content decisions, such as whether a post needs a warning message about graphic violence, according to the charter.
Decisions are to be implemented “promptly” and will be made public and archived on the board’s website “subject to data and privacy restrictions,” the charter said. Facebook itself can also ask for reviews, the charter said.
The board, 11 people to start and up to 40 eventually, is to be funded by Facebook and governed by Facebook-selected trustees who, the Menlo Park social media giant says, will be independent. Those trustees will appoint board members, and can remove them if they break the board’s code of conduct, the charter said.
The charter opens by addressing the fundamental problem facing social media companies: how to reconcile freedom of expression with the damage such expression can cause. “There are times when speech can be at odds with authenticity, safety, privacy, and dignity,” the charter said. “Some expression can endanger other people’s ability to express themselves freely.” When examining cases, the board “will pay particular attention to the impact of removing content in light of human rights norms protecting free expression.”
In a news release Tuesday, company CEO Mark Zuckerberg did not promise an immediate revolution in how Facebook responds to content-related appeals. “We expect the board will only hear a small number of cases at first, but over time we hope it will expand its scope,” Zuckerberg said.
A series of scandals led Facebook to the idea of an independent board to review content choices, Klonick said. “The entire reason they created this in the first place is because they were having a problem with long-term user trust,” she said. She said Facebook appears to be committed to enabling content-related decision-making that isn’t based on its bottom line, adding that the company has “sunk an insane amount of resources” into the project.
Once the first 11 members are in place, the board can start hearing cases; Klonick said the company believes that could happen as early as the first quarter of next year.
Once a case is chosen, the board will notify the complainant, whoever originally posted the content, and Facebook, the charter said. Each case will be reviewed by a panel of board members organized by the board’s staff, the charter said.
In each case, Facebook will give the board information “that is reasonably required for the board to make a decision,” though the scope of that information will be subject to legal and privacy restrictions, the charter said. The Facebook user who posted the content, or the person who reported it, will be able to submit written statements to the board, according to the charter. The board can also consult outside experts.
Board decisions will be made by consensus, when possible, or otherwise by majority, the charter said.
Facebook retains the power to decline to implement board decisions if implementation could break the law in the U.S. or another jurisdiction, Klonick said. If the board makes a decision about a particular piece of content, and identical content with similar context exists elsewhere on the platform, Facebook will analyze “whether it is technically and operationally feasible to apply the board’s decision to that content as well,” the charter said.