Facebook on Tuesday revealed how it plans to handle disputes over its content decisions, detailing how its purportedly independent “Oversight Board” will operate.
Facebook users who object to a content decision by the company or the algorithms it uses to approve or remove content, and who have exhausted appeals, will be able to request a review by the board, according to a board charter released Tuesday.
The board will select which complaints to address, focusing on “cases that have the greatest potential to guide future decisions and policies,” the charter said. Decisions will be made in accordance with “Facebook’s content policies and values.”
Board decisions will be binding on Facebook, and will also set precedents for future decisions by the board, whose members’ names will be made public, the company said in the charter.
“It’s a really great first step and it’s certainly better than the absence of doing anything, as has been happening,” said Kate Klonick, an affiliate fellow at Yale Law School’s Information Society Project who was given access to the board-creation process. “I don’t know how great it’s going to be in the long term. It might never take off. It might never gain legitimacy. People might not buy into it. It might be stymied by issues we can’t see right now.”
Facebook has been beset by controversies over content moderation, from claims by Republican U.S. senators and President Donald Trump that it’s biased against conservatives to allegations that groups have used its platform to spread hate and promote genocide. On Wednesday, the company, along with Google and Twitter, is scheduled to testify before a Senate committee in a hearing titled “Mass Violence, Extremism, and Digital Responsibility.”
Members of the new Oversight Board can decide to allow or remove content, and also uphold or reverse content decisions such as whether a post needs a warning message about graphic violence, according to the charter.
Decisions are to be implemented “promptly” and will be made public and archived on the board’s website “subject to data and privacy restrictions,” the charter said. Facebook itself can also ask for reviews, the charter said.
The board, 11 people to start and up to 40 eventually, is to be funded by Facebook and governed by Facebook-selected trustees who, the Menlo Park social media giant says, will be independent. Those trustees will appoint board members, or remove them if they break the board’s conduct code, the charter said.
The charter opens by addressing the fundamental problem facing social media companies: how to reconcile freedom of expression with the damage such expression can cause. “There are times when speech can be at odds with authenticity, safety, privacy, and dignity,” the charter said. “Some expression can endanger other people’s ability to express themselves freely.” When examining cases, the board “will pay particular attention to the impact of removing content in light of human rights norms protecting free expression.”
In a news release Tuesday, company CEO Mark Zuckerberg did not promise an immediate revolution in how Facebook responds to content-related appeals. “We expect the board will only hear a small number of cases at first, but over time we hope it will expand its scope,” Zuckerberg said.
A series of scandals led Facebook to the idea of an independent board to review content choices, Klonick said. “The entire reason they created this in the first place is because they were having a problem with long-term user trust,” she said.