Facebook will pass off content policy appeals to a new independent oversight body
Facebook doesn’t want to be the arbiter of decency when it comes to content policy decisions, much as it turned to third-party fact-checkers rather than become an arbiter of truth. Today on a press call with journalists, Mark Zuckerberg announced that a new external oversight body will be created in 2019 to handle some of Facebook’s content policy decisions. The body will take appeals and make final decisions. The hope is that, free from the influence of Facebook’s business imperatives and insulated from public skepticism about the company’s internal choices, the oversight body can reach the right conclusions about how to handle false information, calls to violence, hate speech, harassment and other problems that flow through Facebook’s user-generated content network.
“I believe the world is better off when more people have a voice to share their experiences . . . at the same time we have a responsibility to keep people safe,” Zuckerberg said. “When you connect 2 billion people, you’re going to see all the good and bad of humanity. Different cultures have different norms, not only about what content is okay, but also about who should be making those decisions in the first place.” He cited deciding whether a racial slur is being used as hate speech or to condemn hate speech as an example of the kind of judgment call Facebook could use help with.
Zuckerberg explained that over the past year he’s come to believe that so much power over free expression should not be concentrated solely in Facebook’s hands. That echoes his sentiment from an interview with Ezra Klein earlier this year when he suggested Facebook may need a “supreme court” to decide on controversial issues. Zuckerberg says he sees Facebook’s role as more akin to how a government is expected to reduce crime but not necessarily eliminate it entirely. “Our goal is to err on the side of giving people a voice while preventing real-world harm,” he writes. “These are not problems you fix, but issues where you continually improve.”
How the independent appeals body will work
Zuckerberg explained that when someone reports content, Facebook’s systems will do the first level of review. If the person wants an appeal, Facebook will also handle that second level of review and is scaling up its systems to handle a large volume of cases. Then, he says, “The basic approach is going to be if you’re not happy after getting your appeal answered, you can try to appeal to this broader body. It’s probably not going to review every case like some of the higher courts . . . it might be able to choose which cases it thinks are incredibly important to look at. It will certainly need to be transparent about how it’s making those decisions.”
Zuckerberg said Facebook will work to get the oversight body up and running over the next year. For now, there are plenty of unanswered questions: who will sit on the committee, which of the many appeals it will review and what ensures it is truly independent of Facebook’s power. “One of the biggest questions we need to figure out in the next year is how to do the selection process for this body so that it’s independent . . . while giving people a voice . . . and keeping people safe. If the group ends up too tightly decided by Facebook it won’t feel like it’s independent enough.” Facebook plans to consult experts and run pilots over the next year to determine which approaches to codify.
Facebook launched an internal appeals system this year that lets users request a second review when their own content is taken down, and it plans to expand that system so people can appeal the responses they get when they report other people’s content. The new independent body will serve as the final level of escalation for appeals.
[Update: Since we published this report, Zuckerberg has released a 5,000-word letter describing his thoughts on Facebook policy and the oversight body. You can read it below.]
Here’s the passage about the oversight committee:
In the next year, we’re planning to create a new way for people to appeal content decisions to an independent body, whose decisions would be transparent and binding. The purpose of this body would be to uphold the principle of giving people a voice while also recognizing the reality of keeping people safe.
I believe independence is important for a few reasons. First, it will prevent the concentration of too much decision-making within our teams. Second, it will create accountability and oversight. Third, it will provide assurance that these decisions are made in the best interests of our community and not for commercial reasons.
This is an incredibly important undertaking — and we’re still in the early stages of defining how this will work in practice. Starting today, we’re beginning a consultation period to address the hardest questions, such as: how are members of the body selected? How do we ensure their independence from Facebook, but also their commitment to the principles they must uphold? How do people petition this body? How does the body pick which cases to hear from potentially millions of requests? As part of this consultation period, we will begin piloting these ideas in different regions of the world in the first half of 2019, with the aim of establishing this independent body by the end of the year.
Over time, I believe this body will play an important role in our overall governance. Just as our board of directors is accountable to our shareholders, this body would be focused only on our community. Both are important, and I believe will help us serve everyone better over the long term.
Avoiding or acknowledging the weight of its decisions?
The past year has seen Facebook criticized for how it handled calls for violence in Myanmar, harassment and fake news by conspiracy theorists like Alex Jones, election interference by Russian, Iranian and other state actors, and more. Most recently, The New York Times published a scathing report about how Facebook tried to distract from or deflect criticism of its myriad problems, including its failure to prevent election interference ahead of the 2016 presidential race.
The oversight committee could both help Facebook make smarter decisions that the world can agree with and give Facebook a stronger defense against this criticism, since it would no longer be the one making the final policy calls. The approach could be seen as Facebook shirking its responsibility, or as an acknowledgment that the gravity of that responsibility exceeds its own capabilities.
You can listen to the entire press call here (I apologize for the keyboard sounds).
[Update: We’ve updated this story with information from Zuckerberg’s Blueprint letter.]