Facebook’s survey to assess the trustworthiness of news is only 2 questions long, but the company says that's not an issue (FB)
- Facebook is asking its users to decide how trustworthy news outlets are — and its survey is only two questions long.
- The social network has been repeatedly criticised over the spread of misinformation and hoaxes.
- Facebook argues it will look at how news sources are viewed across different demographics — but it's not clear if "trustworthiness" is the best way to rank them.
We've got our first look at the survey Facebook is going to use to assess how trustworthy news outlets are — and it's only two questions long.
After months of criticism over its role in the spread of misinformation and fake news, Facebook is making sweeping changes to its News Feed. The social network announced last week that rather than judging the reliability of news sources itself, it will ask users, via surveys, how trustworthy they find those sources, and then promote or demote websites in the News Feed ranking accordingly.
BuzzFeed News has now got its hands on the survey — and on the face of it, there's not much to it.
First, it asks: "Do you recognize the following websites?" and gives only two options in response: "yes" or "no."
It then follows up with "How much do you trust each of these domains?" The possible answers are "entirely," "a lot," "somewhat," "barely," and "not at all."
The brevity of the survey has attracted the ire of some critics and journalists. "Trust in news is much more complicated," tweeted Bloomberg reporter Sarah Frier. "How well-sourced is the article? Are other sites verifying it? Is it news or analysis?" She likened the exercise to "a brand awareness survey, like an advertiser would run," and added: "Don't trust any one site 'entirely.'"
Guardian audience editor Dave Earley chimed in: "Come on. This is a joke, right?"
Facebook argues that it will only rank publications higher or lower if people across different groups and demographics agree on how trustworthy a given news source is. Users also can't opt in to the survey; only those Facebook selects can respond, which makes it harder for bad actors to band together and skew the results one way or the other.
On Twitter, Facebook's News Feed boss Adam Mosseri defended the change, writing that "how we incorporate survey data is every bit as important as the specific questions we ask."
He added that Facebook will also track what people read and use that data to weight their responses. "The other important thing to understand is this isn’t a simple vote. We are not just valuing more publishers that a lot of people trust, but rather valuing more publishers that a lot of different types of people (based on reading habits) trust," he wrote.
(Mosseri also admitted Facebook explained the trustworthiness change badly when it was announced, saying: "We should have done a better job explaining this one, we were trying to balance clarity and detail and didn't quite get the balance right. But that's also why we're here and on Facebook answering questions.")
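Facebook hasn't published its actual formula, but the mechanism Mosseri describes (rating publishers within distinct audience groups and acting only when those groups broadly agree) can be illustrated with a rough sketch. Everything below is an assumption for illustration: the numeric mapping of the answer scale, the group labels, the agreement threshold, and the function itself are invented, not Facebook's method.

```python
from statistics import mean, pstdev

# Hypothetical mapping of the survey's five-point answer scale to numbers.
SCALE = {"entirely": 1.0, "a lot": 0.75, "somewhat": 0.5,
         "barely": 0.25, "not at all": 0.0}

def trust_score(responses_by_group):
    """Score a publisher from survey responses bucketed by audience group.

    `responses_by_group` maps a group label (derived, per Mosseri, from
    reading habits) to a list of answers on the survey's scale. Responses
    from users who said they don't recognise the site are assumed to be
    excluded upstream.
    """
    group_means = [mean(SCALE[r] for r in answers)
                   for answers in responses_by_group.values()]
    # "Not a simple vote": only adjust the ranking when different types
    # of people broadly agree, i.e. the per-group averages are close.
    groups_agree = pstdev(group_means) < 0.15  # threshold invented here
    return mean(group_means), groups_agree

score, agree = trust_score({
    "politics_readers": ["a lot", "somewhat", "a lot"],
    "sports_readers":   ["somewhat", "a lot", "entirely"],
})
print(f"score={score:.2f}, groups agree={agree}")
```

The design point is the agreement check: under this reading, a publisher trusted intensely by one cluster of readers but distrusted by others wouldn't move in the ranking at all.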
But weighting based on users' reading habits can't account for reading behaviour that takes place outside the Facebook ecosystem. For example, the binary yes/no answer to the first question doesn't differentiate between someone who has subscribed to the print edition of The Guardian for 40 years and a fringe conspiracy theorist with only the vaguest conception of what the paper publishes.
And because the survey treats publications as monolithic entities, to be judged trustworthy or not in their entirety, outlets that publish a mix of news and non-news content could be penalised.
Take BuzzFeed. The media organisation publishes everything from deeply reported investigations on targeted Russian assassinations on British soil to quizzes about "What Do You Love Based On Your Zodiac Sign?" One of these is a trustworthy, researched news story. The other one is clearly just for fun — but could contribute to a negative overall perception of the website's trustworthiness.
Regardless of the exact phrasing of the survey, there are arguably broader concerns about this approach: Is asking users for their views on trustworthiness really the best way to stamp out misinformation?
In a blog post published last week, Facebook CEO Mark Zuckerberg said that the company wasn't "comfortable" assessing the trustworthiness of news outlets itself, and that asking outside experts wouldn't be "objective." So, he wrote, "we decided that having the community determine which sources are broadly trusted would be most objective."
In other words, Facebook is making a big bet on the wisdom of the crowd. Sometimes, that pays off. But sometimes, the wisdom of the crowd is dead wrong.