Facebook's Latest Fix for Fake News: Ask Users What They Trust

Facebook said it will prioritize news sources by surveying users about their trust in media brands.

Mark Zuckerberg promised to spend 2018 fixing Facebook. Last week, he addressed concerns that Facebook makes you feel bad. Now he’s onto fake news.

Late Friday, Facebook buried another major announcement: how to make sure that users see high-quality news on the platform. Facebook’s solution? Let its users decide what to trust. On the difficult problem of fixing fake news, Zuckerberg took the path with the least responsibility for Facebook, but described it as the most objective.

“We could try to make that decision ourselves, but that's not something we're comfortable with,” Zuckerberg wrote on his Facebook page. “We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. We decided that having the community determine which sources are broadly trusted would be most objective.”

The vetting process will happen through Facebook’s ongoing quality surveys — the same surveys it uses to ask whether Facebook is a force for good in the world and whether the company seems to care about its users. Now, Facebook will ask users if they are familiar with a news source and, if so, whether they trust the source.

According to Zuckerberg, these surveys will help the truth about trustworthiness rise to the top: “The idea is that some news organizations are only trusted by their readers or watchers, and others are broadly trusted across society even by those who don't follow them directly.”

It’s tempting to read a lot into Zuckerberg’s words, especially when the missive was so short on details. The perils are evident: Bad actors can game the survey! This only increases filter bubbles! After the year Facebook just had, how can you possibly think the masses can be objective?

Relying on users “lets them sidestep allegations of bias and take steps to fix it without directly becoming the dreaded ‘arbiter of truth,’” says Renee DiResta, a technologist who studies the manipulation of social-media platforms.

Facebook did not immediately return a request for comment. There’s a good chance the new policy could cause as many problems as it solves. For the best-known media brands, the survey could be a leg up. But what about niche publications with narrow but credible readerships? Are National Review and Slate deemed untrustworthy because they have definitive points of view? Do they get put in the same bucket as Fox and MSNBC? What about BuzzFeed, where fun distractions and deep investigations all show up under the same URL?

Jason Kint, CEO of Digital Content Next, a trade association representing content companies, likes the idea of using brands as a proxy for trust. “But the details are really important,” he says. “What matters most is how this is being messaged. Facebook is clearly scrambling as the industry, Washington and the global community are losing trust in them. There is nothing worse to a company long-term.”

Zuckerberg also seemed to be in scramble mode last week, when Facebook said it was reorienting the newsfeed to show users “meaningful interactions.” Only on Friday, eight days later, did Zuckerberg explain the scope of that change for news publishers: the share of news in Facebook’s newsfeed will drop to 4 percent, from 5 percent.

This isn’t Facebook’s first attempt to address fake news. Its previous effort flopped a few weeks ago: Facebook thought putting “disputed” flags on fake news stories would help, but people only clicked the stories more. Despite Zuckerberg’s reluctance to work with outsiders, experts probably could have warned him about human nature.


The survey strategy may fall prey to the same misunderstanding of people. Chris Tolles, the CEO of the media site Topix, is familiar with the problem. “As a news aggregator, we wrestled with this,” he says. “People who actually share news, news is a weapon, it’s not to inform, it’s to injure. It’s a social-justice identitarian, a person with an ax to grind, or it’s a journalist. They are not sharing news to inform, they are trying to convince you of something. It comes with a point of view.”

The root of the problem, according to Tolles: Trust is not objective. The interpretation of objectivity varies wildly between Democrats and Republicans, and internet users themselves may not be a trustworthy bunch. Zuckerberg’s post also mentioned refocusing on “local” news, which Tolles says is just as fraught. “It’s vicious all the way down to the local crime report. I think that they’ve got an impossible task.”

Last week the company said it was stepping away from news. “This week, they said we’re going to try to do the hardest thing in the world, which is to try to decide which narrative is true,” says Tolles.

