Researchers say Facebook’s anti-fake news efforts might be working

Engagements with fake news could be declining on the site.

Since the 2016 US presidential election, social networks have acknowledged the problem of fake news and their role in spreading it. Companies like Facebook and Twitter have instituted a number of measures aimed at stemming the spread of misinformation and disincentivizing those who spread it. But how effective have those efforts been? Researchers at Stanford University and New York University say that, at least in Facebook's case, they may be working.

The researchers looked at 570 websites that fact-checking groups like PolitiFact and FactCheck have identified as producers of false news stories, and they gathered the monthly Facebook engagements and Twitter shares of the articles those sites published between January 2015 and July 2018. They found that in the run-up to the 2016 election, fake news interactions increased on both platforms. A few months after the election, however, Facebook engagements fell by more than 50 percent while Twitter shares continued to rise. By comparison, interactions with major news sites, small news sites, and business and culture sites remained largely stable over the same period. That suggests Facebook's efforts have had a measurable effect on the spread of fake news.

The team makes clear that its findings are not definitive, and it notes a few limitations that have to be taken into account when interpreting the results. First, the choice of websites could be prone to selection bias, and it is almost certainly not a comprehensive collection. Further, stories from the chosen sites could be more likely to draw interactions on one social network than the other. And because PolitiFact and FactCheck work with Facebook on its fake news efforts, the sites they flag as peddlers of fake news could be exactly the ones Facebook is aware of and pointedly addressing, while others not included in the study could be slipping by.

However, the researchers also found that the ratio of Facebook engagements to Twitter shares changed over time for the fake news sites but not for the other types of websites. During the election, the Facebook-to-Twitter engagement ratio was 45:1; two years later, it had declined to 15:1. "We see the comparison of Facebook engagements to Twitter shares as potentially more informative," the researchers write. "The fact that Facebook engagements and Twitter shares follow similar trends prior to late 2016 and for the non-fake-news sites in our data, but diverge sharply for fake news sites following the election, suggests that some factor has slowed the relative diffusion of misinformation on Facebook. The suite of policy and algorithmic changes made by Facebook following the election seems like a plausible candidate."
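To make the ratio comparison concrete, here is a minimal sketch of how a monthly Facebook-to-Twitter interaction ratio could be computed from per-site engagement counts. The column names and sample figures below are hypothetical illustrations chosen to reproduce the 45:1 and 15:1 figures, not the study's actual data or code.

```python
# Hypothetical sketch: computing a monthly Facebook-to-Twitter
# interaction ratio like the one the researchers report.
# Column names and sample figures are illustrative, not the study's data.
import pandas as pd

# One row per (month, site): total Facebook engagements and Twitter shares.
data = pd.DataFrame({
    "month": ["2016-11", "2016-11", "2018-07", "2018-07"],
    "site": ["fake-a.example", "fake-b.example"] * 2,
    "fb_engagements": [900_000, 450_000, 300_000, 150_000],
    "tw_shares": [20_000, 10_000, 20_000, 10_000],
})

# Aggregate across sites, then take the ratio of the monthly totals.
monthly = data.groupby("month")[["fb_engagements", "tw_shares"]].sum()
monthly["fb_to_tw_ratio"] = monthly["fb_engagements"] / monthly["tw_shares"]
print(monthly["fb_to_tw_ratio"])
# 2016-11 -> 45.0, 2018-07 -> 15.0 in this toy data. A decline in this
# ratio for fake news sites, but not for other sites, is the divergence
# the researchers point to.
```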

But even if engagement with fake news is declining on Facebook, it remains high, the researchers note. As of July 2018, the fake news sites in the study's sample were still drawing around 70 million Facebook engagements per month. Others have noted that fake news remains an issue for the platform as well.

Facebook's efforts may or may not be having an effect, but this study presents some evidence that they could be. If so, that would be good news for those concerned about misinformation, and certainly for Facebook.