Far-Right Misinformation Is Thriving On Facebook. A New Study Shows Just How Much
Research from New York University found that far-right accounts known for spreading misinformation drive engagement at higher rates than other news sources.
Transcript
MICHEL MARTIN, HOST:
We've talked a lot in recent years about misinformation and about how it spreads online, but we have new information about that. You might remember that Facebook has promised repeatedly in recent years to address the spread of conspiracy theories and misinformation on its site. But a new study from researchers at New York University shows that far-right accounts known for spreading misinformation are not only thriving on Facebook - they are actually more successful than other kinds of accounts at getting likes, shares and other forms of user engagement. Laura Edelson helped lead that research. She is part of Cybersecurity for Democracy, a group based at NYU that's studying online misinformation. And she's with us now.
Laura Edelson, thank you so much for being with us.
LAURA EDELSON: Great to be here.
MARTIN: And I do want to note that Facebook is among NPR's financial supporters. With that being said, could you walk us through these findings in nonexpert terms? As briefly as you can, what question was your research team looking at, and what did you find?
EDELSON: Absolutely. So after the events of the last few months, we really wanted to understand how different types of news media engaged with their audiences on Facebook. So we got third-party evaluations of news quality and partisanship, and we combined that with Facebook data about engagement. And what we found is that overall, far-right news sources have much more engagement with their audiences than other partisan categories. But most of that edge comes from sources with a reputation for spreading misinformation.
So on the far right, misinformation sources outperformed more reputable sources by quite a bit. But for all other partisan categories, including slightly right, the reverse was true. Sources with a reputation for spreading misinformation performed worse, and usually significantly so. And we call that effect a misinformation penalty.
MARTIN: So when you talk about a far-right news source, do you feel comfortable giving us an example that we might recognize? I know what I certainly think of, but what are you thinking of?
EDELSON: So some of the top-performing far-right news sources in our data set were things like Newsmax, Breitbart - that kind of media source.
MARTIN: And you and your colleagues say that far-right content is the only partisan leaning in which misinformation actually drives more engagement, and there is none of what you call a misinformation penalty. Could you just talk a little bit more about that? So a misinformation penalty is what? Is that if you are demonstrated to be inaccurate or false, then what? People who are on the left side of the ledger would give you less credibility. Is that it?
EDELSON: So we can't say exactly why it's happening. But what we see is that for left-leaning sources, for center sources, and even slightly right, the sources that have a reputation for spreading misinformation just don't engage as well. There could be a variety of reasons for that. But certainly, the simplest explanation would be that users don't find them as credible and don't want to engage with them.
MARTIN: But you're saying that misinformation actually drives more engagement with far-right content, which is remarkable.
EDELSON: Yeah. The effect was quite striking because it's not a small edge, either. It's almost twice as much engagement, you know, per follower among the sources that have a reputation for spreading misinformation. So clearly, that portion of the news ecosystem is just behaving very differently.
MARTIN: And it's my understanding that Facebook responded by saying engagement isn't the same as how many people actually see a piece of content. So perhaps you could talk a little bit more about that. Like, what do we know about how Facebook promotes content? And what do you make of their response?
EDELSON: Well, we really don't know that much about how Facebook promotes content. We know that engagement is part of what drives Facebook's algorithm for promoting content, but they really don't make a lot of information about that available. Frankly, I would love for Facebook to make the data available that backs this assertion, but they don't make it public. And this is where I just think Facebook can't have it both ways. They can't say that their data leads to a different conclusion but then not make that data public.
MARTIN: I recognize that the purpose of the study is to analyze what is as opposed to, say, what should be. But does your team have recommendations? Because it sounds like - I mean, and I understand exactly what you're saying - there is not sort of publicly available data from Facebook that would help us understand why far-right misinformation drives more engagement.
But it sounds from what you're telling us that people seek this stuff out and believe it because they want to. They want to seek it out, and they want to engage with it. So if that's the case, do you have recommendations about that?
EDELSON: I think what's very clear is that Facebook has a misinformation problem. I think any system that attempts to promote the most engaging content, from what we can tell, will wind up promoting misinformation. And just to pull out one portion of our data that I know I was really concerned about when I saw it is, you know, of course, we saw a spike of engagement with news content on January 6. That's to be expected. The thing was that most of that spike was concentrated among the partisan extremes and misinformation providers.
And when I really sit back and think about that, I think the idea that on a day like that, which was so scary and so uncertain, that the most extreme and least reputable sources were the ones that Facebook users were engaging with is pretty troubling. And I think those are the kinds of circumstances where especially Facebook has a responsibility to its users and to the wider public to do a better job of stopping misinformation from spreading. And I think, you know, that's true every day.
MARTIN: That was Laura Edelson. She's a Ph.D. candidate at New York University and a researcher with Cybersecurity for Democracy.
Laura Edelson, thank you so much for being with us and sharing this work with us.
EDELSON: Thanks for having me.
(SOUNDBITE OF MISTY SAPPHIRE'S "BLAZO") Transcript provided by NPR, Copyright NPR.