The limits of “prebunking” in the fight against misinformation

A study suggests that inoculating internet users against misinformation might be more successful than fact checking later, but Center strategic advisor Brad Berens is not optimistic that this will help much in the fight for truth in journalism.

By Brad Berens

A new study in the journal Science Advances, “Psychological inoculation improves resilience against misinformation on social media,” suggests that “prebunking” online misinformation is more effective, cheaper, and more scalable than trying to debunk misinformation via fact checking once people have already seen it. Prebunking tries to inoculate audiences ahead of time, like a vaccine, while debunking fights misinformation after exposure, like antibiotics.

The study provoked a small flurry of articles and posts.

The study finds that people who watch videos explaining various misinformation techniques (emotionally manipulative language, incoherence, false dichotomies, scapegoating, and ad hominem attacks) are less susceptible to those techniques when they later encounter them.

This seems like happy news because it means that if we can flood the internet with witty videos explaining how to spot misinformation, we can stop misinformation from spreading.

I am not so optimistic for two reasons.

First, the study confines itself to online misinformation, but despite the intense efforts of digital platforms, people don’t only live online. If a viewer drives around listening to talk radio or has a partisan media provider (on either side of the political spectrum) chattering in the background for hours each day, then there’s not a lot a 90-second YouTube video can do.

If you don’t know what I mean by “partisan media provider,” then a) where have you been for the last few years? and b) check out the bottom half of the Media Bias Chart by Ad Fontes Media (disclosure: I’m an investor and advisor).

Second, the study presumes that people care more about truth than they actually do.

Pushing gently on the study’s inoculation analogy reveals the weakness of the optimistic conclusion: we have powerful, effective, and safe COVID vaccines, yet many people refuse to take them because validating their tribal identities is more important to them than the truth. The more partisan the viewer, the less accuracy matters to that viewer.

Another way of putting this is that when you run up against evidence that contradicts your opinion, you’re more likely to discount the evidence than the opinion. Despite the clear Surgeon General’s warning on every package of cigarettes, a smoker can discount the medical evidence because the smoker’s Great Uncle Bob smoked two packs a day and lived to 100.

Psychologists call this cognitive dissonance, “the hard-wired psychological mechanism that creates self-justification and protects our certainties, self-esteem, and tribal affiliations” (page 11 of this book).

We should follow the study’s findings and increase the number of public service announcements about misinformation, but that does not let YouTube and Facebook and Instagram and TikTok off the hook for being accelerants of misinformation. Nor does it let the governments of the world off the hook when it comes to regulating misinformation.

On the individual level, before you believe something you read online, and especially before you share it with other people, think about why you want to share it. (And check out the Ad Fontes media literacy curriculum while you’re at it.)
__________


Brad Berens is the Center’s strategic advisor and a senior research fellow. He is principal at Big Digital Idea Consulting. You can learn more about Brad at www.bradberens.com, follow him on Twitter, and subscribe to his weekly newsletter (only some of his columns are syndicated here).


See all columns from the Center.

September 14, 2022