Understanding the vulnerabilities of our own brains can help us guard against fake news.

This article is part of the magazine, "The Future of Science In America: The Election Issue," co-published by LeapsMag, the Aspen Institute Science & Society Program, and GOOD.

Whenever you hear something repeated, it feels more true. In other words, repetition makes any statement seem more accurate. So anything you hear again will resonate more each time it's said.

Do you see what I did there? Each of the three sentences above conveyed the same message. Yet each time you read the next sentence, it felt more and more true. Cognitive neuroscientists and behavioral economists like me call this the "illusory truth effect."

Go back and recall your experience reading the first sentence. It probably felt strange and disconcerting, perhaps with a note of resistance, as in "I don't believe things more if they're repeated!"

Reading the second sentence did not inspire such a strong reaction. Your reaction to the third sentence was tame by comparison.

Why? Because of a phenomenon called "cognitive fluency," meaning how easily we process information. Much of our vulnerability to deception in all areas of life—including to fake news and misinformation—revolves around cognitive fluency in one way or another. And unfortunately, such misinformation can swing major elections.

Gleb Tsipursky
Dr. Gleb Tsipursky is an internationally recognized thought leader on a mission to protect leaders from dangerous judgment errors known as cognitive biases by developing the most effective decision-making strategies. A best-selling author, he wrote Resilience: Adapt and Plan for the New Abnormal of the COVID-19 Coronavirus Pandemic and Pro Truth: A Practical Plan for Putting Truth Back Into Politics. His expertise comes from over 20 years of consulting, coaching, speaking, and training as the CEO of Disaster Avoidance Experts, and over 15 years in academia as a behavioral economist and cognitive neuroscientist. He co-founded the Pro-Truth Pledge project.
A group of protesters march for science.

( © afishman64/Adobe)


You read an online article about climate change, then start scanning the comments on Facebook. Right on cue, Seth the Science Denier chimes in with:

"Humans didn't cause this. Climate is always changing. The earth has always had cycles of warming and cooling—what's happening now isn't new. The idea that humans are causing something that happened long before humans were even around is absurd."

You know he's wrong. You recognize the fallacy in his argument. Do you take the time to engage with him, or write him off and move along?

New research suggests that countering science deniers like Seth is important—not necessarily to change their minds, but to keep them from influencing others.

Looking at Seth's argument, someone without much of a science background might think it makes sense. After all, climate is always changing. The earth has always gone through cycles, even before humans. Without a scientifically sound response, a reader may begin to doubt that human-caused climate change is really a thing.

A study published in Nature Human Behaviour found that science deniers whose arguments go unchallenged can harm other people's attitudes toward science. Many people read discussions without actively engaging themselves, and some may not recognize erroneous information when they see it. Without someone to point out how a denier's statements are false or misleading, people are more likely to be influenced by the denier's arguments.

Researchers tested two strategies for countering science denial—by topic (presenting the facts) and by technique (addressing the illogical argument). Rebutting a science denier with facts and pointing out the fallacies in their arguments both had a positive effect on audience attitudes toward legitimate science. A combination of topic and technique rebuttals also had a positive effect.

"In the light of these findings we recommend that advocates for science train in topic and technique rebuttal," the authors wrote. "Both strategies were equally effective in mitigating the influence of science deniers in public debates. Advocates can choose which strategy they prefer, depending on their levels of expertise and confidence."

So what does that look like? If we were to counter Seth's statements with a topic rebuttal, focusing on facts, it might look something like this:

Yes, climate has always changed due to varying CO2 levels in the atmosphere. Scientists have tracked that data. But they also have data showing that human activity, such as burning fossil fuels, has dramatically increased CO2 levels. Climate change is now happening at a rate that isn't natural and is dangerous for life as we know it.

A technique rebuttal might focus on how Seth is using selective information and leaving out important facts:

Climate has always changed, that's true. But you've omitted important information about why it changes and what's different about the changes we're seeing now.

Ultimately, we could combine the two techniques in something like this:

Climate has always changed, but you've omitted important information about why it changes and what's different about what we're seeing now. Levels of CO2 in the atmosphere are largely what drives natural climate change, but human activity has increased CO2 beyond natural levels. That's making climate change happen faster than it should, with devastating effects for life on Earth.

Remember that the point is not to convince Seth, though it's great if that happens. Who you're really addressing are the lurkers who might be swayed by misinformation if it isn't countered by truth.

It's a wacky world out there, science lovers. Keep on fighting the good fight.

Annie Reneau
Annie is a writer, wife, and mother of three with a penchant for coffee, wanderlust, and practical idealism. On good days, she enjoys the beautiful struggle of maintaining a well-balanced life. On bad days, she binges on chocolate and dreams of traveling the world alone.

A new video game exposes players to the deceptive tactics that are used to spread misinformation online, in an attempt to reduce people's susceptibility to fake news.

(© MicroOne/Adobe)


There's no shortage of fake news going around the internet these days, but how do we become more aware as consumers of what's real and what's not?

Researchers at the University of Cambridge may have an answer: an online game designed to expose participants to the tactics used by those who spread false information, and to educate them about how those tactics work.

"We wanted to see if we could preemptively debunk, or 'pre-bunk', fake news by exposing people to a weak dose of the methods used to create and spread disinformation, so they have a better understanding of how they might be deceived," Dr. Sander van der Linden, Director of the Cambridge Social Decision-Making Lab, said in a statement.

"This is a version of what psychologists call 'inoculation theory', with our game working like a psychological vaccination."

In February 2018, van der Linden and his coauthor, Jon Roozenbeek, helped launch the browser game "Bad News," in which players take on the role of "Disinformation and Fake News Tycoon."

Within the game, players manipulate news and social media using several methods, including deploying Twitter bots, photoshopping evidence, creating fake accounts, and inciting conspiracy theories, all with the goal of attracting followers while maintaining a "credibility score" for persuasiveness.

In order to gauge the game's effectiveness, players were asked to rate the reliability of a number of real and fake news headlines and tweets both before and after playing. The data from 15,000 players was evaluated, with the results published June 25 in the journal Palgrave Communications.

The researchers reported that "the perceived reliability of fake news before playing the game had reduced by an average of 21% after completing it. Yet the game made no difference to how users ranked real news."

Additionally, participants who "registered as most susceptible to fake news headlines at the outset benefited most from the 'inoculation,'" according to the study.

Just 15 minutes of playing the game can have a moderate effect on people, which could play a major role on a larger scale when it comes to "building a societal resistance to fake news," according to Dr. van der Linden.

"Research suggests that fake news spreads faster and deeper than the truth, so combating disinformation after-the-fact can be like fighting a losing battle," he said.

"We are hoping to create what you might call a general 'vaccine' against fake news, rather than trying to counter each specific conspiracy or falsehood," Roozenbeek added.

Van der Linden and Roozenbeek's work is an early example of how people might be protected against deception by training them to recognize the methods used to spread fake news.

"I hope that the positive results give further credence to the new science of prebunking rather than only thinking about traditional debunking. On a larger level, I also hope the game and results inspire a new kind of behavioral science research where we actively engage with people and apply insights from psychological science in the public interest," van der Linden told leapsmag.

"I like the idea that the end result of a scientific theory is a real-world partnership and practical tool that organizations and people can use to guard themselves against online manipulation techniques in a novel and hopefully fun and engaging manner."

Ready to be "inoculated" against fake news? Then play the game for yourself.

Michelle Gant
Michelle Gant is an editor at GOOD Worldwide. She was previously at Fox, StyleCaster, Domino, SELF, and CBS.