Misinformation

Understanding the vulnerabilities of our own brains can help us guard against fake news.

This article is part of the magazine "The Future of Science In America: The Election Issue," co-published by LeapsMag, the Aspen Institute Science & Society Program, and GOOD.

Whenever you hear something repeated, it feels more true. In other words, repetition makes any statement seem more accurate. So anything you hear again will resonate more each time it's said.

Do you see what I did there? Each of the three sentences above conveyed the same message. Yet each time you read the next sentence, it felt more and more true. Cognitive neuroscientists and behavioral economists like myself call this the "illusory truth effect."

Go back and recall your experience reading the first sentence. It probably felt strange and disconcerting, perhaps with a note of resistance, as in "I don't believe things more if they're repeated!"

Reading the second sentence did not inspire such a strong reaction. Your reaction to the third sentence was tame by comparison.

Why? Because of a phenomenon called "cognitive fluency," meaning how easily we process information. Much of our vulnerability to deception in all areas of life—including to fake news and misinformation—revolves around cognitive fluency in one way or another. And unfortunately, such misinformation can swing major elections.
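
For readers who want the mechanism written down, here is a toy model of the illusory truth effect, sketched in Python. The log-shaped fluency boost and every number in it are invented for illustration; actual studies measure truth ratings empirically rather than computing them.

```python
import math

def perceived_truth(base_plausibility, repetitions, fluency_gain=0.08):
    """Toy model: each repetition makes a statement easier to process
    (more 'fluent'), nudging perceived truth upward with diminishing
    returns. The functional form and parameters are invented."""
    fluency_boost = fluency_gain * math.log1p(repetitions)
    return min(1.0, base_plausibility + fluency_boost)

# A dubious claim starting at 40% perceived truth creeps upward
# through repetition alone, with no new evidence added.
for n in (0, 1, 2, 5, 10):
    print(f"heard {n} extra time(s): feels {perceived_truth(0.40, n):.0%} true")
```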

Gleb Tsipursky
Dr. Gleb Tsipursky is an internationally recognized thought leader on a mission to protect leaders from dangerous judgment errors known as cognitive biases by developing the most effective decision-making strategies. A best-selling author, he wrote Resilience: Adapt and Plan for the New Abnormal of the COVID-19 Coronavirus Pandemic and Pro Truth: A Practical Plan for Putting Truth Back Into Politics. His expertise comes from over 20 years of consulting, coaching, and speaking and training as the CEO of Disaster Avoidance Experts, and over 15 years in academia as a behavioral economist and cognitive neuroscientist. He co-founded the Pro-Truth Pledge project.

A group of protesters march for science.

( © afishman64/Adobe)


You read an online article about climate change, then start scanning the comments on Facebook. Right on cue, Seth the Science Denier chimes in with:

"Humans didn't cause this. Climate is always changing. The earth has always had cycles of warming and cooling—what's happening now isn't new. The idea that humans are causing something that happened long before humans were even around is absurd."

You know he's wrong. You recognize the fallacy in his argument. Do you take the time to engage with him, or write him off and move along?

New research suggests that countering science deniers like Seth is important—not necessarily to change their minds, but to keep them from influencing others.

Looking at Seth's argument, someone without much of a science background might think it makes sense. After all, climate is always changing. The earth has always gone through cycles, even before humans. Without a scientifically sound response, a reader may begin to doubt that human-caused climate change is really a thing.

A study published in Nature found that science deniers whose arguments go unchallenged can harm other people's attitudes toward science. Many people read discussions without actively engaging themselves, and some may not recognize erroneous information when they see it. Without someone to point out how a denier's statements are false or misleading, people are more likely to be influenced by the denier's arguments.

Researchers tested two strategies for countering science denial—by topic (presenting the facts) and by technique (addressing the illogical argument). Rebutting a science denier with facts and pointing out the fallacies in their arguments both had a positive effect on audience attitudes toward legitimate science. A combination of topic and technique rebuttals also had a positive effect.

"In the light of these findings we recommend that advocates for science train in topic and technique rebuttal," the authors wrote. "Both strategies were equally effective in mitigating the influence of science deniers in public debates. Advocates can choose which strategy they prefer, depending on their levels of expertise and confidence."

So what does that look like? If we were to counter Seth's statements with a topic rebuttal, focusing on facts, it might look something like this:

Yes, climate has always changed due to varying CO2 levels in the atmosphere. Scientists have tracked that data. But they also have data showing that human activity, such as burning fossil fuels, has dramatically increased CO2 levels. Climate change is now happening at a rate that isn't natural and is dangerous for life as we know it.

A technique rebuttal might focus on how Seth is using selective information and leaving out important facts:

Climate has always changed, that's true. But you've omitted important information about why it changes and what's different about the changes we're seeing now.

Ultimately, we could combine the two techniques in something like this:

Climate has always changed, but you've omitted important information about why it changes and what's different about what we're seeing now. Levels of CO2 in the atmosphere are largely what drives natural climate change, but human activity has increased CO2 beyond natural levels. That's making climate change happen faster than it should, with devastating effects for life on Earth.

Remember that the point is not to convince Seth, though it's great if that happens. Who you're really addressing are the lurkers who might be swayed by misinformation if it isn't countered by truth.

It's a wacky world out there, science lovers. Keep on fighting the good fight.

Annie Reneau
Annie is a writer, wife, and mother of three with a penchant for coffee, wanderlust, and practical idealism. On good days, she enjoys the beautiful struggle of maintaining a well-balanced life. On bad days, she binges on chocolate and dreams of traveling the world alone.

A new video game exposes players to the deceptive tactics that are used to spread misinformation online, in an attempt to reduce people's susceptibility to fake news.

(© MicroOne/Adobe)


There's no shortage of fake news going around the internet these days, but how do we become more aware as consumers of what's real and what's not?

"We are hoping to create what you might call a general 'vaccine' against fake news, rather than trying to counter each specific conspiracy or falsehood."

Researchers at the University of Cambridge may have an answer. They developed an online game designed to expose participants to, and educate them about, the tactics used by those spreading false information.

"We wanted to see if we could preemptively debunk, or 'pre-bunk', fake news by exposing people to a weak dose of the methods used to create and spread disinformation, so they have a better understanding of how they might be deceived," Dr Sander van der Linden, Director of the Cambridge Social Decision-Making Lab, said in a statement.

"This is a version of what psychologists call 'inoculation theory', with our game working like a psychological vaccination."

In February 2018, van der Linden and his coauthor, Jon Roozenbeek, helped launch the browser game "Bad News," in which players take on the role of "Disinformation and Fake News Tycoon."

They can manipulate news and social media within the game through several different methods, including deploying Twitter bots, photoshopping evidence, creating fake accounts, and inciting conspiracy theories, all with the goal of attracting followers while maintaining a "credibility score" for persuasiveness.

In order to gauge the game's effectiveness, players were asked to rate the reliability of a number of real and fake news headlines and tweets both before and after playing. The data from 15,000 players was evaluated, with the results published June 25 in the journal Palgrave Communications.

The study found that "the perceived reliability of fake news before playing the game had reduced by an average of 21% after completing it. Yet the game made no difference to how users ranked real news."
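
To make that measurement concrete, here is a minimal sketch of how such a pre/post comparison could be computed, using Python and ratings invented for illustration; the study's actual data, scale, and analysis are more involved.

```python
# Toy pre/post analysis in the spirit of the "Bad News" evaluation.
# All ratings below are invented; the real study analyzed responses
# from roughly 15,000 players.

def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical reliability ratings (1 = not reliable, 7 = very reliable)
# for the same fake headlines before and after playing the game.
fake_before = [5.1, 4.8, 5.4, 4.9, 5.2]
fake_after = [4.0, 3.9, 4.3, 3.8, 4.1]

# Ratings for real headlines before and after.
real_before = [5.6, 5.9, 5.7, 6.0, 5.8]
real_after = [5.7, 5.8, 5.7, 5.9, 5.8]

fake_drop = (mean(fake_before) - mean(fake_after)) / mean(fake_before)
real_drop = (mean(real_before) - mean(real_after)) / mean(real_before)

print(f"fake news felt {fake_drop:.0%} less reliable after playing")
print(f"real news changed by only {real_drop:.0%}")
```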

Additionally, participants who "registered as most susceptible to fake news headlines at the outset benefited most from the 'inoculation,'" according to the study.

Just 15 minutes of playing the game can have a moderate effect on people, which could play a major role on a larger scale when it comes to "building a societal resistance to fake news," according to Dr. van der Linden.

"Research suggests that fake news spreads faster and deeper than the truth, so combating disinformation after-the-fact can be like fighting a losing battle," he said.

"We are hoping to create what you might call a general 'vaccine' against fake news, rather than trying to counter each specific conspiracy or falsehood," Roozenbeek added.

Van der Linden and Roozenbeek's work is an early example of how people might be protected against deception by training them to be more attuned to the techniques used to distribute fake news.

"I hope that the positive results give further credence to the new science of prebunking rather than only thinking about traditional debunking. On a larger level, I also hope the game and results inspire a new kind of behavioral science research where we actively engage with people and apply insights from psychological science in the public interest," van der Linden told leapsmag.

"I like the idea that the end result of a scientific theory is a real-world partnership and practical tool that organizations and people can use to guard themselves against online manipulation techniques in a novel and hopefully fun and engaging manner."

Ready to be "inoculated" against fake news? Then play the game for yourself.

Michelle Gant
Michelle Gant is an editor at GOOD Worldwide. She was previously at Fox, StyleCaster, Domino, SELF, and CBS.

A religious family attends church.

(© Rawpixel.com/Adobe)


The Internet has made it easier than ever to misguide people. The anti-vaxx movement, climate change denial, protests against stem cell research, and other movements like these are rooted in the spread of misinformation and a distrust of science.

"I had been taught intelligent design and young-earth creationism instead of evolution, geology, and biology."

Science illiteracy is pervasive in the communities responsible for these movements. For the mainstream, the challenge lies not in sharing the facts, but in combating the spread of misinformation and facilitating an open dialogue between experts and nonexperts.

I grew up in a household that was deeply skeptical of science and medicine. My parents are evangelical Christians who believe the word of the Bible is law. To protect my four siblings and me from secular influence, they homeschooled some of us and put the others in private Christian schools. When my oldest brother left for a Christian college and the tuition began to add up, I was placed in a public charter school to offset the costs.

There, I became acutely aware of my ignorant upbringing. I had been taught intelligent design and young-earth creationism instead of evolution, geology, and biology. My mother skipped over world religions, and much of my history curriculum was more biblical than factual. She warned me that stem cell research, vaccines, genetic modification of crops, and other areas of research in biological science were examples of humans trying to be like God. At the time, biologist Richard Dawkins' The God Delusion was a bestseller and science seemed like an excuse not to believe in God, so she and my father discouraged me from studying it.

The gaps in my knowledge left me feeling frustrated and embarrassed. The solution was to learn about the things that had been censored from my education, but several obstacles stood in the way.

"When I first learned about fundamentalism, my parents' behavior finally made sense."

I lacked a good foundation in basic mathematics after being taught by my mother, who never finished college. My father, who holds a graduate degree in computer science, repeatedly told me that I inherited my mother's "bad math genes" and was therefore ill-equipped for science. While my brothers excelled at math under his supervision and were even encouraged toward careers in engineering and psychology, I was expected to do well in other subjects, such as literature. When I tried to change this by enrolling in honors math and science classes, they scolded me, so I reluctantly dropped math. By the time I graduated high school, I was convinced that math and science were beyond me.

When I look back at my high school transcripts, that sense of failure was unfounded: my grades were mostly A's and B's, and I excelled in honors biology. Even my elementary standardized test scores don't reflect a student disinclined toward STEM, because I consistently scored in the top percentile for sciences. Teachers often encouraged me to consider studying science in college. Why then, I wondered, did my parents reject that idea? Why did they work so hard to sway me from that path? It wasn't until I moved away from my parents' home and started working to put myself through community college that I discovered my passion for both biology and science writing.

As a young adult venturing into the field of science communication, I've become fascinated with understanding communities that foster antagonistic views toward science. When I first learned about fundamentalism, my parents' behavior finally made sense. Fundamentalism is the foundation of the Religious Right, a right-wing Christian movement that heavily influences the Republican Party in the United States. The Religious Right crusades against secular education, stem cell research, abortion, evolution, and other controversial issues in science and medicine on the basis that they contradict Christian beliefs. They are quietly overturning the separation of church and state in order to enforce their religion as policy, at the expense of science and progress.

Growing up in this community, I learned that strong feelings about these issues arise from both a lack of science literacy and a distrust of experts. Those who are against genetic modification of crops don't understand that GMO research aims to produce more, and longer-lasting, food for a growing planet. The anti-vaxx movement is still relying on a deeply flawed study that was ultimately retracted. Those who are against stem cell research don't understand how it works or the important benefits it provides the field of medicine, such as discovering new treatment methods.

In fact, at one point the famous Christian radio show Focus on the Family spread anti-vaxx mentality when they discussed vaccines that, long ago, were derived from aborted fetal cells. Although Focus on the Family now endorses vaccines, at the time it was enough to convince my own mother, who listened to the show every morning, not to vaccinate us unless the law required it.

"In everyday interactions with skeptics, science communicators need to shift their focus from convincing to discussing."

We can help clear up misunderstandings by sharing the facts, but the real challenge lies in willful ignorance. It was hard for me to accept, but I've come to understand that I'm not going to change anyone's mind. It's up to an individual to evaluate the facts, consider the arguments for and against, and make his or her own decision.

As my parents grew older and my siblings and I introduced them to basic concepts in science, they came around to trusting the experts a little more. They now see real doctors instead of homeopathic practitioners. They acknowledge our world's changing climate instead of denying it. And they even applaud two of their children for pursuing careers in science. Although they have held on to their fundamentalism and we still disagree on many issues, these basic changes give me hope that people in deeply skeptical communities are not entirely out of reach.

In everyday interactions with skeptics, science communicators need to shift their focus from convincing to discussing. This means creating an open dialogue with the intention of being understanding and helpful, not persuasive. This approach can be beneficial in both personal and online interactions. There are people within these movements who have doubts, and their doubts will grow as we continue to feed them through discussion.

People will only change their minds when it is the right time for them to do so. We need to be there ready to hold their hand and lead them toward truth when they reach out. Until then, all we can do is keep the channels of communication open, keep sharing the facts, and fight the spread of misinformation. Science is the pursuit of truth, and as scientists and science communicators, sometimes we need to let the truth speak for itself. We're just there to hold the megaphone.

Sarah Olson
Sarah Olson is an undergraduate student at Oregon State University where she is majoring in microbiology. Sarah works at a bookstore curating their science and math sections, and reviews popular science books on her blog readmorescience.com. She was a recipient of the National Association of Science Writer's prestigious undergraduate travel fellowship in 2018 and has written for a number of science media outlets. Sarah is a top writer in Feminism on Medium, where she regularly writes about the intersection of religion, women's issues, and science. She currently lives in Corvallis, Oregon with her fiance. You can connect with her at saraholson.net and on Twitter and Instagram at @ReadMoreScience.

A portrait of skepticism while reading information online.

(© fizkes/Fotolia)


Today's growing distrust of science is not an academic problem. It can be a matter of life and death.

Take, for example, the tragic incident in 2016 when at least 10 U.S. children died and over 400 were sickened after being given homeopathic teething medicine laced with a poisonous herb called "deadly nightshade." Carried by CVS, Walgreens, and other major American pharmacies, the pills contained this poison in keeping with the principle of homeopathy: treating medical conditions with tiny doses of natural substances that, at full strength, would produce the symptoms of the disease.

Such "alternative medicines" take advantage of the lack of government regulation and people's increasing hostility toward science.

These children did not have to die. Numerous research studies show that homeopathy does not work. Despite this research, homeopathy is a fast-growing, multi-billion-dollar business.

Such "alternative medicines" take advantage of the lack of government regulation and people's increasing hostility toward science. Polling shows that the number of people who believe that science has "made life more difficult" increased by 50 percent from 2009 to 2015. According to a 2017 survey, only 35 percent of respondents have "a lot" of trust in scientists; the number of people who do "not at all" trust scientists increased by over 50 percent from a similar poll conducted in December 2013.

Children dying from deadly nightshade is only one consequence of this crisis of trust. For another example, consider the false claim that vaccines cause autism. This belief has spread widely across the US, and led to a host of problems. For instance, measles was practically eliminated in the US by 2000. However, in recent years outbreaks of measles have been on the rise, driven by parents failing to vaccinate their children in a number of communities.

The Internet Is for… Misinformation

The rise of the Internet, and more recently social media, is key to explaining the declining public confidence in science.

Before the Internet, the information accessible to the general public about any given topic usually came from experts. For instance, researchers on autism were invited to talk on mainstream media, they wrote encyclopedia articles, and they authored books distributed by large publishers.

The Internet has enabled anyone to be a publisher of content, connecting people around the world with any and all sources of information. On the one hand, this freedom is empowering and liberating, with Wikipedia a great example of a highly curated and accurate source on the vast majority of subjects. On the other, anyone can publish a blog post making false claims about links between vaccines and autism or the effectiveness of homeopathic medicine. If they are skilled at search engine optimization, or have money to invest in advertising, they can spread their message widely. Russia has done so extensively to influence elections outside its borders, whether in the E.U. or the U.S.

Unfortunately, research shows that people lack the skills to differentiate misinformation from accurate information. This gap has clear real-world effects: U.S. adults believed 75 percent of fake news stories about the 2016 U.S. presidential election. And the more often someone sees a piece of misinformation, the more likely they are to believe it.

Blogs with falsehoods are bad enough, but the rise of social media has made the situation even worse. Most people re-share news stories without reading the actual article, judging the quality of the story by the headline and image alone. No wonder research has indicated that misinformation spreads as much as 10 times faster and further on social media than true information. After all, creators of fake news are free to devise the most appealing headline and image, while credible sources of information have to stick to factual headlines and images.
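
As a rough intuition pump, and not the model used in that research, here is a toy cascade simulation in Python. Every parameter is invented; it only shows how a modestly higher per-view re-share rate, such as a catchier headline might earn, compounds into far wider spread.

```python
import random

random.seed(42)  # make the illustration reproducible

def simulate_spread(reshare_prob, followers=20, generations=6):
    """Toy cascade: each sharer's followers see the story, and each
    viewer independently re-shares it with probability reshare_prob.
    Returns the total number of people reached."""
    sharers, reached = 1, 1
    for _ in range(generations):
        viewers = sharers * followers
        reached += viewers
        sharers = sum(1 for _ in range(viewers) if random.random() < reshare_prob)
        if sharers == 0:
            break
    return reached

# Invented rates: the catchier (fake) headline converts twice as many
# views into re-shares as the sober (real) one.
real_reach = simulate_spread(reshare_prob=0.04)
fake_reach = simulate_spread(reshare_prob=0.08)
print(f"real story reached ~{real_reach} people; fake story ~{fake_reach}")
```

The point is simply that when each view converts to a re-share slightly more often, the cascade flips from dying out to growing geometrically, which is one way a falsehood can travel many times farther than a correction.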

To make matters worse, we all suffer from a series of thinking errors such as the confirmation bias, our tendency to look for and interpret information in ways that conform to our intuitions and preferences, as opposed to the facts. These inherent thinking errors, combined with the Internet's amplifying power, have caused the prevalence of misinformation to explode.

So it's no wonder we see troubling gaps between what scientists and the public believe about issues like climate change, evolution, genetically modified organisms, and vaccination.

What Can We Do?

Fortunately, there are proactive steps we can take to address the crisis of trust in science and academia. The Pro-Truth Pledge, founded by a group of behavioral science experts (including myself) and concerned citizens, calls on public figures, organizations, and private citizens to commit to 12 behaviors, listed on the pledge website, that behavioral science research shows correlate with truthfulness.

Signers are held accountable through a crowdsourced reporting and evaluation mechanism while earning reputational rewards for their commitment. The scientific consensus serves as a key measure of credibility, and the pledge encourages pledge-takers to recognize the opinions of experts, especially scientists, as more likely to be true when the facts are disputed.

The pledge "really does seem to change one's habits," encouraging signers to have attitudes "of honesty and moral sincerity."

Launched in December 2016, the pledge has gained surprising traction. Over 6,200 private citizens have taken the pledge. So have more than 500 politicians, including members of US state legislatures Eric Nelson (PA), James White (TX), and Ogden Driskell (WY), and national politicians such as members of U.S. Congress Beto O'Rourke (TX), Matt Cartwright (PA), and Marcia Fudge (OH). Over 700 other public figures, such as globally known public intellectuals Peter Singer, Steven Pinker, Michael Shermer, and Jonathan Haidt, have taken the pledge, as have 70 organizations such as Media Bias/Fact Check, Fugitive Watch, Earth Organization for Sustainability, and One America Movement.

The pledge is effective in changing behavior. Michael Smith, a candidate for Congress, took the Pro-Truth Pledge. He later posted on his Facebook wall a screenshot of a tweet by Donald Trump criticizing minority and disabled children. After being told that the tweet was fake, he searched Trump's feed. He could not find the original tweet, and while Trump may have deleted it, the candidate edited his own Facebook post to say, "Due to a Truth Pledge I have taken, I have to say I have not been able to verify this post." He indicated that he would be more careful with future postings.

U.S. Army veteran and pledge-taker John Kirbow described how the pledge "really does seem to change one's habits," helping push him both to correct his own mistakes with an "attitude of humility and skepticism, and of honesty and moral sincerity," and also to encourage "friends and peers to do so as well."

His experience is confirmed by research on the pledge. Two studies at Ohio State University demonstrated, with strong statistical significance, the effectiveness of the pledge in making pledge-takers' behavior more truthful.

Taking the pledge yourself, and encouraging people you know and your elected representatives to do the same, is an easy and effective way to fight misinformation and to promote a culture that values the truth.

Gleb Tsipursky