
Debunking Misinformation

An introduction to common reasons that people believe and spread misinformation, and strategies for counteracting misinformation

Debunking Misinformation: Challenges & Approaches

The spread of misinformation has always been a problem, but the Internet, social media, and other digital technologies have intensified the speed and ease with which misinformation spreads. The often reactive nature of our brains and our personal biases also play a role, especially given increased political polarization in the U.S. and beyond.

Unfortunately, "debunking" misinformation is also often more difficult than merely telling people that information is inaccurate. Once someone has been introduced to false information, they often continue to believe it despite the correction. This is especially true when the misinformation reinforces a person's pre-existing beliefs.

But there are ways to counter misinformation! This guide offers some key concepts and strategies for understanding and counteracting misinformation. Whether reading the news, seeing content on social media, or doing academic research, keeping these things in mind will help you evaluate information and your responses to it more critically.   


Misinformation on COVID-19: There is a lot of misinformation circulating about COVID-19 and the novel coronavirus. University of Toronto Libraries offers helpful tips on spotting such information. Many of those strategies are similar to the broader strategies presented in this guide.  

A Powerful Habit: Pause and Check Your Emotions

Do you have a strong reaction to the information you see (e.g., joy, pride, anger)? If so, slow down before you share or use that information.  

We tend to react quickly and with little thought to things that evoke strong feelings. By pausing, you give your brain time to process your initial response so that you can evaluate the information more critically.

For more guidance on evaluating online sources, you can also see the Evaluating Online Sources Guide.

Key Concepts

To think more critically about information, it helps to understand some key things about the relationship between our beliefs and how we engage with information:  


"Web of belief": a metaphor for our belief systems, which include core beliefs (at the web's center), intermediary beliefs, and peripheral beliefs.

Our beliefs are like parts of a spider web: each belief is connected to the larger web, and a change to one part of the web affects other parts. To keep our webs stable, we resist changes to them. This is especially true of core beliefs that, if weakened, would destabilize the entire web.

Willard Van Orman Quine first introduced this "web of belief" metaphor. Maureen Linker draws on the metaphor in Intellectual Empathy.

Image: "spider web" by watts_photos is licensed under CC BY 2.0

 


Confirmation bias: the tendency to believe information that aligns with one's preexisting views and to dismiss information that doesn't fit with them

Example: As a coffee lover, I'm more inclined to believe studies that tout coffee's health benefits than those that suggest it is unhealthy. When presented with research on the negative effects of high caffeine consumption, I either dismiss those studies outright or I scrutinize them more carefully than I would studies that indicate my coffee habit is good for me.


Backfire effect: the tendency, when presented with information that challenges one's beliefs, to strengthen those beliefs (rather than to weaken them) 

Example: I, the coffee lover, read a study about the potential health risks of consuming high levels of caffeine. Rather than considering whether I should moderate my coffee drinking, I become even more strongly inclined to believe that coffee is good for me.


Worldview backfire effect: a form of the backfire effect that is most strongly activated when one's deeply held beliefs, worldview, or sense of self is challenged. This is particularly common with "hot button" issues like gun laws and abortion, which often activate "core beliefs" that are highly resistant to change.

Example: A person who grew up in a family in which gun ownership was viewed as essential to daily life is generally more likely to dismiss arguments for gun control. A person who grew up in a family in which guns were not commonplace is more likely to advocate for stronger gun control.