It’s hard to know what information is true and verifiable, and what isn’t, especially when anyone can have an online platform and there are many loud and seemingly credible voices. So, why are we so susceptible to believing misinformation in the first place?
People’s thinking styles and emotional states can influence how they process news. This article explores common biases and mental shortcuts that make us all susceptible to believing things that aren’t necessarily true.
Mental shortcuts, or heuristics, are cognitive strategies that help us make quick decisions and judgments by simplifying complex information. They allow us to navigate daily life without our brains getting overwhelmed by analysing every piece of data we encounter. Whilst heuristics save time and mental energy, they can also lead to biases and errors, such as stereotyping, overconfidence, or misjudging risks. Recognising when heuristics help and when they mislead is key to better decision-making.
Below, we look at several of the biases and heuristics that shape our attitudes and decision-making, such as confirmation bias, the illusory truth effect, and the availability heuristic.
Confirmation bias
Confirmation bias is the tendency to seek out, interpret, and remember information in a way that reinforces existing beliefs, expectations, or desired conclusions. It operates in two key ways:
1. Selective information seeking. This is where people actively seek sources that align with their beliefs, such as opinion pieces or news articles that confirm their existing views.
2. Biased interpretation and recall. This is where people tend to remember, emphasise, or recount information that supports their perspective while dismissing or downplaying contradictory evidence.
Confirmation bias is especially strong when beliefs are deeply held, ideologically driven, or emotionally charged. It explains why two individuals with opposing viewpoints can examine the same evidence yet walk away feeling even more convinced they are right. You can observe this in political debates: supporters of each candidate often watch the same debate yet walk away believing their preferred candidate made the stronger argument, reinforcing their existing beliefs. Even when exposed to evidence that refutes their beliefs, some people do not change their stance and may even grow more confident in that belief (otherwise known as “doubling down”).
Other examples of selective information processing include following pages and accounts on social media that align with your beliefs. This reinforces ideas and perspectives that fit your worldview, whilst screening out those that challenge it. In this way, we may unintentionally create echo chambers online that reinforce our existing beliefs, further increasing bias and attitude polarisation (Brugnoli, Cinelli, Quattrociocchi, & Scala, 2019).
One study tracked which online news articles American participants chose to read. These articles varied in political leaning (conservative or liberal) and tone (positive or negative about U.S. policies; Knobloch-Westerwick, Mothes, & Polavin, 2020). The participants preferred articles that aligned with their political views, and avoided conflicting information.
The participants were also more drawn to negative news articles, supporting the idea that bad news gets more attention. Furthermore, the participants’ emotional state was linked to confirmation bias: when people felt negative, they were more likely to stick to information that supported their beliefs.
Additionally, the researchers found that participants who thought more deeply about information were more prone to confirmation bias, suggesting they sought out information that reinforced their views rather than information that challenged them. Thus, intelligence and a high need for cognition may not protect against confirmation bias.
Research consistently shows that people construe and respond to information in ideologically motivated ways. Chapman and Lickel (2016) exposed participants to an article about a famine caused by severe droughts. When the disaster was attributed to climate change, climate change deniers reported lower support for helping the victims than when climate change was not mentioned as a cause of the drought. Climate change deniers also showed more victim blaming, lower perceived aid effectiveness, and more negative attitudes towards donating to the relief effort. Hence, ideological biases can have secondary effects on how individuals construe information about societal risk.
The illusory truth effect
The illusory truth effect is the tendency to believe false information is true simply because we’ve heard it multiple times.
When people judge whether something is true, they often rely on two factors:
1. Does it align with what they already know?
2. Does it feel familiar?
Our brains like to conserve energy, so they prefer information that is easier to process. Repetition makes information easier to process, which can create a false sense of credibility. This is why hearing the same statement repeatedly, even when it is false, can make it seem more believable and valid (Moons, Mackie, & Garcia-Marques, 2009). Repeated exposure to a message as a way of building trust can be found in advertising, news and media, politics, religion, and ideology. Repetition tricks the brain into assuming something is true.
A 2009 study examined whether people rely more on familiarity (from repetition) or argument quality (logical reasoning) when forming their opinions (Moons, Mackie, & Garcia-Marques, 2009). When people didn’t think too deeply, repetition made both weak and strong arguments more persuasive. When people thought more carefully, argument quality mattered more: they relied on the argument’s content rather than its familiarity to form their opinions. For strong arguments, both familiarity and reasoning increased persuasiveness; for weak arguments, more considered thinking reduced the effect of repetition, meaning people realised the argument wasn’t strong despite hearing it multiple times.
This helps explain why misinformation spreads: when people don’t engage deeply, simply hearing something repeatedly makes them more likely to believe it.
The illusory truth effect is closely linked to hindsight bias, where people misremember how confident they were about something after learning the correct answer, often convincing themselves they “knew it all along”. For example, say you’re watching your favourite game show (e.g. Who Wants to Be a Millionaire?), and after the answer is revealed, you convince yourself that you were more confident about knowing the answer than you actually were. This happens because once we know the correct answer, it feels obvious, and we rewrite our memory of our past uncertainty to fit our new knowledge. In the same way, people tend to overestimate how obvious a political lie or conspiracy theory seemed to them after it’s debunked.
The availability heuristic
Another mental shortcut that distorts our perception of reality is the availability heuristic. This is where we judge how true or important something is based on how easily we can recall examples. It causes us to place weight on whatever we can easily bring to mind when making decisions. For example, even though shark attacks are extremely rare, seeing an attack on the news may make the risk feel higher because you can easily recall the story. When information is shocking, emotional, or widely shared, it is more available in our memory.
The availability heuristic can have significant systemic consequences, such as influencing policy decisions or shaping public opinion on important issues. For example, highly visible but unrepresentative public reactions may push politicians to focus on controversial issues rather than those with broader, long-term impact (e.g., Butler & Vis, 2023).
When we frequently see dramatic news about crime, we might believe that crime is rising, even when statistics show it is declining. The availability heuristic can also lead us to generalise traits across groups, perpetuating stereotypes and prejudices. For example, we might recall vivid or extreme incidents reported in the media when forming an inaccurate assessment of a group of people.
Other mental shortcuts include:
The anchoring bias occurs when our first impression of information influences how we process new facts. When misinformation is the first thing we hear about a topic, it can become the "anchor" we compare new information against. For example, if someone is falsely accused of being corrupt, later corrections might not fully change our perception.
The negativity bias occurs when we give more weight to negative information than to positive information. Because bad news grabs more attention, alarming or threatening misinformation is more likely to be noticed, remembered, and shared than neutral or reassuring corrections.
The false consensus effect occurs when we assume more people agree with us than actually do. If we see misinformation being widely shared, we assume it's popular and correct, even if most experts disagree.
Summary
We all take mental shortcuts to conserve mental energy and make quick decisions. However, these shortcuts can often lead us to make incorrect judgements. Mental shortcuts can lead to:
1. Seeking out, interpreting, and remembering information in ways that confirm what we already believe (confirmation bias).
2. Mistaking repetition and familiarity for truth (the illusory truth effect).
3. Judging how likely or important something is by how easily examples come to mind (the availability heuristic).
4. Over-weighting first impressions, negative information, and the apparent popularity of a claim (anchoring bias, negativity bias, and the false consensus effect).
References: