
  • Clearer Thinking Team
  • Mar 30, 2023

A List of Common Cognitive Biases (With Examples)

Updated: Jun 13, 2023


Cognitive biases are patterns of thinking that distort or skew information processing, often leading to errors. These biases often occur when we make a quick decision using intuition or heuristics, which are simple rules or shortcuts that we use to make decisions and solve problems quickly without necessarily considering all available information.

While human intuition is extremely useful for many things, and should not simply be ignored, there are also plenty of known situations in which using our intuition or "going with our gut" systematically leads us to inaccurate conclusions and unhelpful behaviors.

In the early 1970s, cognitive psychologists Amos Tversky and Daniel Kahneman introduced the term 'cognitive bias' after studying the systematic errors people make when they rely on heuristics in judgment and problem-solving. Since then, cognitive psychology has demonstrated that cognitive biases occur systematically and universally and are involuntary: no one is totally immune to them.



List of the most common cognitive biases

Here, we list many of the most common cognitive biases. We strongly recommend reading the second part of this article, where we answer popular questions and clarify common misunderstandings about the topic.

Ambiguity Effect

The Ambiguity Effect is a cognitive bias whereby people who are faced with a decision tend to pick an option for which they know the probability of a good outcome, rather than an option for which the probability of a good outcome is unknown or ambiguous. This may occur even if the known probability is low and picking it isn't the best strategy.

Anchoring Bias

Anchoring Bias occurs when a person's expectation about one thing is affected by something mostly or entirely irrelevant they saw, heard, or thought before, such as an irrelevant number. In other words, it occurs when a person's beliefs or behaviors are influenced by a specific piece of information far more than they should be given how much evidence that information actually provides.

Attention Bias

Attention Bias occurs when some information or evidence holds a disproportionate amount of a person's attention because of that person's environment or history, or because of people's natural instincts.


Availability Bias

The Availability Bias occurs when someone's prediction about an event's frequency or probability is unduly influenced by how easily they can recall examples of that event. We have a whole mini-course about combating availability bias.

Bias Blind Spot

A Bias Blind Spot is a tendency to see oneself as being less biased or less susceptible to biases (such as those listed in this article) than others in the population.

: "In fact, viewing yourself as rational can backfire. The more objective you think you are, the more you trust your own intuitions and opinions as accurate representations of reality, and the less inclined you are to question them. 'I'm an objective person, so my views on gun control must be correct, unlike the views of all those irrational people who disagree with me,' we think." -

Choice-Supportive Bias

Choice-Supportive Bias is a cognitive bias whereby someone who has chosen between different options later remembers the option that they chose as having more positive attributes than it did at the time (while they remember options they did not choose as having more negative attributes than they'd had at the time).

Confirmation Bias

Confirmation Bias refers to a tendency for people to seek out, favor, or give more weight to information that confirms their preconceptions or hypotheses (even if the information isn't true) than information that contradicts their prior beliefs.

Denomination Effect

The Denomination Effect is a cognitive bias whereby people tend to be more likely to spend a given amount of money if it is composed of smaller individual sums than if it is composed of larger individual sums.

Hindsight Bias

Hindsight Bias refers to a tendency to perceive past events as being more predictable than they were before they took place.

Optimism Bias

Optimism Bias is the tendency to be unduly optimistic about the probability of future good and bad events, overestimating the probability of positive ones while underestimating the probability of negative ones.

Motivated Reasoning

Motivated reasoning occurs when you are disposed to interpret new evidence in ways that support your existing beliefs, or that lead to the outcome you wish was true, even when that evidence doesn't truly support your beliefs.

Frequently Asked Questions (FAQ) about cognitive biases

What are the types of bias?

There are three main types of bias.

1. Explicit biases are prejudiced beliefs regarding a group of people or ways of living. Racism, sexism, religious intolerance, and LGBTQ-phobias are examples of explicit biases. If you think that all people of group X are inferior, then you have an explicit bias against people of group X.

2. Implicit biases are unconscious beliefs that lead people to form opinions or judgments, often without being fully aware they hold the unconscious beliefs. If you subtly distrust people of group X without even realizing you're doing it, then you have an implicit bias against people of group X.

3. Cognitive biases differ from explicit and implicit biases: they are a group of systematic patterns in how our beliefs, judgments, and actions differ from what they would be if we were completely rational. If most people systematically misjudge a certain type of information in a way that leads them to false conclusions, then people have a cognitive bias related to that type of information.

How many cognitive biases are there?

There is no consensus among academics regarding how many cognitive biases exist. Some have found ~40, others find >100, and Wikipedia lists over 180.

What are the common causes of cognitive bias?

As we’ve seen above, cognitive biases often appear when one is faced with a decision and has limited resources (such as time, understanding, and cognitive capacity).

For instance, when buying a banana, you can't consider every single possible other use of that money to determine whether a banana is truly the single best use. You are limited in both how much time you have to think and how much total cognitive capacity you have.

Using fast heuristics or relying on our intuition is often an effective way of coming to conclusions in these situations because such approaches require fewer resources than careful thinking. While our intuition is often reliable, there are certain cases where our intuitions systematically produce inaccurate beliefs and unhelpful behaviors - these are what we refer to as "cognitive biases".

Even when we have plenty of time to think and aren't hitting a limit on our cognitive resources, people can still be prone to cognitive biases. For instance, there are certain automatic rules of thumb that our minds evolved to use since they worked quite well for the survival of our ancestors. Unfortunately, these rules of thumb can sometimes lead us to false conclusions and unhelpful behaviors in the modern world.

Is cognitive bias a good or bad thing?

Cognitive biases are not good or bad in themselves. They are an unavoidable effect of not having infinite intelligence and infinite time to think, and hence the need to rely on heuristics and intuition. We call a tendency a cognitive bias when it leads to systematic inaccuracies in our beliefs or unhelpful behaviors. In that sense, by definition, cognitive biases cause systematic problems.

However, cognitive biases do not always lead to negative outcomes in every instance. For instance, overconfidence may cause a person to try something very difficult that they ultimately succeed at. On the other hand, for every one person who succeeds due to overconfidence, there may be multiple other people who try something unrealistic due to overconfidence and end up failing.

How do you identify cognitive biases?

Knowing that cognitive biases exist is a great first step to identifying them in yourself, although that knowledge alone is often not sufficient. Once you accept that you, too, are susceptible, it can be helpful to get to know the most common cognitive biases (such as the ones presented above) so that you can look out for them in your own thinking.

Can you avoid cognitive bias?

Yes and no. It is possible to reduce the influence of cognitive biases on your thinking (and this can be very beneficial!). So you may be able to avoid a cognitive bias in many particular instances. But it's not possible to completely remove all of your cognitive biases.

How do you overcome cognitive biases?

Unfortunately, it’s impossible to overcome all of your cognitive biases completely. However, that doesn’t mean you can’t do anything. A good first step on the path to getting your cognitive biases under control is familiarizing yourself with them.

Here are a few of our interactive tools that might help:

The Planning Fallacy

The Sunk Cost Fallacy

Improve Your Frequency Predictions

Political Bias Test

Rhetorical Fallacies

Are You Overconfident?

Calibrate Your Judgement

How Rational Are You, Really?

Mental Traps

However, just knowing about your cognitive biases isn’t enough. You need to take action! Here are some practical steps we recommend:

Biases such as overconfidence, confirmation bias, and the illusion of control can be reduced or avoided by seeking out multiple points of view. Surrounding yourself with, and listening to, people with diverse experiences, belief systems, and expertise reduces the chances of falling into one of these biases. The same goes for sources of information: you are less likely to fall into a cognitive bias if you look for additional, and even conflicting, data sources.

Actively seeking evidence against your current point of view (on important decisions) can be a helpful way to combat biases like overconfidence, confirmation bias, and motivated reasoning.

Another strategy, recommended by researchers who studied cognitive biases in physicians, is to consciously revisit the options you dismissed at first, so you can reach a more considered answer.

What is a cognitive vs. an emotional bias?

Emotional biases can be considered a subcategory of cognitive biases. What separates them from other cognitive biases is that they are based on emotions such as anger, disgust, fear, happiness, sadness, and surprise. When we're experiencing an emotion, we may act in a biased way that is concordant with that emotion. For instance, anxiety may cause us to overestimate the chance of something being dangerous.

Emotional biases are linked to emotional dispositions (commonly known as ‘temperament’). Different emotional dispositions may even lead to different emotional reactions to the same events.

Emotional biases may help us explain optimism and pessimism biases.

How do cognitive biases affect critical thinking?

Cognitive biases interfere with impartiality, and they can negatively impact critical thinking in a myriad of different ways. Here are several:

Motivated reasoning leads us to underestimate the arguments for conclusions we don’t believe in and overestimate the arguments for conclusions we want to believe;

Availability bias messes with our critical thinking because it leads us to assess risk by how readily examples come to mind, rather than considering all of the relevant examples;

We are also prone to blind spot bias, meaning that we are less likely to identify biases in our own judgment than in other people's.

How do cognitive biases affect decision-making?

Cognitive biases affect decision-making in at least two ways: they help decision-making by speeding it up and cutting necessary corners when we have limited time or cognitive power, but they also hinder decision-making by causing us to come to false conclusions or take unhelpful actions in certain cases.

Is gender a factor for cognitive biases?

Research has shown some correlation between gender or sex and specific biases. For instance, researchers found that male investors tend to show greater overconfidence and optimism biases, while female investors tend to exhibit more anchoring and hindsight biases. The research makes no claims about what causes such gendered differences - e.g., socialization or biology or a mix of both.

Are gender stereotypes cognitive bias?

Gender stereotypes are explicit biases, which means they are not cognitive biases. However, there are many cognitive biases that involve gender stereotypes. For example, masculine bias is the tendency to assume a person is male after hearing gender-neutral information about them, and to use gender as a descriptor only when describing women.

Gender stereotypes are also a sign of binary thinking.

Do cognitive biases cause depression?

Research has shown some cognitive biases are correlated with depression. This has been found to be the case for negative interpretation bias (the tendency to interpret ambiguous scenarios as negative) and pessimistic biases, which lead people to predict future situations as unrealistically negative.

Cognitive behavioral therapy is based on the assumption that individuals with depression have distorted negative beliefs about themselves or the world (known in CBT as "cognitive distortions").

Are cognitive biases scientific (is their existence scientifically proven)?

Yes. They have been studied since the early 1970s by cognitive psychologists, sociologists, and behavioral economists.

Do scientists exhibit cognitive biases?

Just like every other human being, scientists can exhibit cognitive biases. They may exhibit overconfidence bias or fall prey to selection biases, for example. This has been researched as it relates to the replication crisis social psychology faces today.

There is even research on the presence of cognitive biases in scientific contexts and within academic publications. Nobody, not even scientists, is immune to cognitive biases!

Are cognitive biases learned? Or are we born with cognitive biases?

Both. We are born with a tendency toward some cognitive biases, but we can also learn specific aspects of these biases. Our brains evolved to be prone to all sorts of cognitive biases because those biases were helpful for the survival of our ancestors in the environment (and under the constraints) in which they lived.

But the details of some specific cognitive biases are learned as we move through the world. For example, humans have evolved a tendency to engage in motivated reasoning, but which conclusions motivate your reasoning is not something you are born with; it is shaped by your experiences and learning.

Keep learning by trying our mini-course on Mental Traps

Want to understand cognitive biases on a deeper level? Learn about a few of the mind's mistakes with our interactive introduction to cognitive biases!



A deep dive into critical thinking (part 2) – the bias battle


Biases are often missed. They can be conscious or unconscious, but either way they affect our views and judgements when it comes to critical thinking. So what types of bias are there and how can we ensure our learners are aware of their own biases in the classroom?

If you read part one of this blog post, you’ll remember Tim Van Gelder and his article, ‘Teaching Critical Thinking: Some Lessons From Cognitive Science’. He believes that we all have tendencies that “corrupt our thinking and contaminate our beliefs”, so we must be aware of them and compensate for their influence. These tendencies are called biases, and they impact our judgment of a particular person, opinion, or thing. They can unknowingly change our support for, or opposition to, a particular subject.

For instance, if one of your students is always late for class, you may assume that they are lazy and disorganised, even though you are not aware of the internal and external factors that may have contributed. What if this student lives in an area with scarce public transport? Yet when it is you who are late, you expect people to attribute it to external factors, like unexpected traffic or another unforeseen event. You fell prey to the ‘fundamental attribution error’: the tendency to attribute other people’s behavior to their character or disposition, while attributing your own similar behavior to external factors.

So the natural question is “how do I identify these biases and what should I do to overcome them?” 

Overcoming bias  

In order to identify biases, we must first be aware of their existence. Then we can learn about  each type of bias and conclude if we have that bias. Note that having biases does not necessarily imply that we’re bad people. We all have biases. They come from our upbringing, from our culture and society, from our education, and our life experiences, etc.  

As for the number of biases out there, you will find answers ranging from 3 to… 197!  

Just imagine explaining all of them to our students before we can even begin to talk about critical thinking. And yet, this might be one of the most important insights they can have in order to develop critical thinking. Biases can make us avoid information that does not align with our beliefs and make us see connections between ideas that do not exist. A bias can cause the perpetuation of misconceptions and misinformation.  

Being aware of our biases  

We can’t really think critically unless we are aware of our biases. But because there are so many of them, we have to make choices. Choose some of them and find opportunities in our lesson plans, group discussions, homework assignments, and other interactions with students to present these biases and talk about them.  

Below are suggestions on how to incorporate the study of biases into your practice. They use examples from the pages of our book series Global Changer. And if you are asking yourself “Which biases should I choose to teach?”, that’s a question only you can answer. Look for the most common types of biases or make decisions based on those you have noticed in your classes. I hope you can apply these suggestions to your own context.

Example 1: confirmation bias

Let’s start with the one which is probably the most famous: confirmation bias . This is the tendency to only look for or believe in information that supports our opinion or affirmation. 

This poster was taken from an activity (page 66, Starter book) and can be assigned as homework. This suggestion has no connection to the activity in the book whatsoever.


I would start in class by asking students to discuss why some people like dogs, others prefer cats, and some like both. After students have shared their thoughts, ask them to choose one side (team dog or team cat). They can then do some research at home to find evidence to support the opinion that their pet choice is better. For example, if they choose ‘team cat’, they have to find evidence that shows that cats are better than dogs.

Likewise, if they choose ‘team dog’, they have to find evidence that shows that dogs are better than cats. But there is a catch: if they are ‘team cat’, they have to search the web using the following terms: ‘Why are dogs better than cats?’. If they are ‘team dog’, they have to search using the terms: ‘Why are cats better than dogs?’. They have to go through the results and find evidence to support their sides of the argument.  

In the following class, ask them about the experience…

  • Why was it harder to find evidence to support your team when you used those search terms?
  • Were you able to find evidence?
  • Do you agree with those pieces of evidence? Why?
  • Did you find any evidence that supports the other team that you actually agree with? What was it?
  • Which search terms would be more impartial? Maybe try ‘the difference between dogs and cats’, or something like that.

Finally, teach them about confirmation bias. Ask them to think of other situations in which this bias may be present. 

In this exercise you’re taking an opportunity to introduce a type of bias. You’re also practicing critical thinking skills with your students. You’re asking them to listen to the reasons why people may have a different opinion, find something that they can agree with (even if it doesn’t support their views), notice how their biases influence how they get information about something, and how this bias impacts the discussion they were having. 

Example 2: attribution (or attributional) bias

The attribution (or attributional) bias affects the way we determine who or what was responsible for something. We draw conclusions based on a person’s character because we don’t have the full picture of a situation, or we don’t know what’s really happening. 


In this activity (page 123, activity 1, Starter book) there are two user reviews: one from a man from Tokyo and the other from a woman from France. Again, this suggestion is not mentioned in the teacher’s guidelines and should be considered an extra activity.

Organise students into groups. Tell them that some users on this website disagreed with Akira’s review, saying the apartment “was in fact pretty small”. It makes sense, as Akira mentions the bedrooms were small, the balcony was small, and there wasn’t a yard. How can that be big? Then ask the groups “why did Akira say that in the review?” and have them discuss his reasons. But before they start discussing, give each group a slip of paper with one piece of information. Each group will receive a different piece of information. The information could be something like: “Akira is young/ is a man/ is Japanese/ lied”.

When students are done discussing, invite volunteers to share their answers and reveal which piece of information they were given. Ask the groups if that information influenced their answers and, if so, how. Then ask questions to show students that they don’t really know Akira, they haven’t been to the apartment, and they didn’t have enough information to make a judgment. For example: “How many times has Akira used Airbnb? How many countries has he been to? What reviews has he written? How expensive was this apartment? Where was this apartment?”… and so on.

And then reveal the right answer…

Akira is from Tokyo, one of the most populated cities in the world and a place where the cost of living is very high, so most people live their entire lives in small but affordable apartments. Akira spent his whole life in a tiny apartment, so when he stayed in an apartment overseas, it seemed big compared to what he was used to, especially because it even had a balcony with many plants. Then explain attribution (or attributional) bias to students and ask them to think of other situations in which this type of bias may be present.

Teaching bias and critical thinking skills 

By following this step-by-step process, I believe we can talk about bias with our students and increase the chances of them incorporating critical thinking skills into their lives. 

1) Choose a bias. 

  • Search for a list of biases and read the basic definitions. 

2) Learn about it. 

  • After you choose the bias you want to teach your students, read about it. You must understand how this bias arises and how it presents itself.

3) Identify opportunities to use this bias (book, class activities, homework, etc.) 

  • When you understand a bias, you’re more likely to think of opportunities where it will present itself in your classes. It could be used as a warm-up activity or as a complement to an activity in your course book. You can add it to a discussion activity or assign it as homework, etc.

4) Create conditions so that this type of bias will be present without students knowing about it. 

  • This is very important. You have to create the conditions for biases to rear their ugly heads, so to speak, in a ‘natural’ way, even if you have to provide input or guide students with questions.

5) Reveal the influence this bias had on students. 

  • Once the activity is done, draw your students’ attention to the effect the bias had on them. 

6) Formally introduce the bias. 

  • Name the bias, explain the idea behind this bias, and provide some extra examples.  

7) Transfer that knowledge to other situations. 

  • To make sure your students got it, ask them to think of other scenarios or situations in which this bias may be present. They may even give testimonials about how such a bias has affected their judgment or behavior.

So, what can we take from this?

If you are a critical thinker, you will probably take everything I said with a grain of salt. Do your own research, think about what I said and showed you today, reflect on the benefits and drawbacks of using this approach, and reach your own conclusions. I hope this article gives you some insights so that you can find a way to incorporate the study of bias into your classes.


Read about Critical Thinking, download lesson plans and watch a short video on the topic in Cambridge researcher Jasmin Silver’s critical thinking blog post. Or, read part 1 of this blog post, ‘A deep dive into critical thinking – what is it and how is it taught?’, to hear more from Mauricio Shiroma.


Cognitive Bias: How We Are Wired to Misjudge

Charlotte Ruhl

Research Assistant & Psychology Graduate

BA (Hons) Psychology, Harvard University

Charlotte Ruhl, a psychology graduate from Harvard College, boasts over six years of research experience in clinical and social psychology. During her tenure at Harvard, she contributed to the Decision Science Lab, administering numerous studies in behavioral economics and social psychology.


Saul Mcleod, PhD

Editor-in-Chief for Simply Psychology

BSc (Hons) Psychology, MRes, PhD, University of Manchester

Saul Mcleod, PhD., is a qualified psychology teacher with over 18 years of experience in further and higher education. He has been published in peer-reviewed journals, including the Journal of Clinical Psychology.

Olivia Guy-Evans, MSc

Associate Editor for Simply Psychology

BSc (Hons) Psychology, MSc Psychology of Education

Olivia Guy-Evans is a writer and associate editor for Simply Psychology. She has previously worked in healthcare and educational sectors.


Have you ever been so busy talking on the phone that you don’t notice the light has turned green and it is your turn to cross the street?

Have you ever shouted, “I knew that was going to happen!” after your favorite baseball team gave up a huge lead in the ninth inning and lost?

Or have you ever found yourself only reading news stories that further support your opinion?

These are just a few of the many instances of cognitive bias that we experience every day of our lives. But before we dive into these different biases, let’s backtrack first and define what bias is.


What is Cognitive Bias?

Cognitive bias is a systematic error in thinking, affecting how we process information, perceive others, and make decisions. It can lead to irrational thoughts or judgments and is often based on our perceptions, memories, or individual and societal beliefs.

Biases are unconscious and automatic processes designed to make decision-making quicker and more efficient. Cognitive biases can be caused by many things, such as heuristics (mental shortcuts) , social pressures, and emotions.

Broadly speaking, bias is a tendency to lean in favor of or against a person, group, idea, or thing, usually in an unfair way. Biases are natural — they are a product of human nature — and they don’t simply exist in a vacuum or in our minds — they affect the way we make decisions and act.

In psychology, there are two main branches of biases: conscious and unconscious. Conscious or explicit bias is intentional — you are aware of your attitudes and the behaviors resulting from them (Lang, 2019).

Explicit bias can be good because it helps provide you with a sense of identity and can lead you to make good decisions (for example, being biased towards healthy foods).

However, these biases can often be dangerous when they take the form of conscious stereotyping.

On the other hand, unconscious bias , or cognitive bias, represents a set of unintentional biases — you are unaware of your attitudes and behaviors resulting from them (Lang, 2019).

Cognitive bias is often a result of your brain’s attempt to simplify information processing — we receive roughly 11 million bits of information per second, yet we can only consciously process about 40 bits per second (Orzan et al., 2012).

Therefore, we often rely on mental shortcuts (called heuristics) to help make sense of the world with relative speed. As such, these errors tend to arise from problems related to thinking: memory, attention, and other mental mistakes.

Cognitive biases can be beneficial because they do not require much mental effort and can allow you to make decisions relatively quickly, but like conscious biases, unconscious biases can also take the form of harmful prejudice that serves to hurt an individual or a group.

Although it may feel like there has been a recent rise of unconscious bias, especially in the context of police brutality and the Black Lives Matter movement, this is not a new phenomenon.

Thanks to Tversky and Kahneman (and several other psychologists who have paved the way), we now have an existing dictionary of our cognitive biases.

Again, these biases occur as an attempt to simplify the complex world and make information processing faster and easier. This section will dive into some of the most common forms of cognitive bias.


Confirmation Bias

Confirmation bias is the tendency to interpret new information as confirmation of your preexisting beliefs and opinions while giving disproportionately less consideration to alternative possibilities.

Real-World Examples

Since Wason’s 1960 experiment, real-world examples of confirmation bias have gained attention.

This bias often seeps into the research world when psychologists selectively interpret data or ignore unfavorable data to produce results that support their initial hypothesis.

Confirmation bias is also incredibly pervasive on the internet, particularly with social media. We tend to read online news articles that support our beliefs and fail to seek out sources that challenge them.

Various social media platforms, such as Facebook, help reinforce our confirmation bias by feeding us stories that we are likely to agree with – further pushing us down these echo chambers of political polarization.

Some examples of confirmation bias are especially harmful, specifically in the context of the law. For example, a detective may identify a suspect early in an investigation, seek out confirming evidence, and downplay falsifying evidence.

Experiments

The confirmation bias dates back to 1960 when Peter Wason challenged participants to identify a rule applying to triples of numbers.

People were first told that the sequence 2, 4, 6 fit the rule; they then had to generate triples of their own and were told whether each fit the rule. The rule was simple: any ascending sequence.

But not only did participants have an unusually difficult time realizing this and instead devised overly-complicated hypotheses, they also only generated triples that confirmed their preexisting hypothesis (Wason, 1960).
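
To make the logic of the task concrete, here is a minimal Python sketch (our own illustration, not Wason's materials; the over-specific hypothesis shown is just one typical guess) of why confirming triples cannot separate a participant's hypothesis from the actual rule:

```python
# The experimenter's rule is "any ascending sequence"; a typical
# participant hypothesis is the over-specific "each number goes up by 2".
def actual_rule(a, b, c):
    return a < b < c                        # strictly ascending

def participant_hypothesis(a, b, c):
    return b == a + 2 and c == b + 2        # e.g., 2, 4, 6 or 10, 12, 14

# Confirming tests: both rules say "yes", so they teach the tester nothing.
for triple in [(2, 4, 6), (10, 12, 14), (1, 3, 5)]:
    assert actual_rule(*triple) and participant_hypothesis(*triple)

# A disconfirming test is the only way to tell the two apart:
print(actual_rule(1, 2, 3))                 # True  -- fits the real rule
print(participant_hypothesis(1, 2, 3))      # False -- falsifies the guess
```

A tester who only proposes triples like (10, 12, 14) never sees the two rules disagree, which is confirmation bias in miniature.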

Explanations

But why does confirmation bias occur? It’s partially due to the effect of desire on our beliefs. In other words, certain desired conclusions (ones that support our beliefs) are more likely to be processed by the brain and labeled as true (Nickerson, 1998).

This motivational explanation is often coupled with a more cognitive theory.

The cognitive explanation argues that because our minds can only focus on one thing at a time, it is hard to parallel process (see information processing for more information) alternate hypotheses, so, as a result, we only process the information that aligns with our beliefs (Nickerson, 1998).

Another theory explains confirmation bias as a way of enhancing and protecting our self-esteem.

As with the self-serving bias (see more below), our minds choose to reinforce our preexisting ideas because being right helps preserve our sense of self-esteem, which is important for feeling secure in the world and maintaining positive relationships (Casad, 2019).

Although confirmation bias has obvious consequences, you can still work towards overcoming it by being open-minded and willing to look at situations from a different perspective than you might be used to (Luippold et al., 2015).

Even though this bias is unconscious, training your mind to become more flexible in its thought patterns will help mitigate the effects of this bias.

Hindsight Bias

Hindsight bias refers to the tendency to perceive past events as more predictable than they actually were (Roese & Vohs, 2012). There are cognitive and motivational explanations for why we ascribe so much certainty to knowing the outcome of an event only once the event is completed.

 Hindsight Bias Example

When sports fans know the outcome of a game, they often question certain decisions coaches make that they otherwise would not have questioned or second-guessed.

And fans are also quick to remark that they knew their team was going to win or lose, but, of course, they only make this statement after their team actually did win or lose.

Research has demonstrated that the hindsight bias isn’t necessarily mitigated by merely recognizing it (Pohl & Hell, 1996). Still, you can make a conscious effort to remind yourself that you can’t predict the future and motivate yourself to consider alternate explanations.

It’s important to do all we can to reduce this bias because when we are overly confident about our ability to predict outcomes, we might make future risky decisions that could have potentially dangerous outcomes.

Building on Tversky and Kahneman’s growing list of heuristics, researchers Baruch Fischhoff and Ruth Beyth (1975) were the first to directly investigate the hindsight bias in an empirical setting.

The team asked participants to judge the likelihood of several different outcomes of former U.S. president Richard Nixon’s visit to Beijing and Moscow.

After Nixon returned to the States, participants were asked to recall the likelihood of each outcome they had initially assigned.

Fischhoff and Beyth found that for events that actually occurred, participants greatly overestimated the initial likelihood they assigned to those events.

That same year, Fischhoff (1975) introduced a new method for testing the hindsight bias – one that researchers still use today.

Participants are given a short story with four possible outcomes, and they are told that one is true. When they are then asked to assign the likelihood of each specific outcome, they regularly assign a higher likelihood to whichever outcome they have been told is true, regardless of how likely it actually is.

But hindsight bias does not only exist in artificial settings. In 1993, Dorothee Dietrich and Matthew Olson asked college students to predict how the U.S. Senate would vote on the confirmation of Supreme Court nominee Clarence Thomas.

Before the vote, 58% of participants predicted that he would be confirmed, but after his actual confirmation, 78% of students said that they thought he would be approved – a prime example of the hindsight bias. And this form of bias extends beyond the research world.

From the cognitive perspective, hindsight bias may result from distortions of memories of what we knew or believed to know before an event occurred (Inman, 2016).

It is easier to recall information that is consistent with our current knowledge, so our memories become warped in a way that agrees with what actually did happen.

Motivational explanations of the hindsight bias point to the fact that we are motivated to live in a predictable world (Inman, 2016).

When surprising outcomes arise, our expectations are violated, and we may experience negative reactions as a result. Thus, we rely on the hindsight bias to avoid these adverse responses to certain unanticipated events and reassure ourselves that we actually did know what was going to happen.

Self-Serving Bias

Self-serving bias is the tendency to take personal responsibility for positive outcomes and blame external factors for negative outcomes.

You would be right to ask how this is similar to the fundamental attribution error (Ross, 1977), which identifies our tendency to overemphasize internal factors for other people’s behavior while attributing external factors to our own.

The distinction is that the self-serving bias is concerned with valence (that is, how good or bad an event or situation is), and it applies only to events in which you are the actor.

In other words, if a driver cuts in front of you as the light turns green, the fundamental attribution error might cause you to think that they are a bad person and not consider the possibility that they were late for work.

On the other hand, the self-serving bias is exercised when you are the actor. In this example, you would be the driver cutting in front of the other car, which you would tell yourself is because you are late (an external attribution to a negative event) as opposed to it being because you are a bad person.

From sports to the workplace, self-serving bias is incredibly common. For example, athletes are quick to take responsibility for personal wins, attributing their successes to their hard work and mental toughness, but point to external factors, such as unfair calls or bad weather, when they lose (Allen et al., 2020).

In the workplace, people attribute being hired for a job to internal factors but being fired to external factors (Furnham, 1982). And in the office itself, workplace conflicts are given external attributions, while successes, whether a persuasive presentation or a promotion, are awarded internal explanations (Walther & Bazarova, 2007).

Additionally, self-serving bias is more prevalent in individualistic cultures, which place emphasis on self-esteem levels and individual goals, and it is less prevalent among individuals with depression (Mezulis et al., 2004), who are more likely to take responsibility for negative outcomes.

Overcoming this bias can be difficult because it is at the expense of our self-esteem. Nevertheless, practicing self-compassion – treating yourself with kindness even when you fall short or fail – can help reduce the self-serving bias (Neff, 2003).

The leading explanation for the self-serving bias is that it is a way of protecting our self-esteem (similar to one of the explanations for the confirmation bias).

We are quick to take credit for positive outcomes and divert the blame for negative ones to boost and preserve our individual ego, which is necessary for confidence and healthy relationships with others (Heider, 1982).

Another theory argues that self-serving bias occurs when surprising events arise. When certain outcomes run counter to our expectations, we ascribe external factors, but when outcomes are in line with our expectations, we attribute internal factors (Miller & Ross, 1975).

An extension of this theory asserts that we are naturally optimistic, so negative outcomes come as a surprise and receive external attributions as a result.

Anchoring Bias

Anchoring bias is closely related to the decision-making process. It occurs when we rely too heavily on either pre-existing information or the first piece of information (the anchor) when making a decision.

For example, if you first see a T-shirt that costs $1,000 and then see a second one that costs $100, you’re more likely to see the second shirt as cheap than you would if the first shirt you saw had cost $120. Here, the price of the first shirt influences how you view the second.

 Anchoring Bias Example

Sarah is looking to buy a used car. The first dealership she visits has a used sedan listed for $19,000. Sarah takes this initial listing price as an anchor and uses it to evaluate prices at other dealerships.

When she sees another similar used sedan priced at $18,000, that price seems like a good bargain compared to the $19,000 anchor price she saw first, even though the actual market value is closer to $16,000.

When Sarah finds a comparable used sedan priced at $15,500, she continues perceiving that price as cheap compared to her anchored reference price.

Ultimately, Sarah purchases the $18,000 sedan, overlooking that all of the prices seemed like bargains only in relation to the initial high anchor price.

The key elements that demonstrate anchoring bias here are:

  • Sarah establishes an initial reference price based on the first listing she sees ($19k)
  • She uses that initial price as her comparison/anchor for evaluating subsequent prices
  • This biases her perception of the market value of the cars she looks at after the initial anchor is set
  • She makes a purchase decision aligned with her anchored expectations rather than a more objective market value
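
As a rough illustration of the arithmetic at work, here is a short Python sketch using the hypothetical prices from the example above; it contrasts each price against the anchor versus the more objective market value:

```python
anchor = 19_000          # the first listing Sarah sees becomes her reference
market_value = 16_000    # a more objective benchmark for comparable sedans

for price in [18_000, 15_500]:
    vs_anchor = price - anchor          # negative: "looks like a bargain"
    vs_market = price - market_value    # positive: actually above market
    print(f"${price:,}: {vs_anchor:+,} vs. anchor, {vs_market:+,} vs. market")

# $18,000 reads as "$1,000 off" against the anchor even though it is
# $2,000 above market value: the comparison point drives the judgment.
```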

Multiple theories seek to explain the existence of this bias.

One theory, known as anchoring and adjustment, argues that once an anchor is established, people insufficiently adjust away from it to arrive at their final answer, and so their final guess or decision is closer to the anchor than it otherwise would have been (Tversky & Kahneman, 1974).

And when people experience a greater cognitive load (the amount of information the working memory can hold at any given time; for example, a difficult decision as opposed to an easy one), they are more susceptible to the effects of anchoring.

Another theory, selective accessibility, holds that even though we assume the anchor is not a suitable answer (or a suitable price, going back to the initial example), when we evaluate the second stimulus (the second shirt) we look for ways in which it is similar to or different from the anchor, resulting in the anchoring effect (Mussweiler & Strack, 1999).

A final theory posits that providing an anchor changes someone’s attitudes to be more favorable to the anchor, which then biases future answers to have similar characteristics as the initial anchor.

Although there are many different theories for why we experience anchoring bias, they all agree that it affects our decisions in real ways (Wegner et al., 2001).

The first study that brought this bias to light was during one of Tversky and Kahneman’s (1974) initial experiments. They asked participants to compute the product of numbers 1-8 in five seconds, either as 1x2x3… or 8x7x6…

Participants did not have enough time to calculate the answer, so they had to estimate based on their first few calculations.

They found that those who computed the small multiplications first (i.e., 1x2x3…) gave a median estimate of 512, but those who computed the larger multiplications first gave a median estimate of 2,250 (although the actual answer is 40,320).

This demonstrates how the initial few calculations influenced the participant’s final answer.
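
The arithmetic behind this demonstration is easy to verify. Here is a minimal sketch (our own check, not part of the original study) computing the true product and the partial products a participant might reach in the first few seconds:

```python
import math

ascending = [1, 2, 3, 4, 5, 6, 7, 8]
descending = ascending[::-1]

print(math.prod(ascending))        # 40320 -- the true product either way

# Partial products after three steps, roughly what five seconds allows;
# these act as the anchors the final estimates are adjusted from:
print(math.prod(ascending[:3]))    # 1*2*3 = 6   -> median estimate 512
print(math.prod(descending[:3]))   # 8*7*6 = 336 -> median estimate 2,250

# Both medians fall far short of 40,320: adjustment away from the anchor
# is insufficient, and the lower anchor produces the lower estimate.
```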

Availability Bias

Availability bias (also commonly referred to as the availability heuristic) refers to the tendency to think that examples of things that readily come to mind are more common than what is actually the case.

In other words, information that comes to mind faster influences the decisions we make about the future. And just like with the hindsight bias, this bias is related to an error of memory.

But instead of being a memory fabrication, it is an overemphasis on a certain memory.

In the workplace, if someone is being considered for a promotion but their boss recalls one bad thing that happened years ago but left a lasting impression, that one event might have an outsized influence on the final decision.

Another common example is buying lottery tickets because the lifestyle and benefits of winning are more readily available in mind (and the potential emotions associated with winning or seeing other people win) than the complex probability calculation of actually winning the lottery (Cherry, 2019).

A final common example that is used to demonstrate the availability heuristic describes how seeing several television shows or news reports about shark attacks (or anything that is sensationalized by the news, such as serial killers or plane crashes) might make you think that this incident is relatively common even though it is not at all.

Regardless, this thinking might make you less inclined to go in the water the next time you go to the beach (Cherry, 2019).

As with most cognitive biases, the best way to overcome them is by recognizing the bias and being more cognizant of your thoughts and decisions.

And because we fall victim to this bias when our brain relies on quick mental shortcuts in order to save time, slowing down our thinking and decision-making process is a crucial step to mitigating the effects of the availability heuristic.

Researchers think this bias occurs because the brain is constantly trying to minimize the effort necessary to make decisions, and so we rely on certain memories – ones that we can recall more easily – instead of having to endure the complicated task of calculating statistical probabilities.

Two main types of memories are easier to recall: 1) those that more closely align with the way we see the world and 2) those that evoke more emotion and leave a more lasting impression.

This first type of memory was identified in 1973, when Tversky and Kahneman, our cognitive bias pioneers, conducted a study in which they asked participants if more words begin with the letter K or if more words have K as their third letter.

Although many more words have K as their third letter, 70% of participants said that more words begin with K: recalling words by their first letter is not only easier, it also aligns more closely with the way they see the world (knowing the first letter of a word is far more common than knowing its third).
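
If you want to check the claim yourself, here is a small Python sketch that counts both cases against a word list. The dictionary path is an assumption: many Unix systems ship one at /usr/share/dict/words, but any plain-text word list will do:

```python
def k_counts(path="/usr/share/dict/words"):
    """Count words starting with 'k' vs. words with 'k' as the third letter."""
    first = third = 0
    with open(path) as f:
        for line in f:
            word = line.strip().lower()
            if word.startswith("k"):
                first += 1
            if len(word) >= 3 and word[2] == "k":
                third += 1
    return first, third

first, third = k_counts()
print(f"K first: {first}, K third: {third}")
# On typical English word lists, K as the third letter is the more common
# case, even though words beginning with K are much easier to bring to mind.
```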

In terms of the second type of memory, the same duo ran an experiment in 1983, ten years later, in which half the participants were asked to guess the likelihood that a massive flood would occur somewhere in North America, and the other half had to guess the likelihood of a flood occurring due to an earthquake in California.

Although the latter is much less likely, participants still said that this would be much more common because they could recall specific, emotionally charged events of earthquakes hitting California, largely due to the news coverage they receive.

Together, these studies highlight how memories that are easier to recall greatly influence our judgments and perceptions about future events.

Inattentional Blindness

A final popular form of cognitive bias is inattentional blindness. This occurs when a person fails to notice a stimulus that is in plain sight because their attention is directed elsewhere.

For example, while driving a car, you might be so focused on the road ahead of you that you completely fail to notice a car swerve into your lane of traffic.

Because your attention is directed elsewhere, you aren’t able to react in time, potentially leading to a car accident. Experiencing inattentional blindness has its obvious consequences (as illustrated by this example), but, like all biases, it is not impossible to overcome.

Many theories seek to explain why we experience this form of cognitive bias. In reality, it is probably some combination of these explanations.

The conspicuity account holds that certain sensory stimuli (such as bright colors) and cognitive stimuli (such as something familiar) are more likely to be processed, so stimuli that don’t fit into one of these two categories might be missed.

The mental workload theory describes how when we focus a lot of our brain’s mental energy on one stimulus, we are using up our cognitive resources and won’t be able to process another stimulus simultaneously.

Similarly, some psychologists explain how we attend to different stimuli with varying levels of attentional capacity, which might affect our ability to process multiple stimuli simultaneously.

In other words, an experienced driver might be able to see that car swerve into the lane because they are using fewer mental resources to drive, whereas a beginner driver might be using more resources to focus on the road ahead and unable to process that car swerving in.

A final explanation argues that because our attentional and processing resources are limited, our brain dedicates them to what fits into our schemas or our cognitive representations of the world (Cherry, 2020).

Thus, when an unexpected stimulus comes into our line of sight, we might not be able to process it on the conscious level. The following example illustrates how this might happen.

The most famous study demonstrating the inattentional blindness phenomenon is the invisible gorilla study (Simons & Chabris, 1999; see also Most et al., 2001). The experiment asked participants to watch a video of two groups passing a basketball and count how many times the white team passed the ball.

Participants are able to accurately report the number of passes, but what they fail to notice is a gorilla walking directly through the middle of the circle.

Because this would not be expected, and because our brain is using up its resources to count the number of passes, we completely fail to process something right before our eyes.

A real-world example of inattentional blindness occurred in 1995 when Boston police officer Kenny Conley was chasing a suspect and ran by a group of officers who were mistakenly holding down an undercover cop.

Conley was convicted of perjury and obstruction of justice because he supposedly saw the fight between the undercover cop and the other officers and lied about it to protect the officers, but he stood by his word that he really hadn’t seen it (due to inattentional blindness) and was ultimately exonerated (Pickel, 2015).

The key to overcoming inattentional blindness is to maximize your attention by avoiding distractions such as checking your phone. And it is also important to pay attention to what other people might not notice (if you are that driver, don’t always assume that others can see you).

By working on expanding your attention and minimizing unnecessary distractions that will use up your mental resources, you can work towards overcoming this bias.

Preventing Cognitive Bias

As we know, recognizing these biases is the first step to overcoming them. But there are other small strategies we can follow in order to train our unconscious mind to think in different ways.

From strengthening our memory and minimizing distractions to slowing down our decision-making and improving our reasoning skills, we can work towards overcoming these cognitive biases.

An individual can evaluate his or her own thought process, also known as metacognition (“thinking about thinking”), which provides an opportunity to combat bias (Flavell, 1979).

This multifactorial process involves (Croskerry, 2003):

(a) acknowledging the limitations of memory, (b) seeking perspective while making decisions, (c) being able to self-critique, (d) choosing strategies to prevent cognitive error.

Many strategies used to avoid bias that we describe are also known as cognitive forcing strategies, which are mental tools used to force unbiased decision-making.

The History of Cognitive Bias

The term cognitive bias was first coined in the 1970s by Israeli psychologists Amos Tversky and Daniel Kahneman, who used this phrase to describe people’s flawed thinking patterns in response to judgment and decision problems (Tversky & Kahneman, 1974).

Tversky and Kahneman’s research program, the heuristics and biases program, investigated how people make decisions given limited resources (for example, limited time to decide which food to eat or limited information to decide which house to buy).

As a result of these limited resources, people are forced to rely on heuristics or quick mental shortcuts to help make their decisions.

Tversky and Kahneman wanted to understand the biases associated with this judgment and decision-making process.

To do so, the two researchers relied on a research paradigm that presented participants with some type of reasoning problem with a computed normative answer (they used probability theory and statistics to compute the expected answer).

Participants’ responses were then compared with the predetermined solution to reveal the systematic deviations in the mind.

After running several experiments with countless reasoning problems, the researchers were able to identify numerous norm violations that result when our minds rely on these cognitive biases to make decisions and judgments (Wilke & Mata, 2012).

Key Takeaways

  • Cognitive biases are unconscious errors in thinking that arise from problems related to memory, attention, and other mental mistakes.
  • These biases result from our brain’s efforts to simplify the incredibly complex world in which we live.
  • Confirmation bias, hindsight bias, the mere exposure effect, self-serving bias, the base rate fallacy, anchoring bias, availability bias, the framing effect, inattentional blindness, and the ecological fallacy are some of the most common examples of cognitive bias. Another example is the false consensus effect.
  • Cognitive biases directly affect our safety, interactions with others, and how we make judgments and decisions in our daily lives.
  • Although these biases are unconscious, there are small steps we can take to train our minds to adopt a new pattern of thinking and mitigate the effects of these biases.

References

Allen, M. S., Robson, D. A., Martin, L. J., & Laborde, S. (2020). Systematic review and meta-analysis of self-serving attribution biases in the competitive context of organized sport. Personality and Social Psychology Bulletin, 46(7), 1027-1043.

Casad, B. (2019). Confirmation bias. Retrieved from https://www.britannica.com/science/confirmation-bias

Cherry, K. (2019). How the availability heuristic affects your decision-making. Retrieved from https://www.verywellmind.com/availability-heuristic-2794824

Cherry, K. (2020). Inattentional blindness can cause you to miss things in front of you. Retrieved from https://www.verywellmind.com/what-is-inattentional-blindness-2795020

Dietrich, D., & Olson, M. (1993). A demonstration of hindsight bias using the Thomas confirmation vote. Psychological Reports, 72(2), 377-378.

Fischhoff, B. (1975). Hindsight is not equal to foresight: The effect of outcome knowledge on judgment under uncertainty. Journal of Experimental Psychology: Human Perception and Performance, 1(3), 288.

Fischhoff, B., & Beyth, R. (1975). I knew it would happen: Remembered probabilities of once-future things. Organizational Behavior and Human Performance, 13(1), 1-16.

Furnham, A. (1982). Explanations for unemployment in Britain. European Journal of Social Psychology, 12(4), 335-352.

Heider, F. (1982). The psychology of interpersonal relations. Psychology Press.

Inman, M. (2016). Hindsight bias. Retrieved from https://www.britannica.com/topic/hindsight-bias

Lang, R. (2019). What is the difference between conscious and unconscious bias?: FAQs. Retrieved from https://engageinlearning.com/faq/compliance/unconscious-bias/what-is-the-difference-between-conscious-and-unconscious-bias/

Luippold, B., Perreault, S., & Wainberg, J. (2015). Auditor's pitfall: Five ways to overcome confirmation bias. Retrieved from https://www.babson.edu/academics/executive-education/babson-insight/finance-and-accounting/auditors-pitfall-five-ways-to-overcome-confirmation-bias/

Mezulis, A. H., Abramson, L. Y., Hyde, J. S., & Hankin, B. L. (2004). Is there a universal positivity bias in attributions? A meta-analytic review of individual, developmental, and cultural differences in the self-serving attributional bias. Psychological Bulletin, 130(5), 711.

Miller, D. T., & Ross, M. (1975). Self-serving biases in the attribution of causality: Fact or fiction? Psychological Bulletin, 82(2), 213.

Most, S. B., Simons, D. J., Scholl, B. J., Jimenez, R., Clifford, E., & Chabris, C. F. (2001). How not to be seen: The contribution of similarity and selective ignoring to sustained inattentional blindness. Psychological Science, 12(1), 9-17.

Mussweiler, T., & Strack, F. (1999). Hypothesis-consistent testing and semantic priming in the anchoring paradigm: A selective accessibility model. Journal of Experimental Social Psychology, 35(2), 136-164.

Neff, K. (2003). Self-compassion: An alternative conceptualization of a healthy attitude toward oneself. Self and Identity, 2(2), 85-101.

Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175-220.

Orzan, G., Zara, I. A., & Purcarea, V. L. (2012). Neuromarketing techniques in pharmaceutical drugs advertising. A discussion and agenda for future research. Journal of Medicine and Life, 5(4), 428.

Pickel, K. L. (2015). Eyewitness memory. The Handbook of Attention, 485-502.

Pohl, R. F., & Hell, W. (1996). No reduction in hindsight bias after complete information and repeated testing. Organizational Behavior and Human Decision Processes, 67(1), 49-58.

Roese, N. J., & Vohs, K. D. (2012). Hindsight bias. Perspectives on Psychological Science, 7(5), 411-426.

Ross, L. (1977). The intuitive psychologist and his shortcomings: Distortions in the attribution process. In Advances in Experimental Social Psychology (Vol. 10, pp. 173-220). Academic Press.

Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207-232.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124-1131.

Tversky, A., & Kahneman, D. (1983). Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment. Psychological Review, 90(4), 293.

Tversky, A., & Kahneman, D. (1992). Advances in prospect theory: Cumulative representation of uncertainty. Journal of Risk and Uncertainty, 5(4), 297-323.

Walther, J. B., & Bazarova, N. N. (2007). Misattribution in virtual groups: The effects of member distribution on self-serving bias and partner blame. Human Communication Research, 33(1), 1-26.

Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12(3), 129-140.

Wegener, D. T., Petty, R. E., Detweiler-Bedell, B. T., & Jarvis, W. B. G. (2001). Implications of attitude change theories for numerical anchoring: Anchor plausibility and the limits of anchor effectiveness. Journal of Experimental Social Psychology, 37(1), 62-69.

Wilke, A., & Mata, R. (2012). Cognitive bias. In Encyclopedia of Human Behavior (pp. 531-535). Academic Press.

Further Information

Test yourself for bias.

  • Project Implicit (IAT Test), from Harvard University
  • Implicit Association Test, from the Social Psychology Network
  • Test Yourself for Hidden Bias, from Teaching Tolerance
  • How the Concept of Implicit Bias Came Into Being, with Dr. Mahzarin Banaji, Harvard University, author of Blindspot: Hidden Biases of Good People (5:28; includes transcript)
  • Understanding Your Racial Biases, with John Dovidio, PhD, Yale University, from the American Psychological Association (11:09; includes transcript)
  • Talking Implicit Bias in Policing, with Jack Glaser, Goldman School of Public Policy, University of California Berkeley (21:59)
  • Implicit Bias: A Factor in Health Communication, with Dr. Winston Wong, Kaiser Permanente (19:58)
  • Bias, Black Lives and Academic Medicine, Dr. David Ansell on Your Health Radio, August 1, 2015 (21:42)
  • Uncovering Hidden Biases, Google talk with Dr. Mahzarin Banaji, Harvard University
  • Impact of Implicit Bias on the Justice System (9:14)
  • Students Speak Up: What Bias Means to Them (2:17)
  • Weight Bias in Health Care, from Yale University (16:56)
  • Gender and Racial Bias in Facial Recognition Technology (4:43)

Journal Articles

  • Mitchell, G. (2018). An implicit bias primer. Virginia Journal of Social Policy & the Law, 25, 27-59.
  • Nosek, B. A., Greenwald, A. G., & Banaji, M. R. (2007). The Implicit Association Test at age 7: A methodological and conceptual review. Automatic Processes in Social Thinking and Behavior, 4, 265-292.
  • Hall, W. J., Chapman, M. V., Lee, K. M., Merino, Y. M., Thomas, T. W., Payne, B. K., ... & Coyne-Beasley, T. (2015). Implicit racial/ethnic bias among health care professionals and its influence on health care outcomes: A systematic review. American Journal of Public Health, 105(12), e60-e76.
  • Burgess, D., Van Ryn, M., Dovidio, J., & Saha, S. (2007). Reducing racial bias among health care providers: Lessons from social-cognitive psychology. Journal of General Internal Medicine, 22(6), 882-887.
  • Boysen, G. A. (2010). Integrating implicit bias into counselor education. Counselor Education & Supervision, 49(4), 210-227.
  • Christian, S. (2013). Cognitive biases and errors as cause—and journalistic best practices as effect. Journal of Mass Media Ethics, 28(3), 160-174.
  • Whitford, D. K., & Emerson, A. M. (2019). Empathy intervention to reduce implicit bias in pre-service teachers. Psychological Reports, 122(2), 670-688.



How to Identify Cognitive Bias: 12 Examples of Cognitive Bias

Written by MasterClass

Last updated: Jun 7, 2021 • 6 min read

Cognitive biases are inherent in the way we think, and many of them are unconscious. Identifying the biases you experience and perpetuate in your everyday interactions is the first step to understanding how our mental processes work, which can help us make better, more informed decisions.


Critical thinking

We’ve already established that information can be biased. Now it’s time to look at our own bias.

Studies have shown that we are more likely to accept information when it fits into our existing worldview, a phenomenon known as confirmation or myside bias (for examples see Kappes et al., 2020 ; McCrudden & Barnes, 2016 ; Pilditch & Custers, 2018 ). Wittebols (2019) defines it as a “tendency to be psychologically invested in the familiar and what we believe and less receptive to information that contradicts what we believe” (p. 211). Quite simply, we may reject information that doesn’t support our existing thinking.

This can manifest in a number of ways with Hahn and Harris (2014) suggesting four main behaviours:

  • Searching only for information that supports our held beliefs
  • Failing to critically evaluate information that supports our held beliefs - accepting it at face value - while explaining away or being overly critical of information that might contradict them
  • Becoming set in our thinking, once an opinion has been formed, and deliberately ignoring any new information on the topic
  • A tendency to be overconfident with the validity of our held beliefs.

Peters (2020) also suggests that we’re more likely to remember information that supports our way of thinking, further cementing our bias. Taken together, the research suggests that bias has a huge impact on the way we think. To learn more about how and why bias can impact our everyday thinking, watch this short video.

Filter bubbles and echo chambers

The theory of filter bubbles emerged in 2011, proposed by the Internet activist Eli Pariser. He defined a filter bubble as "your own personal unique world of information that you live in online" (Pariser, 2011, 4:21). At the time, Pariser focused on the impact of the algorithms used by social media platforms and search engines, which prioritised content and personalised results based on an individual's past online activity, suggesting "the Internet is showing us what it thinks we want to see, but not necessarily what we should see" (Pariser, 2011, 3:47; watch his TED talk if you'd like to know more).

Our understanding of filter bubbles has since expanded to recognise that individuals also select and create their own filter bubbles. This happens when you seek out likeminded individuals or sources, or follow your friends or people you admire on social media: people with whom you're likely to share common beliefs, points of view, and interests. Barack Obama (2017) addressed the concept of filter bubbles in his presidential farewell address:

For too many of us it’s become safer to retreat into our own bubbles, whether in our neighbourhoods, or on college campuses, or places of worship, or especially our social media feeds, surrounded by people who look like us and share the same political outlook and never challenge our assumptions… Increasingly we become so secure in our bubbles that we start accepting only information, whether it’s true or not, that fits our opinions, instead of basing our opinions on the evidence that is out there. ( Obama, 2017, 22:57 ).

Filter bubbles are not unique to the social media age. Previously, the term echo chamber was used to describe the same phenomenon in the news media, where different channels exist catering to different points of view. Within an echo chamber, people are able to seek out information that supports their existing beliefs without encountering information that might challenge, contradict, or oppose them.

Other forms of bias

There are many different ways in which bias can affect the way you think and how you process new information. Try the quiz below to discover some additional forms of bias, or check out Buzzfeed’s 2017 article on cognitive bias.


2.2: Overcoming Cognitive Biases and Engaging in Critical Reflection

Nathan Smith et al.


Learning Objectives

By the end of this section, you will be able to:

  • Label the conditions that make critical thinking possible.
  • Classify and describe cognitive biases.
  • Apply critical reflection strategies to resist cognitive biases.

To resist the potential pitfalls of cognitive biases, we have taken some time to recognize why we fall prey to them. Now we need to understand how to resist easy, automatic, and error-prone thinking in favor of more reflective, critical thinking.

Critical Reflection and Metacognition

To promote good critical thinking, put yourself in a frame of mind that allows critical reflection. Recall from the previous section that rational thinking requires effort and takes longer. However, it will likely result in more accurate thinking and decision-making. As a result, reflective thought can be a valuable tool in correcting cognitive biases. The critical aspect of critical reflection involves a willingness to be skeptical of your own beliefs, your gut reactions, and your intuitions. Additionally, the critical aspect engages in a more analytic approach to the problem or situation you are considering. You should assess the facts, consider the evidence, try to employ logic, and resist the quick, immediate, and likely conclusion you want to draw. By reflecting critically on your own thinking, you can become aware of the natural tendency for your mind to slide into mental shortcuts.

This process of critical reflection is often called metacognition in the literature of pedagogy and psychology. Metacognition means thinking about thinking and involves the kind of self-awareness that engages higher-order thinking skills. Cognition, or the way we typically engage with the world around us, is first-order thinking, while metacognition is higher-order thinking. From a metacognitive frame, we can critically assess our thought process, become skeptical of our gut reactions and intuitions, and reconsider our cognitive tendencies and biases.

To improve metacognition and critical reflection, we need to encourage the kind of self-aware, conscious, and effortful attention that may feel unnatural and may be tiring. Typical activities associated with metacognition include checking, planning, selecting, inferring, self-interrogating, interpreting an ongoing experience, and making judgments about what one does and does not know (Hacker, Dunlosky, and Graesser 1998). By practicing metacognitive behaviors, you are preparing yourself to engage in the kind of rational, abstract thought that will be required for philosophy.

Good study habits, including managing your workspace, giving yourself plenty of time, and working through a checklist, can promote metacognition. When you feel stressed out or pressed for time, you are more likely to make quick decisions that lead to error. Stress and lack of time also discourage critical reflection because they rob your brain of the resources necessary to engage in rational, attention-filled thought. By contrast, when you relax and give yourself time to think through problems, you will be clearer, more thoughtful, and less likely to rush to the first conclusion that leaps to mind. Similarly, background noise, distracting activity, and interruptions will prevent you from paying attention. You can use this checklist to try to encourage metacognition when you study:

  • Check your work.
  • Plan ahead.
  • Select the most useful material.
  • Infer from your past grades to focus on what you need to study.
  • Ask yourself how well you understand the concepts.
  • Check your weaknesses.
  • Assess whether you are following the arguments and claims you are working on.

Cognitive Biases

In this section, we will examine some of the most common cognitive biases so that you can be aware of traps in thought that can lead you astray. Cognitive biases are closely related to informal fallacies. Both fallacies and biases provide examples of the ways we make errors in reasoning.

CONNECTIONS

See the chapter on logic and reasoning for an in-depth exploration of informal fallacies.

Watch the video to orient yourself before reading the text that follows.

Cognitive Biases 101, with Peter Bauman


Confirmation Bias

One of the most common cognitive biases is confirmation bias , which is the tendency to search for, interpret, favor, and recall information that confirms or supports your prior beliefs. Like all cognitive biases, confirmation bias serves an important function. For instance, one of the most reliable forms of confirmation bias is the belief in our shared reality. Suppose it is raining. When you first hear the patter of raindrops on your roof or window, you may think it is raining. You then look for additional signs to confirm your conclusion, and when you look out the window, you see rain falling and puddles of water accumulating. Most likely, you will not be looking for irrelevant or contradictory information. You will be looking for information that confirms your belief that it is raining. Thus, you can see how confirmation bias—based on the idea that the world does not change dramatically over time—is an important tool for navigating in our environment.

Unfortunately, as with most heuristics, we tend to apply this sort of thinking inappropriately. One example that has recently received a lot of attention is the way in which confirmation bias has increased political polarization. When searching for information on the internet about an event or topic, most people look for information that confirms their prior beliefs rather than what undercuts them. The pervasive presence of social media in our lives is exacerbating the effects of confirmation bias since the computer algorithms used by social media platforms steer people toward content that reinforces their current beliefs and predispositions. These multimedia tools are especially problematic when our beliefs are incorrect (for example, they contradict scientific knowledge) or antisocial (for example, they support violent or illegal behavior). Thus, social media and the internet have created a situation in which confirmation bias can be “turbocharged” in ways that are destructive for society.

Confirmation bias is a result of the brain's limited ability to process information. Peter Wason (1960) conducted early experiments identifying this kind of bias. He asked subjects to identify the rule that applies to a sequence of numbers—for instance, 2, 4, 6. Subjects were told to generate examples to test their hypothesis. What he found is that once a subject settled on a particular hypothesis, they were much more likely to select examples that confirmed their hypothesis rather than negated it. As a result, they were unable to identify the real rule (any ascending sequence of numbers) and failed to "falsify" their initial assumptions. Falsification is an important tool in the scientist's toolkit when they are testing hypotheses and is an effective way to avoid confirmation bias.
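To see why confirmatory testing is uninformative here, consider this toy sketch in Python. The hidden rule and the subject's hypothesis are the ones described above; the specific triples are illustrative, not Wason's actual stimuli.

```python
# A toy version of Wason's (1960) rule-discovery task.

def true_rule(triple):
    """The experimenter's hidden rule: any strictly ascending sequence."""
    a, b, c = triple
    return a < b < c

def my_hypothesis(triple):
    """A typical subject's guess: numbers increasing by 2 each time."""
    a, b, c = triple
    return b - a == 2 and c - b == 2

# Confirmatory tests: triples chosen because the hypothesis says "yes".
# Every one also fits the true rule, so each test teaches the subject nothing.
for triple in [(2, 4, 6), (10, 12, 14), (100, 102, 104)]:
    print(triple, true_rule(triple))  # all True: hypothesis "confirmed"

# A falsifying test: a triple the hypothesis rejects but the true rule
# accepts. This single test disproves the hypothesis -- the informative
# move that Wason's subjects tended to avoid.
print((1, 2, 3), my_hypothesis((1, 2, 3)), true_rule((1, 2, 3)))  # False True
```

A subject who only generates "+2" triples can collect confirmations forever without ever discovering that the real rule is broader.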

In philosophy, you will be presented with different arguments on issues, such as the nature of the mind or the best way to act in a given situation. You should take your time to reason through these issues carefully and consider alternative views. What you believe to be the case may be right, but you may also fall into the trap of confirmation bias, seeing confirming evidence as better and more convincing than evidence that calls your beliefs into question.

Anchoring Bias

Confirmation bias is closely related to another bias known as anchoring. Anchoring bias refers to our tendency to rely on initial values, prices, or quantities when estimating the actual value, price, or quantity of something. If you are presented with a quantity, even if that number is clearly arbitrary, you will have a hard time discounting it in your subsequent calculations; the initial value "anchors" subsequent estimates. For instance, Tversky and Kahneman (1974) reported an experiment in which subjects were asked to estimate the percentage of African nations in the United Nations. First, the experimenters spun a wheel of fortune in front of the subjects that produced a random number between 0 and 100. Let's say the wheel landed on 79. Subjects were asked whether the percentage of nations was higher or lower than the random number. Subjects were then asked to estimate the real percentage. Even though the initial anchoring value was random, people in the study found it difficult to deviate far from that number. For subjects receiving an initial value of 10, the median estimate was 25 percent, while for subjects receiving an initial value of 65, the median estimate was 45 percent.
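One common way of describing this result is "anchoring and insufficient adjustment": estimates start at the anchor and move only part of the way toward the truth. The Python sketch below is a toy model of that description; the true value, adjustment rate, and noise level are invented parameters, not an analysis from Tversky and Kahneman's paper.

```python
# A toy "anchoring and insufficient adjustment" model. All parameters are
# hypothetical; the point is the qualitative pattern, not the exact numbers.
import random

random.seed(0)
TRUE_VALUE = 35  # hypothetical true percentage being estimated

def anchored_estimate(anchor, adjustment_rate=0.5, noise=5.0):
    """Start at the anchor and adjust only partway toward the true value."""
    return anchor + adjustment_rate * (TRUE_VALUE - anchor) + random.gauss(0, noise)

for anchor in (10, 65):
    estimates = sorted(anchored_estimate(anchor) for _ in range(1001))
    median = estimates[len(estimates) // 2]
    print(f"anchor={anchor:2d} -> median estimate ~ {median:.0f}")
# The low anchor drags the median below the true value and the high anchor
# pulls it above, qualitatively matching the 25-versus-45 pattern above.
```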

In the same paper, Tversky and Kahneman described the way that anchoring bias interferes with statistical reasoning. In a number of scenarios, subjects made irrational judgments about statistics because of the way the question was phrased (i.e., they were tricked when an anchor was inserted into the question). Instead of expending the cognitive energy needed to solve the statistical problem, subjects were much more likely to “go with their gut,” or think intuitively. That type of reasoning generates anchoring bias. When you do philosophy, you will be confronted with some formal and abstract problems that will challenge you to engage in thinking that feels difficult and unnatural. Resist the urge to latch on to the first thought that jumps into your head, and try to think the problem through with all the cognitive resources at your disposal.

Availability Heuristic

The availability heuristic refers to the tendency to evaluate new information based on the most recent or most easily recalled examples. The availability heuristic occurs when people take easily remembered instances as being more representative than they objectively are (i.e., based on statistical probabilities). In very simple situations, the availability of instances is a good guide to judgments. Suppose you are wondering whether you should plan for rain. It may make sense to anticipate rain if it has been raining a lot in the last few days since weather patterns tend to linger in most climates. More generally, scenarios that are well-known to us, dramatic, recent, or easy to imagine are more available for retrieval from memory. Therefore, if we easily remember an instance or scenario, we may incorrectly think that the chances are high that the scenario will be repeated. For instance, people in the United States estimate the probability of dying by violent crime or terrorism much more highly than they ought to. In fact, these are extremely rare occurrences compared to death by heart disease, cancer, or car accidents. But stories of violent crime and terrorism are prominent in the news media and fiction. Because these vivid stories are dramatic and easily recalled, we have a skewed view of how frequently violent crime occurs.
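A small simulation can make the mechanism vivid. In the hypothetical sketch below, dramatic events make up 1 percent of what happens, but they are far more likely to be stored and recalled, so a frequency estimate based on whatever memory returns lands well above the true base rate. Every number here is invented for illustration.

```python
# Illustrative only: biased recall inflates frequency estimates.
import random

random.seed(1)
BASE_RATE = 0.01                       # hypothetical true frequency
RECALL_DRAMATIC = 0.9                  # dramatic events are highly memorable
RECALL_MUNDANE = RECALL_DRAMATIC / 20  # mundane ones mostly fade

# World: 100,000 events, about 1% of them dramatic.
events = [random.random() < BASE_RATE for _ in range(100_000)]

# Memory: each event is stored with a probability that depends on its drama.
recalled = [e for e in events
            if random.random() < (RECALL_DRAMATIC if e else RECALL_MUNDANE)]

estimate = sum(recalled) / len(recalled)
print(f"true rate: {BASE_RATE:.1%}, availability-based estimate: {estimate:.1%}")
# The estimate lands far above 1%, mirroring how rare but vivid risks
# (violent crime, terrorism) get overestimated relative to mundane ones.
```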

Another more loosely defined category of cognitive bias is the tendency for human beings to align themselves with groups with whom they share values and practices. The tendency toward tribalism is an evolutionary advantage for social creatures like human beings. By forming groups to share knowledge and distribute work, we are much more likely to survive. Not surprisingly, human beings with pro-social behaviors persist in the population at higher rates than human beings with antisocial tendencies. Pro-social behaviors, however, go beyond wanting to communicate and align ourselves with other human beings; we also tend to see outsiders as a threat. As a result, tribalistic tendencies both reinforce allegiances among in-group members and increase animosity toward out-group members.

Tribal thinking makes it hard for us to objectively evaluate information that either aligns with or contradicts the beliefs held by our group or tribe. This effect can be demonstrated even when in-group membership is not real or is based on some superficial feature of the person—for instance, the way they look or an article of clothing they are wearing. A related bias is called the bandwagon fallacy . The bandwagon fallacy can lead you to conclude that you ought to do something or believe something because many other people do or believe the same thing. While other people can provide guidance, they are not always reliable. Furthermore, just because many people believe something doesn’t make it true. Watch the video below to improve your “tribal literacy” and understand the dangers of this type of thinking.

The Dangers of Tribalism, Kevin deLaplante

Sunk Cost Fallacy

Sunk costs refer to the time, energy, money, or other costs that have been paid in the past. These costs are "sunk" because they cannot be recovered. The sunk cost fallacy is the tendency to attach greater value to things in which you have already invested resources than those things actually hold today. Human beings have a natural tendency to hang on to whatever they invest in and are loath to give something up even after it has been proven to be a liability. For example, a person may have sunk a lot of money into a business over time, and the business may clearly be failing. Nonetheless, the businessperson will be reluctant to close shop or sell the business because of the time, money, and emotional energy they have spent on the venture. This is the behavior of "throwing good money after bad" by continuing to irrationally invest in something that has lost its worth because of emotional attachment to the failed enterprise. People will engage in this kind of behavior in all kinds of situations and may continue a friendship, a job, or a marriage for the same reason—they don't want to lose their investment even when they are clearly headed for failure and ought to cut their losses.
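The arithmetic behind avoiding the fallacy is simple: the sunk amount is identical whether you continue or quit, so it cancels out of the comparison and only future costs and benefits matter. Here is a minimal sketch with hypothetical figures:

```python
# Forward-looking evaluation with made-up numbers: sunk costs cancel out.
sunk_cost = 50_000       # already spent on the failing venture (unrecoverable)
future_revenue = 20_000  # expected if we keep going
future_cost = 35_000     # additional spend required to keep going

keep_going_value = future_revenue - future_cost  # -15,000 going forward
quit_value = 0                                   # walk away, spend nothing more

# The 50,000 is lost on both branches, so a rational comparison ignores it;
# counting it anyway is "throwing good money after bad."
print("quit" if quit_value > keep_going_value else "keep going")  # -> quit
```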

A similar type of faulty reasoning leads to the gambler’s fallacy , in which a person reasons that future chance events will be more likely if they have not happened recently. For instance, if I flip a coin many times in a row, I may get a string of heads. But even if I flip several heads in a row, that does not make it more likely I will flip tails on the next coin flip. Each coin flip is statistically independent, and there is an equal chance of turning up heads or tails. The gambler, like the reasoner from sunk costs, is tied to the past when they should be reasoning about the present and future.
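A quick simulation, illustrative only, makes the independence point: even immediately after a run of five heads, tails still comes up about half the time.

```python
# Fair-coin simulation: the flip after a five-head streak is still 50/50.
import random

random.seed(42)

streaks_seen = 0
tails_after_streak = 0
run = 0  # length of the current streak of heads
for _ in range(1_000_000):
    heads = random.random() < 0.5
    if run >= 5:              # the previous five flips were all heads
        streaks_seen += 1
        if not heads:
            tails_after_streak += 1
    run = run + 1 if heads else 0

print(f"P(tails | five heads in a row) ~ {tails_after_streak / streaks_seen:.3f}")
# Prints a value near 0.500: past flips do not make tails "due".
```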

There are important social and evolutionary purposes for past-looking thinking. Sunk-cost thinking keeps parents engaged in the growth and development of their children after they are born. Sunk-cost thinking builds loyalty and affection among friends and family. More generally, a commitment to sunk costs encourages us to engage in long-term projects, and this type of thinking has the evolutionary purpose of fostering culture and community. Nevertheless, it is important to periodically reevaluate our investments in both people and things.

In recent ethical scholarship, there is some debate about how to assess the sunk costs of moral decisions. Consider the case of war. Just-war theory dictates that wars may be justified in cases where the harm imposed on the adversary is proportional to the good gained by the act of defense or deterrence. It may be that, at the start of the war, those costs seemed proportional. But after the war has dragged on for some time, it may seem that the objective cannot be obtained without a greater quantity of harm than had been initially imagined. Should the evaluation of whether a war is justified estimate the total amount of harm done or prospective harm that will be done going forward (Lazar 2018)? Such questions do not have easy answers.

Table 2.1 summarizes these common cognitive biases.

  • Confirmation bias: the tendency to search for, interpret, favor, and recall information that confirms or supports prior beliefs. Example: as part of their morning routine, a person scans news headlines on the internet and chooses to read only those stories that confirm views they already hold.
  • Anchoring bias: the tendency to rely on initial values, prices, or quantities when estimating the actual value, price, or quantity of something. Example: when supplied with a random number and then asked to provide a number estimate in response to a question, people supply a number close to the random number they were initially given.
  • Availability heuristic: the tendency to evaluate new information based on the most recent or most easily recalled examples. Example: people in the United States overestimate the probability of dying in a criminal attack, since these types of stories are easy to vividly recall.
  • Tribalism: the tendency for human beings to align themselves with groups with whom they share values and practices. Example: people with a strong commitment to one political party often struggle to objectively evaluate the political positions of those who are members of the opposing party.
  • Bandwagon fallacy: the tendency to do something or believe something because many other people do or believe the same thing. Example: advertisers often rely on the bandwagon fallacy, attempting to create the impression that "everyone" is buying a new product, in order to inspire others to buy it.
  • Sunk cost fallacy: the tendency to attach a value to things in which resources have been invested that is greater than the value those things actually have. Example: a businessperson continues to invest money in a failing venture, "throwing good money after bad."
  • Gambler's fallacy: the tendency to reason that future chance events will be more likely if they have not happened recently. Example: someone who regularly buys lottery tickets reasons that they are "due to win," since they haven't won once in twenty years.

Table 2.1 Common Cognitive Biases

Think Like A Philosopher

As we have seen, cognitive biases are built into the way human beings process information. They are common to us all, and it takes self-awareness and effort to overcome the tendency to fall back on biases. Consider a time when you have fallen prey to one of the cognitive biases described above. What were the circumstances? Recall your thought process. Were you aware at the time that your thinking was misguided? What were the consequences of succumbing to that cognitive bias?

Write a short paragraph describing how that cognitive bias allowed you to make a decision you now realize was irrational. Then write a second paragraph describing how, with the benefit of time and distance, you would have thought differently about the incident that triggered the bias. Use the tools of critical reflection and metacognition to improve your approach to this situation. What might have been the consequences of behaving differently? Finally, write a short conclusion describing what lesson you take from reflecting back on this experience. Does it help you understand yourself better? Will you be able to act differently in the future? What steps can you take to avoid cognitive biases in your thinking today?


Critical Thinking Tutorial: Common Cognitive Biases

Cognitive biases can distort the way we interpret information, leading to false perceptions of reality, and affecting our ability to reason. Biases also prevent us from considering diverse perspectives, weighing evidence objectively, and relying on accurate information to make decisions. As a critical thinker, being aware of cognitive biases can help you actively challenge and limit their influence.

There are far too many cognitive biases to list in this tutorial (see the cognitive bias codex), but the accompanying video explains twelve common ones.



Warren Berger

A Crash Course in Critical Thinking

What you need to know—and read—about one of the essential skills needed today.

Posted April 8, 2024 | Reviewed by Michelle Quirk

  • In research for "A More Beautiful Question," I did a deep dive into the current crisis in critical thinking.
  • Many people may think of themselves as critical thinkers, but they actually are not.
  • Here is a series of questions you can ask yourself to try to ensure that you are thinking critically.

Conspiracy theories. Inability to distinguish facts from falsehoods. Widespread confusion about who and what to believe.

These are some of the hallmarks of the current crisis in critical thinking—which just might be the issue of our times. Because if people aren’t willing or able to think critically as they choose potential leaders, they’re apt to choose bad ones. And if they can’t judge whether the information they’re receiving is sound, they may follow faulty advice while ignoring recommendations that are science-based and solid (and perhaps life-saving).

Moreover, as a society, if we can’t think critically about the many serious challenges we face, it becomes more difficult to agree on what those challenges are—much less solve them.

On a personal level, critical thinking can enable you to make better everyday decisions. It can help you make sense of an increasingly complex and confusing world.

In the new expanded edition of my book A More Beautiful Question (AMBQ), I took a deep dive into critical thinking. Here are a few key things I learned.

First off, before you can get better at critical thinking, you should understand what it is. It's not just about being a skeptic. When thinking critically, we are thoughtfully reasoning, evaluating, and making decisions based on evidence and logic. And—perhaps most important—while doing this, a critical thinker always strives to be open-minded and fair-minded. That's not easy: It demands that you constantly question your assumptions and biases and that you always remain open to considering opposing views.

In today’s polarized environment, many people think of themselves as critical thinkers simply because they ask skeptical questions—often directed at, say, certain government policies or ideas espoused by those on the “other side” of the political divide. The problem is, they may not be asking these questions with an open mind or a willingness to fairly consider opposing views.

When people do this, they're engaging in "weak-sense critical thinking"—a term popularized by the late Richard Paul, a co-founder of The Foundation for Critical Thinking. "Weak-sense critical thinking" means applying the tools and practices of critical thinking—questioning, investigating, evaluating—but with the sole purpose of confirming one's own bias or serving an agenda.

In AMBQ, I lay out a series of questions you can ask yourself to try to ensure that you're thinking critically. Here are some of the questions to consider:

  • Why do I believe what I believe?
  • Are my views based on evidence?
  • Have I fairly and thoughtfully considered differing viewpoints?
  • Am I truly open to changing my mind?

Of course, becoming a better critical thinker is not as simple as just asking yourself a few questions. Critical thinking is a habit of mind that must be developed and strengthened over time. In effect, you must train yourself to think in a manner that is more effortful, aware, grounded, and balanced.

For those interested in giving themselves a crash course in critical thinking—something I did myself, as I was working on my book—I thought it might be helpful to share a list of some of the books that have shaped my own thinking on this subject. As a self-interested author, I naturally would suggest that you start with the new 10th-anniversary edition of A More Beautiful Question, but beyond that, here are the top eight critical-thinking books I'd recommend.

The Demon-Haunted World: Science as a Candle in the Dark, by Carl Sagan

This book simply must top the list, because the late scientist and author Carl Sagan continues to be such a bright shining light in the critical thinking universe. Chapter 12 includes the details on Sagan’s famous “baloney detection kit,” a collection of lessons and tips on how to deal with bogus arguments and logical fallacies.


Clear Thinking: Turning Ordinary Moments Into Extraordinary Results, by Shane Parrish

The creator of the Farnam Street website and host of the "Knowledge Project" podcast explains how to contend with biases and unconscious reactions so you can make better everyday decisions. It contains insights from many of the brilliant thinkers Shane has studied.

Good Thinking: Why Flawed Logic Puts Us All at Risk and How Critical Thinking Can Save the World, by David Robert Grimes

A brilliant, comprehensive 2021 book on critical thinking that, to my mind, hasn't received nearly enough attention. The scientist Grimes dissects bad thinking, shows why it persists, and offers the tools to defeat it.

Think Again: The Power of Knowing What You Don't Know, by Adam Grant

Intellectual humility—being willing to admit that you might be wrong—is what this book is primarily about. But Adam, the renowned Wharton psychology professor and bestselling author, takes the reader on a mind-opening journey with colorful stories and characters.

Think Like a Detective: A Kid's Guide to Critical Thinking, by David Pakman

The popular YouTuber and podcast host Pakman—normally known for talking politics—has written a terrific primer on critical thinking for children. The illustrated book presents critical thinking as a "superpower" that enables kids to unlock mysteries and dig for truth. (I also recommend Pakman's second kids' book, Think Like a Scientist.)

Rationality: What It Is, Why It Seems Scarce, Why It Matters, by Steven Pinker

The Harvard psychology professor Pinker tackles conspiracy theories head-on but also explores concepts involving risk/reward, probability and randomness, and correlation/causation. And if that strikes you as daunting, be assured that Pinker makes it lively and accessible.

How Minds Change: The Surprising Science of Belief, Opinion and Persuasion, by David McRaney

David is a science writer who hosts the popular podcast "You Are Not So Smart" (and his ideas are featured in A More Beautiful Question). His well-written book looks at ways you can actually get through to people who see the world very differently than you (hint: bludgeoning them with facts definitely won't work).

A Healthy Democracy's Best Hope: Building the Critical Thinking Habit, by M. Neil Browne and Chelsea Kulhanek

Neil Browne, author of the seminal Asking the Right Questions: A Guide to Critical Thinking, has been a pioneer in presenting critical thinking as a question-based approach to making sense of the world around us. His newest book, co-authored with Chelsea Kulhanek, breaks down critical thinking into “11 explosive questions”—including the “priors question” (which challenges us to question assumptions), the “evidence question” (focusing on how to evaluate and weigh evidence), and the “humility question” (which reminds us that a critical thinker must be humble enough to consider the possibility of being wrong).

Warren Berger

Warren Berger is a longtime journalist and author of A More Beautiful Question .


13 Types of Common Cognitive Biases That Might Be Impairing Your Judgment

Which of these sway your thinking the most?

Kendra Cherry, MS, is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."


Amy Morin, LCSW, is a psychotherapist and international bestselling author. Her books, including "13 Things Mentally Strong People Don't Do," have been translated into more than 40 languages. Her TEDx talk,  "The Secret of Becoming Mentally Strong," is one of the most viewed talks of all time.



Although we like to believe that we're rational and logical, the fact is that we are continually under the influence of cognitive biases. These biases distort thinking, influence beliefs, and sway the decisions and judgments that people make each and every day.

Sometimes, cognitive biases are fairly obvious. You might even find that you recognize these tendencies in yourself or others. In other cases, these biases are so subtle that they are almost impossible to notice.

At a Glance

Attention is a limited resource. This means we can't possibly evaluate every possible detail and event when forming thoughts and opinions. Because of this, we often rely on mental shortcuts that speed up our ability to make judgments, but this can sometimes lead to bias. There are many types of biases—including the confirmation bias, the hindsight bias, and the anchoring bias, just to name a few—that can influence our beliefs and actions daily.

The following are just a few types of cognitive biases that have a powerful influence on how you think, how you feel, and how you behave.


The confirmation bias is the tendency to listen more often to information that confirms our existing beliefs. Through this bias, people tend to favor information that reinforces the things they already think or believe.

Examples include:

  • Only paying attention to information that confirms your beliefs about issues such as gun control and global warming
  • Only following people on social media who share your viewpoints
  • Choosing news sources that present stories that support your views
  • Refusing to listen to the opposing side
  • Not considering all of the facts in a logical and rational manner

There are a few reasons why this happens. One is that only seeking to confirm existing opinions helps limit mental resources we need to use to make decisions. It also helps protect self-esteem by making people feel that their beliefs are accurate.

People on two sides of an issue can listen to the same story and walk away with different interpretations that they feel validate their existing point of view. This is often a sign that confirmation bias is shaping their opinions.

The problem with this is that it can lead to poor choices, an inability to listen to opposing views, or even contribute to othering people who hold different opinions.

Things that we can do to help reduce the impact of confirmation bias include being open to hearing others' opinions, actively seeking out and researching opposing views, reading full articles (and not just headlines), questioning the source, and doing the research yourself to see whether it is a reliable source.

The hindsight bias is a common cognitive bias that involves the tendency to see events, even random ones, as more predictable than they are. It's also commonly referred to as the "I knew it all along" phenomenon.

Some examples of the hindsight bias include:

  • Insisting that you knew who was going to win a football game once the event is over
  • Believing that you knew all along that one political candidate was going to win an election
  • Saying that you knew you weren't going to win after losing a coin flip with a friend
  • Looking back on an exam and thinking that you knew the answers to the questions you missed
  • Believing you could have predicted which stocks would become profitable

Classic Research

In one classic psychology experiment, college students were asked to predict whether they thought then-nominee Clarence Thomas would be confirmed to the U.S. Supreme Court.

Prior to the Senate vote, 58% of the students thought Thomas would be confirmed. The students were polled again following Thomas's confirmation, and a whopping 78% of students said they had believed Thomas would be confirmed.  

The hindsight bias occurs for a combination of reasons, including our ability to "misremember" previous predictions, our tendency to view events as inevitable, and our tendency to believe we could have foreseen certain events.

The effect of this bias is that it causes us to overestimate our ability to predict events. This can sometimes lead people to take unwise risks.

The anchoring bias is the tendency to be overly influenced by the first piece of information that we hear. Some examples of how this works:

  • The first number voiced during a price negotiation typically becomes the anchoring point from which all further negotiations are based.
  • Hearing a random number can influence estimates on completely unrelated topics.
  • Doctors can become susceptible to the anchoring bias when diagnosing patients. The physician’s first impressions of the patient often create an anchoring point that can sometimes incorrectly influence all subsequent diagnostic assessments.

While the existence of the anchoring bias is well documented, its causes are still not fully understood. Some research suggests that the source of the anchor information may play a role. Other factors such as priming and mood also appear to have an influence.

Like other cognitive biases, anchoring can have an effect on the decisions you make each day. For instance, it can influence how much you are willing to pay for your home. However, it can sometimes lead to poor choices and make it more difficult for people to consider other factors that might also be important.

The misinformation effect is the tendency for memories to be heavily influenced by things that happened after the actual event itself. A person who witnesses a car accident or crime might believe that their recollection is crystal clear, but researchers have found that memory is surprisingly susceptible to even very subtle influences.

For example:

  • Research has shown that simply asking questions about an event can change someone's memories of what happened.
  • Watching television coverage may change how people remember the event.
  • Hearing other people talk about a memory from their perspective may change your memory of what transpired.

Classic Memory Research

In one classic experiment by memory expert Elizabeth Loftus, people who watched a video of a car crash were then asked one of two slightly different questions: "How fast were the cars going when they hit each other?" or "How fast were the cars going when they smashed into each other?"

When the witnesses were then questioned a week later whether they had seen any broken glass, those who had been asked the “smashed into” version of the question were more likely to report incorrectly that they had seen broken glass.

There are a few factors that may play a role in this phenomenon. New information may get blended with older memories. In other cases, new information may be used to fill in "gaps" in memory.

The effects of misinformation can range from the trivial to much more serious. It might cause you to misremember something you thought happened at work, or it might lead to someone incorrectly identifying the wrong suspect in a criminal case.

The actor-observer bias is the tendency to attribute our actions to external influences and other people's actions to internal ones. The way we perceive others and how we attribute their actions hinges on a variety of variables, but it can be heavily influenced by whether we are the actor or the observer in a situation.

When it comes to our own actions, we are often far too likely to attribute things to external influences. For example:

  • You might complain that you botched an important meeting because you had jet lag.
  • You might say you failed an exam because the teacher posed too many trick questions.

When it comes to explaining other people’s actions, however, we are far more likely to attribute their behaviors to internal causes. For example:

  • A colleague screwed up an important presentation because he’s lazy and incompetent (not because he also had jet lag).
  • A fellow student bombed a test because they lack diligence and intelligence (and not because they took the same test as you with all those trick questions).

While there are many factors that may play a role, perspective plays a key role. When we are the actors in a situation, we are able to observe our own thoughts and behaviors. When it comes to other people, however, we cannot see what they are thinking. This means we focus on situational forces for ourselves, but guess at the internal characteristics that cause other people's actions.

The problem with this is that it often leads to misunderstandings. Each side of a situation is essentially blaming the other side rather than thinking about all of the variables that might be playing a role.

The false consensus effect is the tendency people have to overestimate how much other people agree with their own beliefs, behaviors, attitudes, and values. For example:

  • Thinking that other people share your opinion on controversial topics
  • Overestimating the number of people who are similar to you
  • Believing that the majority of people share your preferences

Researchers believe that the false consensus effect happens for a variety of reasons. First, the people we spend the most time with, our family and friends, do often tend to share very similar opinions and beliefs. Because of this, we start to think that this way of thinking is the majority opinion even when we are with people who are not among our group of family and friends.

Another key reason this cognitive bias trips us up so easily is that believing that other people are just like us is good for our self-esteem. It allows us to feel "normal" and maintain a positive view of ourselves in relation to other people.

This can lead people not only to incorrectly think that everyone else agrees with them—it can sometimes lead them to overvalue their own opinions. It also means that we sometimes don't consider how other people might feel when making choices.

The halo effect is the tendency for an initial impression of a person to influence what we think of them overall. Also known as the "physical attractiveness stereotype" or the "what is beautiful is good" principle, the halo effect shapes our judgments of others (and others' judgments of us) almost every day. For example:

  • Thinking people who are good-looking are also smarter, kinder, and funnier than less attractive people
  • Believing that products marketed by attractive people are also more valuable
  • Thinking that a political candidate who is confident must also be intelligent and competent

One factor that may influence the halo effect is our tendency to want to be correct. If our initial impression of someone was positive, we want to look for proof that our assessment was accurate. It also helps people avoid experiencing cognitive dissonance, which involves holding contradictory beliefs.

This cognitive bias can have a powerful impact in the real world. For example, job applicants perceived as attractive and likable are also more likely to be viewed as competent, smart, and qualified for the job.

The self-serving bias is the tendency for people to give themselves credit for successes but lay the blame for failures on outside causes. When you do well on a project, you probably assume that it's because you worked hard. But when things turn out badly, you are more likely to blame it on circumstances or bad luck.

Some examples of this:

  • Attributing good grades to being smart or studying hard
  • Believing your athletic performance is due to practice and hard work
  • Thinking you got the job because of your merits

The self-serving bias can be influenced by a variety of factors. Age and sex have been shown to play a part. Older people are more likely to take credit for their successes, while men are more likely to pin their failures on outside forces.  

This bias does serve an important role in protecting self-esteem. However, it can often also lead to faulty attributions such as blaming others for our own shortcomings.

The availability heuristic is the tendency to estimate the probability of something happening based on how many examples readily come to mind. Some examples of this:

  • After seeing several news reports of car thefts in your neighborhood, you might start to believe that such crimes are more common than they are.
  • You might believe that plane crashes are more common than they really are because you can easily think of several examples.

It is essentially a mental shortcut designed to save us time when we are trying to determine risk. The problem with relying on this way of thinking is that it often leads to poor estimates and bad decisions.

Smokers who have never known someone to die of a smoking-related illness, for example, might underestimate the health risks of smoking. In contrast, if you have two sisters and five neighbors who have had breast cancer, you might believe it is even more common than statistics suggest.

The optimism bias is a tendency to overestimate the likelihood that good things will happen to us while underestimating the probability that negative events will impact our lives. Essentially, we tend to be too optimistic for our own good.

For example, we may assume that negative events simply won't affect us.

The optimism bias has roots in the availability heuristic. Because you can probably think of examples of bad things happening to other people, it seems more likely that others will be affected by negative events.

This bias can lead people to take health risks like smoking, eating poorly, or not wearing a seat belt. The bad news is that research has found that this optimism bias is incredibly difficult to reduce.

There is good news, however. This tendency toward optimism helps create a sense of anticipation for the future, giving people the hope and motivation they need to pursue their goals.

Other Kinds of Cognitive Bias

Many other cognitive biases can distort how we perceive the world. Just a partial list:

  • Status quo bias reflects a desire to keep things as they are.
  • Apophenia is the tendency to perceive patterns in random occurrences.
  • Framing is presenting a situation in a way that gives a certain impression.

Keep in Mind

The cognitive biases above are common, but this is only a sampling of the many biases that can affect your thinking. These biases collectively influence much of our thoughts and ultimately, decision making.

Many of these biases are inevitable. We simply don't have the time to evaluate every thought in every decision for the presence of any bias. Understanding these biases is very helpful in learning how they can lead us to poor decisions in life.

Dietrich D, Olson M. A demonstration of hindsight bias using the Thomas confirmation vote. Psychol Rep. 1993;72(2):377-378. doi:10.2466/pr0.1993.72.2.377

Lee KK. An indirect debiasing method: Priming a target attribute reduces judgmental biases in likelihood estimations. PLoS ONE. 2019;14(3):e0212609. doi:10.1371/journal.pone.0212609

Saposnik G, Redelmeier D, Ruff CC, Tobler PN. Cognitive biases associated with medical decisions: A systematic review. BMC Med Inform Decis Mak. 2016;16(1):138. doi:10.1186/s12911-016-0377-1

Furnham A, Boo HC. A literature review of anchoring bias. The Journal of Socio-Economics. 2011;40(1):35-42. doi:10.1016/j.socec.2010.10.008

Loftus EF. Leading questions and the eyewitness report. Cognitive Psychology. 1975;7(4):560-572. doi:10.1016/0010-0285(75)90023-7

Challies DM, Hunt M, Garry M, Harper DN. Whatever gave you that idea? False memories following equivalence training: a behavioral account of the misinformation effect. J Exp Anal Behav. 2011;96(3):343-362. doi:10.1901/jeab.2011.96-343

Miyamoto R, Kikuchi Y. Gender differences of brain activity in the conflicts based on implicit self-esteem. PLoS ONE. 2012;7(5):e37901. doi:10.1371/journal.pone.0037901

Weinstein ND, Klein WM. Resistance of personal risk perceptions to debiasing interventions. Health Psychol. 1995;14(2):132-140. doi:10.1037/0278-6133.14.2.132

Gratton G, Cooper P, Fabiani M, Carter CS, Karayanidis F. Dynamics of cognitive control: theoretical bases, paradigms, and a view for the future. Psychophysiology. 2018;55(3). doi:10.1111/psyp.13016

By Kendra Cherry, MSEd. Kendra Cherry is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."

41+ Critical Thinking Examples (Definition + Practices)


Critical thinking is an essential skill in our information-overloaded world, where figuring out what is fact and fiction has become increasingly challenging.

But why is critical thinking essential? Put simply, critical thinking empowers us to make better decisions, challenge and validate our beliefs and assumptions, and understand and interact with the world more effectively and meaningfully.

Critical thinking is like using your brain's "superpowers" to make smart choices. Whether it's picking the right insurance, deciding what to do in a job, or discussing topics in school, thinking deeply helps a lot. In the next parts, we'll share real-life examples of when this superpower comes in handy and give you some fun exercises to practice it.

Critical Thinking Process Outline


Critical thinking means thinking clearly and fairly without letting personal feelings get in the way. It's like being a detective, trying to solve a mystery by using clues and thinking hard about them.

It isn't always easy to think critically; it can take real effort to notice the questions that aren't being answered in a given situation. But we can train our brains to think more like puzzle solvers, which helps develop our critical thinking skills.

Here's what it looks like step by step:

Spotting the Problem: It's like discovering a puzzle to solve. You see that there's something you need to figure out or decide.

Collecting Clues: Now, you need to gather information. Maybe you read about it, watch a video, talk to people, or do some research. It's like getting all the pieces to solve your puzzle.

Breaking It Down: This is where you look at all your clues and try to see how they fit together. You're asking questions like: Why did this happen? What could happen next?

Checking Your Clues: You want to make sure your information is good. This means seeing if what you found out is true and if you can trust where it came from.

Making a Guess: After looking at all your clues, you think about what they mean and come up with an answer. This answer is like your best guess based on what you know.

Explaining Your Thoughts: Now, you tell others how you solved the puzzle. You explain how you thought about it and how you answered. 

Checking Your Work: This is like looking back and seeing if you missed anything. Did you make any mistakes? Did you let any personal feelings get in the way? This step helps make sure your thinking is clear and fair.

And remember, you might sometimes need to go back and redo some steps if you discover something new. If you realize you missed an important clue, you might have to go back and collect more information.

Critical Thinking Methods

Just like doing push-ups or running helps our bodies get stronger, there are special exercises that help our brains think better. These brain workouts push us to think harder, look at things closely, and ask many questions.

It's not always about finding the "right" answer. Instead, it's about the journey of thinking and asking "why" or "how." Doing these exercises often helps us become better thinkers and makes us curious to know more about the world.

Now, let's look at some brain workouts to help us think better:

1. "What If" Scenarios

Imagine crazy things happening, like, "What if there was no internet for a month? What would we do?" These games help us think of new and different ideas.

2. Debate Both Sides

Pick a hot topic. Argue one side of it and then try arguing the opposite. This makes us see different viewpoints and think deeply about a topic.

3. Analyze Visual Data

Check out charts or pictures with lots of numbers and info but no explanations. What story are they telling? This helps us get better at understanding information just by looking at it.

4. Mind Mapping

Write an idea in the center and then draw lines to related ideas. It's like making a map of your thoughts. This helps us see how everything is connected.

There's lots of mind-mapping software out there, but it's also nice to do this by hand.

5. Weekly Diary

Every week, write about what happened, the choices you made, and what you learned. Writing helps us think about our actions and how we can do better.

6. Evaluating Information Sources

Collect stories or articles about one topic from newspapers or blogs. Which ones are trustworthy? Which ones might be a little biased? This teaches us to be smart about where we get our info.

There are many resources to help you determine if information sources are factual or not.

7. Socratic Questioning

This way of thinking is called the Socrates Method, named after an old-time thinker from Greece. It's about asking lots of questions to understand a topic. You can do this by yourself or chat with a friend.

Start with a Big Question:

"What does 'success' mean?"

Dive Deeper with More Questions:

"Why do you think of success that way?" "Do TV shows, friends, or family make you think that?" "Does everyone think about success the same way?"

"Can someone be a winner even if they aren't rich or famous?" "Can someone feel like they didn't succeed, even if everyone else thinks they did?"

Look for Real-life Examples:

"Who is someone you think is successful? Why?" "Was there a time you felt like a winner? What happened?"

Think About Other People's Views:

"How might a person from another country think about success?" "Does the idea of success change as we grow up or as our life changes?"

Think About What It Means:

"How does your idea of success shape what you want in life?" "Are there problems with only wanting to be rich or famous?"

Look Back and Think:

"After talking about this, did your idea of success change? How?" "Did you learn something new about what success means?"


8. Six Thinking Hats 

Edward de Bono came up with a cool way to solve problems by thinking in six different ways, like wearing different colored hats. You can do this independently, but it might be more effective in a group so everyone can have a different hat color. Each color has its way of thinking:

White Hat (Facts): Just the facts! Ask, "What do we know? What do we need to find out?"

Red Hat (Feelings): Talk about feelings. Ask, "How do I feel about this?"

Black Hat (Careful Thinking): Be cautious. Ask, "What could go wrong?"

Yellow Hat (Positive Thinking): Look on the bright side. Ask, "What's good about this?"

Green Hat (Creative Thinking): Think of new ideas. Ask, "What's another way to look at this?"

Blue Hat (Planning): Organize the talk. Ask, "What should we do next?"

When using this method with a group:

  • Explain all the hats.
  • Decide which hat to wear first.
  • Make sure everyone switches hats at the same time.
  • Finish with the Blue Hat to plan the next steps.

9. SWOT Analysis

SWOT Analysis is like a game plan for businesses to know where they stand and where they should go. "SWOT" stands for Strengths, Weaknesses, Opportunities, and Threats.

There are a lot of SWOT templates out there for doing this visually, but you can also just think it through. It doesn't only apply to businesses; it can also be a good way to judge whether a project you're working on is on track.

Strengths: What's working well? Ask, "What are we good at?"

Weaknesses: Where can we do better? Ask, "Where can we improve?"

Opportunities: What good things might come our way? Ask, "What chances can we grab?"

Threats: What challenges might we face? Ask, "What might make things tough for us?"

Steps to do a SWOT Analysis:

  • Goal: Decide what you want to find out.
  • Research: Learn about your business and the world around it.
  • Brainstorm: Get a group and think together. Talk about strengths, weaknesses, opportunities, and threats.
  • Pick the Most Important Points: Some things might be more urgent or important than others.
  • Make a Plan: Decide what to do based on your SWOT list.
  • Check Again Later: Things change, so look at your SWOT again after a while to update it.

Now that you have a few tools for thinking critically, let’s get into some specific examples.

Everyday Examples

Life is a series of decisions. From the moment we wake up, we're faced with choices – some trivial, like choosing a breakfast cereal, and some more significant, like buying a home or confronting an ethical dilemma at work. While it might seem that these decisions are disparate, they all benefit from the application of critical thinking.

10. Deciding to buy something

Imagine you want a new phone. Don't just buy it because the ad looks cool. Think about what you need in a phone. Look up different phones and see what people say about them. Choose the one that's the best deal for what you want.

11. Deciding what is true

There's a lot of news everywhere. Don't believe everything right away. Think about why someone might be telling you this. Check if what you're reading or watching is true. Make up your mind after you've looked into it.

12. Deciding when you’re wrong

Sometimes, friends can have disagreements. Don't just get mad right away. Try to see where they're coming from. Talk about what's going on. Find a way to fix the problem that's fair for everyone.

13. Deciding what to eat

There's always a new diet or exercise that's popular. Don't just follow it because it's trendy. Find out if it's good for you. Ask someone who knows, like a doctor. Make choices that make you feel good and stay healthy.

14. Deciding what to do today

Everyone is busy with school, chores, and hobbies. Make a list of things you need to do. Decide which ones are most important. Plan your day so you can get things done and still have fun.

15. Making Tough Choices

Sometimes, it's hard to know what's right. Think about how each choice will affect you and others. Talk to people you trust about it. Choose what feels right in your heart and is fair to others.

16. Planning for the Future

Big decisions, like where to go to school, can be tricky. Think about what you want in the future. Look at the good and bad of each choice. Talk to people who know about it. Pick what feels best for your dreams and goals.


Job Examples

17. Solving Problems

When a machine breaks at a factory, workers brainstorm ways to fix it quickly without making things worse.

18. Decision Making

A store manager decides which products to order more of based on what's selling best.

19. Setting Goals

A team leader helps their team decide what tasks are most important to finish this month and which can wait.

20. Evaluating Ideas

At a team meeting, everyone shares ideas for a new project. The group discusses each idea's pros and cons before picking one.

21. Handling Conflict

Two workers disagree on how to do a job. Instead of arguing, they talk calmly, listen to each other, and find a solution they both like.

22. Improving Processes

A cashier thinks of a faster way to ring up items so customers don't have to wait as long.

23. Asking Questions

Before starting a big task, an employee asks for clear instructions and checks if they have the necessary tools.

24. Checking Facts

Before presenting a report, someone double-checks all their information to make sure there are no mistakes.

25. Planning for the Future

A business owner thinks about what might happen in the next few years, like new competitors or changes in what customers want, and makes plans based on those thoughts.

26. Understanding Perspectives

A team is designing a new toy. They think about what kids and parents would both like instead of just what they think is fun.

School Examples

27. Researching a Topic

For a history project, a student looks up different sources to understand an event from multiple viewpoints.

28. Debating an Issue

In a class discussion, students pick sides on a topic, like school uniforms, and share reasons to support their views.

29. Evaluating Sources

While writing an essay, a student checks if the information from a website is trustworthy or might be biased.

30. Problem Solving in Math

When stuck on a tricky math problem, a student tries different methods to find the answer instead of giving up.

31. Analyzing Literature

In English class, students discuss why a character in a book made certain choices and what those decisions reveal about them.

32. Testing a Hypothesis

For a science experiment, students guess what will happen and then conduct tests to see if they're right or wrong.

33. Giving Peer Feedback

After reading a classmate's essay, a student offers suggestions for improving it.

34. Questioning Assumptions

In a geography lesson, students consider why certain countries are called "developed" and what that label means.

35. Designing a Study

For a psychology project, students plan an experiment to understand how people's memories work and think of ways to ensure accurate results.

36. Interpreting Data

In a science class, students look at charts and graphs from a study, then discuss what the information tells them and if there are any patterns.

Critical Thinking Puzzles


Not all scenarios will have a single correct answer that can be figured out by thinking critically. Sometimes we have to think critically about ethical choices or moral behaviors. 

Here are some mind games and scenarios you can solve using critical thinking. You can see the solution(s) at the end of the post.

37. The Farmer, Fox, Chicken, and Grain Problem

A farmer is at a riverbank with a fox, a chicken, and a grain bag. He needs to get all three items across the river. However, his boat can only carry himself and one of the three items at a time. 

Here's the challenge:

  • If the fox is left alone with the chicken, the fox will eat the chicken.
  • If the chicken is left alone with the grain, the chicken will eat the grain.

How can the farmer get all three items across the river without any item being eaten? 

38. The Rope, Jar, and Pebbles Problem

You are in a room with two long ropes hanging from the ceiling. Each rope is just out of arm's reach from the other, so you can't hold onto one rope and reach the other simultaneously. 

Your task is to tie the two rope ends together, but you can't move the position where they hang from the ceiling.

You are given a jar full of pebbles. How do you complete the task?

39. The Two Guards Problem

Imagine there are two doors. One door leads to certain doom, and the other leads to freedom. You don't know which is which.

In front of each door stands a guard. One guard always tells the truth. The other guard always lies. You don't know which guard is which.

You can ask only one question to one of the guards. What question should you ask to find the door that leads to freedom?

40. The Hourglass Problem

You have two hourglasses. One measures 7 minutes when turned over, and the other measures 4 minutes. Using just these hourglasses, how can you time exactly 9 minutes?

41. The Lifeboat Dilemma

Imagine you're on a ship that's sinking. You get on a lifeboat, but it's already too full and might flip over. 

Nearby in the water, five people are struggling: a scientist close to finding a cure for a sickness, an old couple who've been together for a long time, a mom with three kids waiting at home, and a tired teenager who helped save others but is now in danger. 

You can only save one person without making the boat flip. Who would you choose?

42. The Tech Dilemma

You work at a tech company and help make a computer program to help small businesses. You're almost ready to share it with everyone, but you find out there might be a small chance it has a problem that could show users' private info. 

If you decide to fix it, you must wait two more months before sharing it. But your bosses want you to share it now. What would you do?

43. The History Mystery

Dr. Amelia is a history expert. She's studying where a group of people traveled long ago. She reads old letters and documents to learn about it. But she finds some letters that tell a different story than what most people believe. 

If she says this new story is true, it could change what people learn in school and what they think about history. What should she do?

The Role of Bias in Critical Thinking

Have you ever decided you don’t like someone before you even know them? Or maybe someone shared an idea with you that you immediately loved without even knowing all the details. 

This experience is called bias, which occurs when you like or dislike something or someone without a good reason or knowing why. It can also take shape in certain reactions to situations, like a habit or instinct. 

Bias comes from our own experiences, what friends or family tell us, or even things we are born believing. Sometimes, bias can help us stay safe, but other times it stops us from seeing the truth.

Not all bias is bad. Bias can be a mechanism for assessing our potential safety in a new situation. If we are biased to think that anything long, thin, and curled up is a snake, we might treat a rope as something to be afraid of before we know it is just a rope.

While bias might serve us in some situations (like jumping out of the way of an actual snake before we have time to process that we need to be jumping out of the way), it often harms our ability to think critically.

How Bias Gets in the Way of Good Thinking

Selective Perception: We only notice things that match our ideas and ignore the rest. 

It's like only picking red candies from a mixed bowl because you think they taste the best, but they taste the same as every other candy in the bowl. It could also be when we see all the signs that our partner is cheating on us but choose to ignore them because we are happy the way we are (or at least, we think we are).

Agreeing with Yourself: This is called “ confirmation bias ” when we only listen to ideas that match our own and seek, interpret, and remember information in a way that confirms what we already think we know or believe. 

An example is when someone wants to know if it is safe to vaccinate their children but already believes that vaccines are not safe, so they only look for information supporting the idea that vaccines are bad.

Thinking We Know It All: Similar to confirmation bias, this is called “overconfidence bias.” Sometimes we think our ideas are the best and don't listen to others. This can stop us from learning.

Have you ever met someone you would consider a "know-it-all"? They probably have a lot of overconfidence bias, because while they may know many things accurately, they can't know everything. Still, if they act like they do, they are showing overconfidence bias.

There's a related kind of bias called the Dunning-Kruger effect: when someone is bad at what they do, but they believe and act like they are the best.

Following the Crowd: This is formally called “groupthink”. It's hard to speak up with a different idea if everyone agrees. But this can lead to mistakes.

An example of this we’ve all likely seen is the cool clique in primary school. There is usually one person that is the head of the group, the “coolest kid in school”, and everyone listens to them and does what they want, even if they don’t think it’s a good idea.

How to Overcome Biases

Here are a few ways to learn to think better, free from our biases (or at least aware of them!).

Know Your Biases: Realize that everyone has biases. If we know about them, we can think better.

Listen to Different People: Talking to different kinds of people can give us new ideas.

Ask Why: Always ask yourself why you believe something. Is it true, or is it just a bias?

Understand Others: Try to think about how others feel. It helps you see things in new ways.

Keep Learning: Always be curious and open to new information.


In today's world, everything changes fast, and there's so much information everywhere. This makes critical thinking super important. It helps us distinguish between what's real and what's made up. It also helps us make good choices. But thinking this way can be tough sometimes because of biases. These are like sneaky thoughts that can trick us. The good news is we can learn to see them and think better.

There are cool tools and ways we've talked about, like the "Socratic Questioning" method and the "Six Thinking Hats." These tools help us get better at thinking. These thinking skills can also help us in school, work, and everyday life.

We’ve also looked at specific scenarios where critical thinking would be helpful, such as deciding what diet to follow and checking facts.

Thinking isn't just a skill—it's a special talent we improve over time. Working on it lets us see things more clearly and understand the world better. So, keep practicing and asking questions! It'll make you a smarter thinker and help you see the world differently.

Critical Thinking Puzzles (Solutions)

The Farmer, Fox, Chicken, and Grain Problem

  • The farmer first takes the chicken across the river and leaves it on the other side.
  • He returns to the original side and takes the fox across the river.
  • After leaving the fox on the other side, he returns the chicken to the starting side.
  • He leaves the chicken on the starting side and takes the grain bag across the river.
  • He leaves the grain with the fox on the other side and returns to get the chicken.
  • The farmer takes the chicken across, and now all three items -- the fox, the chicken, and the grain -- are safely on the other side of the river.
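
The same plan can also be found mechanically. Here is a minimal sketch in Python (ours, for illustration; the names ITEMS, FORBIDDEN, and solve are not from the article) that treats each arrangement of the river banks as a state and runs a breadth-first search for the shortest safe sequence of crossings:

```python
# Breadth-first search over river-crossing states; a sketch for illustration,
# not part of the original article.
from collections import deque

ITEMS = frozenset({"fox", "chicken", "grain"})
FORBIDDEN = [{"fox", "chicken"}, {"chicken", "grain"}]  # pairs that can't be left alone

def safe(bank):
    # A bank is safe (without the farmer) if it contains no forbidden pair.
    return not any(pair <= bank for pair in FORBIDDEN)

def solve():
    start = ("left", ITEMS)  # (farmer's bank, items on the left bank)
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        (farmer, left), path = queue.popleft()
        if farmer == "right" and not left:
            return path  # everything is safely across
        other = "right" if farmer == "left" else "left"
        here = left if farmer == "left" else ITEMS - left
        for cargo in [None, *here]:  # cross alone, or take one item
            moved = {cargo} if cargo else set()
            new_left = left - moved if farmer == "left" else left | moved
            # The bank the farmer just left must stay safe.
            unattended = new_left if farmer == "left" else ITEMS - new_left
            if not safe(unattended):
                continue
            state = (other, frozenset(new_left))
            if state not in seen:
                seen.add(state)
                queue.append((state, path + [(cargo or "nothing", other)]))

for cargo, bank in solve():
    print(f"Farmer crosses to the {bank} bank with {cargo}")
```

Run as-is, it prints the same seven crossings listed above; because breadth-first search tries shorter plans first, seven crossings is the minimum.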

The Rope, Jar, and Pebbles Problem

  • Take one rope and tie the jar of pebbles to its end.
  • Swing the rope with the jar in a pendulum motion.
  • While the rope is swinging, grab the other rope and wait.
  • As the swinging rope comes back within reach due to its pendulum motion, grab it.
  • With both ropes within reach, untie the jar and tie the rope ends together.

The Two Guards Problem

Ask either guard, "Which door would the other guard say leads to freedom?" and then choose the opposite door. The truthful guard will accurately report the liar's false answer, and the lying guard will misreport the truthful guard's true answer, so both replies point at the door to doom.
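
Since there are only four combinations of freedom door and guard type, the strategy can be checked exhaustively. The sketch below (ours, for illustration) enumerates every case and confirms that taking the opposite of the reported door always leads to freedom:

```python
# Brute-force check of the two-guards strategy; a sketch for illustration.
from itertools import product

def report(lies, honest_answer):
    # A guard either repeats the honest answer or names the other door.
    return ("A" if honest_answer == "B" else "B") if lies else honest_answer

for freedom, asked_lies in product(["A", "B"], [True, False]):
    other_lies = not asked_lies  # one guard lies, the other tells the truth
    # What the other guard, asked directly, would name as the freedom door:
    other_would_say = report(other_lies, freedom)
    # The asked guard relays that answer, truthfully or not:
    reply = report(asked_lies, other_would_say)
    choice = "A" if reply == "B" else "B"  # take the opposite door
    assert choice == freedom
    print(f"freedom={freedom}, asked guard lies={asked_lies}: reply={reply}, choose {choice}")
```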

The Hourglass Problem

  • Start both hourglasses. 
  • When the 4-minute hourglass runs out, turn it over.
  • When the 7-minute hourglass runs out, the 4-minute hourglass will have been running for 3 minutes. Turn the 7-minute hourglass over. 
  • When the 4-minute hourglass runs out for the second time (a total of 8 minutes have passed), the 7-minute hourglass will have been running for 1 minute. Turn the 7-minute hourglass over once more; the 1 minute of sand runs back out, and when it empties, exactly 9 minutes have passed.
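
To double-check the arithmetic, here is a minimal sketch (ours, for illustration) that replays the flip schedule minute by minute and prints how much sand remains in the top bulb of each glass; the big glass empties exactly at t=9:

```python
# Replay of the 4/7-minute hourglass schedule; a sketch for illustration.
CAP = {"small": 4, "big": 7}   # capacities in minutes
top = {"small": 4, "big": 7}   # minutes of sand left in each top bulb

def flip(glass):
    # Flipping swaps the bulbs: the sand that drained down goes back on top.
    top[glass] = CAP[glass] - top[glass]

# The schedule from the solution: flip the small glass at t=4,
# the big glass at t=7, and the big glass again at t=8.
schedule = {4: ["small"], 7: ["big"], 8: ["big"]}

for t in range(10):
    for glass in schedule.get(t, []):
        flip(glass)
    print(f"t={t} min: small top={top['small']}, big top={top['big']}")
    for glass in top:  # one more minute of sand drains from each glass
        top[glass] = max(0, top[glass] - 1)
```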


The Lifeboat Dilemma

There isn’t one correct answer to this problem. Here are some elements to consider:

  • Moral Principles: What values guide your decision? Is it the potential greater good for humanity (the scientist)? The value of long-standing love and commitment (the elderly couple)? The future of young children who depend on their mother? Or the selfless bravery of the teenager?
  • Future Implications: Consider the future consequences of each choice. Saving the scientist might benefit millions in the future, but what moral message does it send about the value of individual lives?
  • Emotional vs. Logical Thinking: While it's essential to engage empathy, it's also crucial not to let emotions cloud judgment entirely. For instance, while the teenager's bravery is commendable, does it make him more deserving of a spot on the boat than the others?
  • Acknowledging Uncertainty: The scientist claims to be close to a significant breakthrough, but there's no certainty. How does this uncertainty factor into your decision?
  • Personal Bias: Recognize and challenge any personal biases, such as biases towards age, profession, or familial status.

The Tech Dilemma

Again, there isn’t one correct answer to this problem. Here are some elements to consider:

  • Evaluate the Risk: How severe is the potential vulnerability? Can it be easily exploited, or would it require significant expertise? Even if the circumstances are rare, what would be the consequences if the vulnerability were exploited?
  • Stakeholder Considerations: Different stakeholders will have different priorities. Upper management might prioritize financial projections, the marketing team might be concerned about the product's reputation, and customers might prioritize the security of their data. How do you balance these competing interests?
  • Short-Term vs. Long-Term Implications: While launching on time could meet immediate financial goals, consider the potential long-term damage to the company's reputation if the vulnerability is exploited. Would the short-term gains be worth the potential long-term costs?
  • Ethical Implications : Beyond the financial and reputational aspects, there's an ethical dimension to consider. Is it right to release a product with a known vulnerability, even if the chances of it being exploited are low?
  • Seek External Input: Consulting with cybersecurity experts outside your company might be beneficial. They could provide a more objective risk assessment and potential mitigation strategies.
  • Communication: How will you communicate the decision, whatever it may be, both internally to your team and upper management and externally to your customers and potential users?

The History Mystery

Dr. Amelia should take the following steps:

  • Verify the Letters: Before making any claims, she should check that the letters are authentic and not forgeries. She can do this by examining when and where they were written and whether they match other documents from that time.
  • Get a Second Opinion: It's always good to have someone else look at what you've found. Dr. Amelia could show the letters to other history experts and see their thoughts.
  • Research More: Maybe there are more documents or letters out there that support this new story. Dr. Amelia should keep looking to see if she can find more evidence.
  • Share the Findings: If Dr. Amelia believes the letters are true after all her checks, she should tell others. This can be through books, talks, or articles.
  • Stay Open to Feedback: Some people might agree with Dr. Amelia, and others might not. She should listen to everyone and be ready to learn more or change her mind if new information arises.

Ultimately, Dr. Amelia's job is to find out the truth about history and share it. It's okay if this new truth differs from what people used to believe. History is about learning from the past, no matter the story.




Recognizing Bias: A Problem Solving and Critical Thinking Skills Guide

In today's world, it is becoming increasingly important to recognize bias and how it can affect our decision-making. Bias can cloud our judgement, lead us to make decisions that are not in our best interests, and limit our ability to solve problems effectively. In this guide, we will explore the concept of recognizing bias and how it can be used as a tool for developing critical thinking and problem-solving skills. We will discuss the various types of biases, why recognizing them is important, and how to identify and counteract them.

Bias can lead to unfair judgments or decisions. Common types of bias include cultural bias, the tendency to favor one's own culture or group, and political bias, the tendency to favor one's own political party or beliefs. In order to identify and address bias in oneself and others, it is important to be aware of potential sources of bias, including personal opinions, values, and preconceived notions. Being mindful of these potential sources can help us become more aware of our own biases and recognize them in others.

Additionally, it is important to be open-minded and willing to consider alternative perspectives. It is also helpful to challenge our own assumptions and beliefs by questioning them and seeking out evidence that supports or refutes them.

Implications of Not Recognizing or Addressing Bias

The potential implications of not recognizing or addressing bias are significant. If left unchecked, biases can lead to unfair decisions or judgments, as well as inaccurate conclusions. This can have serious consequences for individuals and organizations alike.

Strategies for Identifying and Addressing Bias

Recognizing bias in oneself and others is an important part of making informed decisions. There are several strategies that can be used to identify and address bias. One of the most effective strategies is to take a step back and look at the situation objectively. This involves examining the facts and assumptions that are being used to make decisions.

It can also involve assessing the potential impact of decisions on multiple stakeholders. By removing personal biases from the equation, it is possible to make more informed decisions. Another important strategy for identifying and addressing bias is to question the sources of information. It is important to consider the credibility of sources, as well as any potential biases that may be present.

Fact-checking sources and considering multiple perspectives can help identify any potential biases in the information being used. In addition, it is important to remain aware of our own biases. We all have preconceived notions about certain topics that can affect our decision-making process. By being mindful of our biases, we can avoid making decisions that are influenced by them. Finally, it is important to be open to other perspectives and willing to engage in meaningful dialogue with others.

Types of Bias

What Is Bias?

Bias can be an unconscious preference that influences decision making and can lead to adverse outcomes. It is important to recognize bias because it can have a negative impact on our ability to make sound decisions and engage in problem solving and critical thinking. Bias can manifest itself in various ways, from subtle mental shortcuts to overt prejudices.

Types of bias include confirmation bias, where we seek out information that confirms our existing beliefs; availability bias, where we base decisions on the information that is most readily available; and representativeness bias, where we assume that two events or objects are related because they share similar characteristics.

Other forms of bias include the halo effect, where a single positive quality or trait can influence the perception of an entire person, and stereotyping, the tendency to make judgments about individuals based on their perceived membership in a certain group. Recognizing bias in ourselves and others helps us make informed decisions and engage in problem solving and critical thinking.

Sources of Bias

Bias can have a profound effect on decisions, leading to outcomes that are not based on facts or evidence. Personal opinions and values can lead to biased decision-making. They can be shaped by past experiences, cultural background, and other personal factors. For example, someone's opinion about a certain topic may be based on what they have previously heard or read. Similarly, preconceived notions can also lead to biased conclusions. Cultural norms can also play a role in creating bias.

For instance, people may be more likely to believe information from a source they trust or respect, even if it is not based on fact. Similarly, people may be more likely to make decisions that conform to the expectations of their culture or society. In addition, people can also be influenced by their own prejudices or stereotypes. This type of bias can lead to unfair treatment of certain individuals or groups of people. Finally, it is important to be aware of the potential for confirmation bias, where people will seek out information that confirms their existing beliefs and disregard any contradictory evidence. By recognizing and understanding these sources of bias, people can make more informed decisions and engage in more effective problem solving and critical thinking.

In conclusion, recognizing and addressing bias is an essential part of problem solving and critical thinking. Bias can come from many sources, including our own beliefs, cultural norms, and past experiences. Knowing the types of bias and strategies for identifying and addressing them can help us make informed decisions and better engage in critical thinking. Taking time to reflect on our own biases is also important for making unbiased decisions.

Ultimately, recognizing and addressing bias will improve our problem-solving and critical thinking skills.


7 Ways to Improve Critical Thinking and Challenge Brain Bias


In a world where we are bombarded with quick bites of information of questionable validity, the need for critical thinking is more important than ever. However, our brains are designed to make quick decisions from limited information through the processes of assimilation and generalization. This enables us to learn and adapt more quickly, but it can also result in brain bias. Cultivating resiliency toward this bias takes practice. Here are seven ways to improve critical thinking and challenge brain bias:

Look for alternative premises: Our brain is wired to interpret information through existing contexts, causing us to focus too heavily on information that supports our current beliefs. Counteracting this requires a deliberate shift in focus. If you believe something to be so, look for evidence that it is not so. Actively seek information that disputes your conclusion.

Challenge cause and effect assumptions:   We tend to assume that there is a causal relationship if one event precedes another. This can be an error in thinking. To challenge this, make it a habit to ask yourself “what else might be the reason for this to happen?”

Step into someone else's shoes: When you find yourself judging another's behavior, try to imagine what it would be like to be in their place. This requires more than just thinking about how you would respond to the situation; it requires considering the thoughts and feelings involved given that person's past experiences as well as the current circumstances. This not only increases your empathy, it also challenges your brain to broaden its perspective.

Expose yourself to opposing views:   Give yourself a chance to interact and dialogue with people who hold different opinions than your own. Try to keep an open mind. Don’t just focus on convincing them of what you believe. Listen closely to their rationales for what they believe. Healthy debate is a good way to grow and learn.

Consider the source:   People tend to give undue weight to anything published or broadcast. We also can be easily swayed by numbers and statistics. With the internet and social media, it is easy to mistake questionable sources as legitimate. Even legitimate news agencies or research can and often do contain bias. Understand the source of where you are getting information and ask yourself how their bias is influencing their argument. Better yet, get in the habit of checking information from more than one source and don’t limit your exposure to your own bias.

Expand your cultural experiences: Make it a habit to read books by authors with different cultural backgrounds. See movies depicting characters from other cultures. Travel to different regions or countries and experience how others live. This opens your mind to ways of thinking and perceiving that differ from your own limited experiences.

Think outside the box:   Don’t get trapped into thinking there are only one or two ways to do things when there might be a wide array of choices. Learn how to creatively brainstorm solutions to problems.

Good critical thinking skills keep the brain resilient.   What other ways have you found to challenge brain bias?




Are You Aware of Your Biases?

  • Carmen Acton


Four ways to overcome your shortcomings and be a more inclusive leader.

Often, it’s easy to “call out” people when we notice their microaggressions or biased behaviors. But it can be far more challenging to recognize and acknowledge our own unconscious biases. That said, becoming aware of your shortcomings can help you hone your leadership style, especially when you’re a new manager.

  • The first step is to acknowledge that you have biases and educate yourself to do better. Ask yourself: Do I hold stereotypes or assumptions about a particular social group? As a manager, do I acknowledge and leverage differences on my team? Use your answers to help you unlearn your unconscious assumptions.
  • When someone calls out your unconscious biases, try not to get defensive. Rather, assume positive intent and use their feedback as an opportunity to learn.
  • Reach out to a diverse group of peers to understand how they perceive you, and seek continuous feedback. These peers can also become “accountability buddies” who help you stay on track when you decide to change your behaviors.
  • Embrace diverse perspectives. If your close circle “looks” just like you, it’s time to build a more diverse network. Join an employee resource group or look to connect with colleagues whose backgrounds are different than your own.


When I became a manager for the first time, I had a clear vision of my leadership style: I wanted to value my team and treat everyone with respect. Once I took charge, I learned that leadership wasn’t as simple as I’d first imagined it.


  • Carmen Acton, MBA, PCC, is a Leadership Impact Coach and Process Consultant in the San Francisco Bay Area, California. Carmen has worked in a succession of corporate leadership roles in a variety of disciplines ranging from Safety Engineering to Employee and Leadership Development. She has worked with clients in the oil and gas, food and beverage, technology, and health care sectors, to name a few. Her passion is helping clients elevate their leadership capabilities by sparking insights and actions that matter. She works with motivated, high-potential leaders to fully embrace humanity while elevating leadership and business performance in a complex world.


What Is Cognitive Bias? 7 Examples & Resources (Incl. Codex)


For example, we might:

  • Trust someone more if they’re an authority figure than if they’re not
  • Assume someone’s gender based on their profession
  • Make poor decisions based on the information that we’re given

The reasons for our poor decision making can be a consequence of heuristics and biases. In general, heuristics and biases describe a set of decision-making strategies and the way that we weigh certain types of information. The existing literature on cognitive biases and heuristics is extensive, but this post is a user-friendly summary.

Central to this post’s topic is how cognitive heuristics and biases influence our decision making. We will also learn more about how to overcome them.


What Are Cognitive Biases?

When considering the term ‘cognitive biases,’ it’s important to note that there is overlap between cognitive biases and heuristics. Sometimes these two terms are used interchangeably, as though they are synonyms; however, their relationship is nuanced.

In his book Thinking, Fast and Slow, Professor Daniel Kahneman (2011, p. 98) defines heuristics as

“a simple procedure that helps find adequate, though often imperfect, answers to difficult questions.”

Tversky and Kahneman (1974, p. 1130) define the relationship between biases and heuristics as follows:

“… cognitive biases that stem from the reliance on judgmental heuristics.”

Gonzalez (2017, p. 251) also described the difference between the two terms:

“Heuristics are the ‘shortcuts’ that humans use to reduce task complexity in judgment and choice, and biases are the resulting gaps between normative behavior and the heuristically determined behavior.”

Lists and Types of Biases: The Codex


Created by John Manoogian III and Buster Benson, this codex is a useful tool for visually representing all of the known biases that exist to date.

The biases are arranged in a circle and can be divided into four quadrants. Each quadrant is dedicated to a specific group of cognitive biases:

  • What should we remember? Biases that affect our memory for people, events, and information
  • Too much information: biases that affect how we perceive certain events and people
  • Not enough meaning: biases that we use when we have too little information and need to fill in the gaps
  • Need to act fast: biases that affect how we make decisions

The Cognitive Bias Codex is a handy visual tool that organizes biases in a meaningful way; however, it is worth pointing out that the codex lists heuristics and biases both as ‘biases.’

If you decide to rely on the Cognitive Bias Codex, then keep in mind the distinction between heuristics and biases mentioned above.


1. Confirmation bias

This bias is based on looking for or overvaluing information that confirms our beliefs or expectations (Edgar & Edgar, 2016; Nickerson, 1998). For example, a police officer who is looking for physical signs of lying might mistakenly classify other behaviors as evidence of lying.

2. Gambler’s fallacy

This false belief describes our tendency to believe that something will happen because it hasn’t happened yet (Ayton & Fischer, 2004; Clotfelter & Cook, 1993).

For example, when betting on a roulette table, if previous outcomes have landed on red, then we might mistakenly assume that the next outcome will be black; however, these events are independent of each other (i.e., the probability of their results do not affect each other).
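
A quick simulation makes the independence point concrete. The sketch below (ours, not from the cited studies; it ignores the green zero for simplicity) compares the chance of red immediately after three reds in a row with the overall chance of red; the two come out the same:

```python
# Simulating independent spins to illustrate the gambler's fallacy;
# a simplified sketch that ignores the green zero.
import random

random.seed(42)
spins = [random.choice(["red", "black"]) for _ in range(1_000_000)]

# Collect the outcome that follows every run of three reds.
after_streak = [
    spins[i + 3]
    for i in range(len(spins) - 3)
    if spins[i] == spins[i + 1] == spins[i + 2] == "red"
]

p_overall = spins.count("red") / len(spins)
p_after = after_streak.count("red") / len(after_streak)
print(f"P(red), all spins:       {p_overall:.3f}")
print(f"P(red) after three reds: {p_after:.3f}")  # also ~0.5: no "correction"
```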

3. Gender bias

Gender bias describes our tendency to assign specific behavior and characteristics to a particular gender without supporting evidence (Garb, 1997).

For example, complaints of pain are taken more seriously when made by male, rather than female, patients (Gawande, 2014); women are perceived as better caregivers than men (Anthony, 2004); specific clinical syndromes are more readily diagnosed in women than in men (Garb, 1997); and students often rate female lecturers lower than male lecturers (MacNell, Driscoll, & Hunt, 2014; Mitchell & Martin, 2018).

4. Group attribution error

This error describes our tendency to overgeneralize how a group of people will behave based on an interaction with only one person from that group (Pettigrew, 1979).

For example, a negative experience with someone from a different group (e.g., a different culture, gender, religion, political party, etc.) might make us say that all members of that group share the same negative characteristics. Group attribution error forms part of the explanation for prejudice in social psychology.


Gender bias in the workplace is a well-documented and researched area of cognitive bias. Women remain underrepresented in top senior positions; for example, in 2010, only 15.2% of top positions in US Fortune 500 companies were held by women (Soares, 2010). Women also tend to earn less than their male counterparts, and women’s salaries differ according to their marital status.

For example, consider these statistics reported by Güngör and Biernat (2009, p. 232):

“[In 2005] … 68.1% of married and 79.8% of single mothers in the U.S. participate in the workforce, but while non-mothers earn 90 cents to a man’s dollar, mothers earn 73 cents, and single mothers earn about 60 cents.”

Another bias with practical consequences is the social desirability bias, which is a concern for anyone who uses self-report data. Companies that run internal surveys investigating topics that may cast an employee in a poor light must be aware of how the social desirability bias will affect the validity of their data.

Knowing that people adjust their answers to appear more socially desirable, investigators (such as researchers and clinicians) can try to reframe their questions to be less direct, use formal tests, or anonymize responses.

Another sphere of our lives where biases can have devastating effects is in personal finance. According to Hershey, Jacobs-Lawson, and Austin (2012), there are at least 40 cognitive biases that negatively affect our ability to make sound financial decisions, thus hindering our ability to plan for retirement properly. Some of these biases include:

  • Halo effect (just because that real estate agent was nice doesn’t mean it’s a good deal)
  • Optimistic overconfidence (“I’ll be fine in the future, so I don’t need to save that much now.”)
  • Confirmation bias (looking for information to confirm or validate unwise financial decisions)

The following example offers revealing insight into how biases affect our decision making.

The Monty Hall problem

the monty hall problem

Assume that there are three doors.

  • Behind one door is a fantastic prize: a car.
  • Behind the other two doors are mediocre prizes: $1,000.

You initially choose Door 1. Before revealing what’s behind your chosen door, the presenter opens a different door, Door 2, to reveal a mediocre prize. The presenter then gives you the option to either keep your initial choice or change it, knowing what’s behind Door 2. What should you do now? Should you stay with Door 1, or should you switch to Door 3?

The correct answer is that you have the best chances of winning the car if you change your choice. This is called the Monty Hall problem. Here’s why you should switch:

  • Your initial choice is right only 1 time in 3; 2 times in 3, the car is behind one of the doors you didn’t pick.
  • The host, who knows where the car is, will always open a door hiding a mediocre prize, so the 2-in-3 chance that you initially missed the car now rests entirely on the one remaining unopened door.
  • Switching therefore improves your odds of winning the car from 1 in 3 to 2 in 3.

Despite the statistics being in favor of switching, most people are hesitant to abandon their first choice and don’t accept the offer to change it.
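If the 2-in-3 figure still feels counterintuitive, a quick simulation can make it concrete. The Python sketch below (ours, purely illustrative) assumes the standard rules stated above: the host knows where the car is and always opens a mediocre-prize door you didn’t pick:

```python
import random

def play(switch: bool) -> bool:
    """One round of the Monty Hall game; returns True if the player wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host knowingly opens a door that is neither the player's pick nor the car.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

n = 100_000
print(f"Win rate if you stay:   {sum(play(False) for _ in range(n)) / n:.3f}")  # ~0.333
print(f"Win rate if you switch: {sum(play(True) for _ in range(n)) / n:.3f}")   # ~0.667
```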

Other cognitive biases

The Monty Hall problem is an excellent example of how our intuitions and heuristics lead us to make poor decisions. However, there are lots of other cognitive biases and heuristics that also affect our decision making.

Kahneman, Slovic, and Tversky (1982) list 13 biases that arise from three heuristics: representativeness, availability, and anchoring and adjustment.

  • Representativeness: One cognitive bias that may result from this heuristic is ignoring the base rate of events when making decisions. For example, I am afraid of flying; however, it’s more likely that I might be in a car crash than in a plane crash. Despite this, I still hate flying but am indifferent to hopping into my car. (A worked base-rate example follows this list.)
  • Availability: For example, when a violent crime occurs in a neighborhood, neighbors will give a bigger estimate of the frequency of such crimes than the reported statistics warrant. The reason for their overestimate is that the memory of the violent crime is easy to retrieve, which makes it seem like violent crime happens more frequently than it actually does.
  • Anchoring and adjustment: For example, assume that I offer to sell you a car and I ask for $250. You counter with $200. You might think that this is a good deal because you bought the car for less than the asking price; however, your counteroffer was heavily influenced by my asking price, and you’re not likely to deviate too much from it.
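To make base rate neglect concrete, here is a small worked example in Python with purely hypothetical numbers (a rare condition and an imperfect screening test). The point is that when the base rate is very low, even a seemingly accurate test produces mostly false positives:

```python
# Hypothetical numbers, purely for illustration: a screening test for a rare
# condition with a 1% base rate, 99% sensitivity, and a 5% false-positive rate.
base_rate   = 0.01   # P(condition)
sensitivity = 0.99   # P(positive test | condition)
false_pos   = 0.05   # P(positive test | no condition)

# Bayes' rule: P(condition | positive test)
p_positive = sensitivity * base_rate + false_pos * (1 - base_rate)
posterior  = sensitivity * base_rate / p_positive
print(f"P(condition | positive test) = {posterior:.2f}")  # ~0.17, not 0.99
```

Ignoring the 1% base rate is exactly the representativeness trap: the test ‘looks’ reliable, yet most of its positives are false.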

To further illustrate the effect of cognitive bias, below are two popular experiments.

1. Anchoring and adjustment

Tversky and Kahneman (1974) found that our estimates are heavily influenced by the first number given to us. For example, participants were asked to estimate the percentage of African countries in the United Nations.

Before giving their answer, each participant had to spin a ‘Wheel of Fortune,’ which would determine their initial starting percentage. The result of the spin was random and meaningless. Despite this, participants’ estimates of the percentage of African UN member countries were strongly pulled toward whatever random number the wheel landed on.

2. The attractiveness halo effect

Male students were asked to rate essays written by female authors (Landy & Sigall, 1974). The quality of the essays varied: some were poorly written, and others were well written.

Additionally, some of the essays were accompanied by a photograph of the author (who was either attractive or unattractive), and others were not. The male students rated the quality of the essay and the talent of the author higher when the essay was accompanied by a photo of an attractive author, an effect that was most evident when the essay was of poor quality.

In this study, the male students demonstrated the halo effect, applying the perceived attractiveness of the female author to the quality of the paper.

4 Ways to Overcome Your Biases

1. Reflect on past decisions

If you’ve been in a similar situation before, you can reflect on the outcomes of those previous decisions to learn how to overcome your biases.

An example of this is budgeting. We tend to underestimate how much money we need to budget for certain areas of our life. However, you can learn how much to budget by tracking your expenditure over the last few months. Using this information from the past, you can better predict how much money you’ll need for different financial categories in the future.
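As a trivial illustration, the sketch below (with hypothetical figures of our own) averages a few months of tracked spending per category to suggest next month’s budget:

```python
# Minimal sketch with hypothetical figures: average the last few months of
# tracked spending in each category to set next month's budget.
past_spending = {
    "groceries": [412.50, 388.20, 455.10, 430.75],  # last four months, in dollars
    "transport": [95.00, 120.40, 88.60, 101.30],
}

for category, months in past_spending.items():
    budget = sum(months) / len(months)
    print(f"{category:>10}: suggested budget ${budget:.2f}")
```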

2. Include external viewpoints

There is some evidence that we make better decisions and negotiations when we consult with other people who are objective, such as mediators and facilitators (Caputo, 2016).

Therefore, before making a decision, talk to other people to consider different viewpoints and have your own views challenged. Importantly, other people may spot cognitive biases that you cannot see in yourself.

3. Challenge your viewpoints

When making a decision, try to see the weaknesses in your thinking regardless of how small, unlikely, or inconsequential these weaknesses might seem. You can be more confident in your decision if it withstands serious, critical scrutiny.

4. Do not make decisions under pressure

A final way to protect yourself from relying on your cognitive biases is to avoid making any decisions under time pressure. Although it might not feel like it, there are very few instances when you need to make a decision immediately. Here are some tips for making a decision that can have substantial consequences:

  • Take the necessary time to deliberate.
  • List the pros and cons.
  • Talk to friends or family members for advice (but remember that they may have their own biases).
  • Try to poke holes in your reasoning.

In the last decade, research has looked at cognitive bias modification (CBM) since cognitive biases are associated with the severity of anxiety and depression. The relationship between cognitive biases and anxiety and depression is assumed to be causal; that is, cognitive biases cause an increase in the severity of symptoms.

CBM exercises are designed with this causal relationship in mind. If the cognitive bias is removed or reduced, then the severity of the symptoms should also lessen.

There are two categories of CBM exercises:

  • Changing attentional bias: In this type of exercise, participants are trained to pay more attention to positive stimuli instead of negative stimuli (a minimal sketch of this type follows the list).
  • Changing interpretation bias: Participants are primed with positive information before completing an emotionally ambiguous task.
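To show the structure of the first type, here is a console-only Python sketch loosely in the spirit of dot-probe-style attention training; the words, trial structure, and contingency shown are our illustrative assumptions, not a clinical protocol:

```python
import random

# Illustrative word pools; real interventions use validated stimuli.
POSITIVE = ["joy", "calm", "friend"]
NEGATIVE = ["fail", "threat", "alone"]

def make_trial():
    """One trial: a positive and a negative word, shown on random sides."""
    pos, neg = random.choice(POSITIVE), random.choice(NEGATIVE)
    left_is_positive = random.random() < 0.5
    left, right = (pos, neg) if left_is_positive else (neg, pos)
    # Training contingency: the probe always appears where the positive word was,
    # so attending to positive stimuli is consistently rewarded with faster responses.
    probe_side = "left" if left_is_positive else "right"
    return left, right, probe_side

for left, right, probe in (make_trial() for _ in range(3)):
    print(f"{left:>8} | {right:<8} -> probe appears on the {probe}")
```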

At least six meta-analyses report conflicting findings (Beard, Sawyer, & Hofmann, 2012; Cristea, Kok, & Cuijpers, 2015; Hakamata et al., 2010; Hallion & Ruscio, 2011; Heeren, Mogoașe, Philippot, & McNally, 2015; Mogoaşe, David, & Koster, 2014).

There are many reasons for these differences; for example, the types of studies included, the moderators included, the definition of the interventions, the outcome variable used, the clinical condition studied, and so forth. Therefore, the jury is still out on whether CBM affects symptom severity reliably.

There are many cognitive bias modification apps available for download. Before purchasing an app, check whether its creator followed sound research principles, or did any research at all, when developing it (Zhang, Ying, Song, Fung, & Smith, 2018).

Most of the bias modification apps aim to change the attentional bias. For example, the following apps aim to train users to respond quicker to happy faces than to sad or angry faces. All hypothesize that repeated use will result in more positive moods.

  • Bias Modification
  • Upbeat Mind: Positivity Trainer

The Cognitive Bias Cheatsheet is a useful way to remind oneself of the different cognitive biases that exist.

Here is a list of books relevant for anyone interested in cognitive biases.

Firstly, any list about biases would be remiss without Thinking, Fast and Slow by Daniel Kahneman (2011). In this book, Kahneman unpacks some of the most common biases that we experience when making decisions. (Available on Amazon)

In the same vein is The Drunkard’s Walk: How Randomness Rules Our Lives by Leonard Mlodinow (2009). This book addresses how humans misjudge the effect that randomness has on our decision making. (Available on Amazon)

Predictably Irrational by Dan Ariely (2008) is an excellent and very accessible book about how our behavior is often governed by seemingly random and illogical thought processes. The opening chapter is jaw dropping. (Available on Amazon)

Nassim Nicholas Taleb published a series of books – five, in fact – and we include two of them on this list: Fooled by Randomness (2005) and The Black Swan (2007). The entire series discusses various aspects of uncertainty. (Available on Amazon)

We’ve put together a list of our favorite TED talks on cognitive biases.

If you want to learn more about cognitive biases, then these talks are a great jumping-off point:

Are We in Control of Our Own Decisions? – Dan Ariely

Confirmation Bias – Nassor Al Hilal

Confirmation Bias in 5 Minutes – Julia Galef

If you want to learn how to overcome your biases, then we can recommend the following:

How to Outsmart Your Own Unconscious Bias – Valerie Alexander

How to Design Gender Bias Out of Your Workplace – Sara Sanford

Unpacking the Biases That Shape Our Beliefs – Mike Hartmann

We have useful resources that you can use when tackling cognitive biases.

First, increasing awareness of Unhelpful Thinking Styles can change the way you think about yourself and your environment. Ultimately, users will increase their awareness of their cognitive biases, and through this awareness, be able to change their behavior.

Our Neutralizing Judgmental Thoughts worksheet is also useful for combating negative thoughts and biases. This exercise helps users apply the CLEAR acronym to adopt a less critical outlook when dealing with others.

The Core Beliefs Worksheet  is a useful tool for reflecting on the origin and validity of our core beliefs. This technique might help us ‘step away’ from our biases.

An approach that is always beneficial is to understand and find ways to apply positive psychology to your everyday life, and this selection of positive psychology TED Talks is a good starting point.

If you’re looking for more science-based ways to help others through CBT, this collection contains 17 validated positive CBT tools for practitioners. Use them to help others overcome unhelpful thoughts and feelings and develop more positive behaviors.


We often rely on cognitive heuristics and biases when making decisions.

Heuristics can be useful in certain circumstances; however, heuristics and biases can result in poor decision making and reinforce unhealthy behavior.

There are many different types of cognitive biases, and all of us fall victim to one or more of them.

However, being aware of our biases and how they affect our behavior is the first step toward resisting them.

We hope you enjoyed reading this article. For more information, don’t forget to download our three Positive CBT Exercises for free .

  • Anthony, A. S. (2004). Gender bias and discrimination in nursing education: Can we change it? Nurse Educator, 29 (3), 121–125.
  • Ariely, D. (2008). Predictably irrational. Harper Perennial.
  • Ayton, P., & Fischer, I. (2004). The hot hand fallacy and the gambler’s fallacy: Two faces of subjective randomness? Memory & Cognition, 32 (8), 1369–1378.
  • Beard, C., Sawyer, A. T., & Hofmann, S. G. (2012). Efficacy of attention bias modification using threat and appetitive stimuli: A meta-analytic review. Behavior Therapy, 43 (4), 724–740.
  • Caputo, A. (2016). Overcoming judgmental biases in negotiations: A scenario-based survey analysis on third party direct intervention. Journal of Business Research, 69 (10), 4304–4312.
  • Clotfelter, C. T., & Cook, P. J. (1993). The “gambler’s fallacy” in lottery play. Management Science, 39(12), 1521–1525.
  • Cristea, I. A., Kok, R. N., & Cuijpers, P. (2015). Efficacy of cognitive bias modification interventions in anxiety and depression: Meta-analysis. The British Journal of Psychiatry, 206 (1), 7–16.
  • Edgar, G., & Edgar, H. (2016). Perception and attention: Errors and accidents. In D. Groome and M. W. Eysenck (Eds.), An introduction to applied cognitive psychology (2nd ed., pp. 9–38). Routledge.
  • Garb, H. N. (1997). Race bias, social class bias, and gender bias in clinical judgment. Clinical Psychology: Science and Practice, 4 (2), 99–120.
  • Gawande, A. (2014). Being mortal: Medicine and what matters in the end. Metropolitan Books.
  • Gonzalez, C. (2017). Decision-making: A cognitive science perspective. In S. Chipman (Ed.), The Oxford handbook of cognitive science (pp. 249–264). Oxford University Press. Accessed on July 9, 2020 from https://www.cmu.edu/dietrich/sds/ddmlab/papers/oxfordhb-9780199842193-e-6.pdf
  • Güngör, G., & Biernat, M. (2009). Gender bias or motherhood disadvantage? Judgments of blue-collar mothers and fathers in the workplace. Sex Roles, 60 (3–4), 232–246.
  • Hakamata, Y., Lissek, S., Bar-Haim, Y., Britton, J. C., Fox, N. A., Leibenluft, E., … & Pine, D. S. (2010). Attention bias modification treatment: A meta-analysis toward the establishment of novel treatment for anxiety. Biological Psychiatry, 68 (11), 982–990.
  • Hallion, L. S., & Ruscio, A. M. (2011). A meta-analysis of the effect of cognitive bias modification on anxiety and depression. Psychological Bulletin, 137 (6), 940.
  • Heeren, A., Mogoașe, C., Philippot, P., & McNally, R. J. (2015). Attention bias modification for social anxiety: A systematic review and meta-analysis. Clinical Psychology Review, 40 , 76–90.
  • Hershey, D. A., Jacobs-Lawson, J. M., & Austin, J. T. (2012). Effective financial planning for retirement. In M. Wang (Ed.), Oxford handbook of retirement (pp. 402–430). Oxford University Press.
  • Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus, and Giroux.
  • Kahneman, D., Slovic, P., & Tversky, A. (Eds.). (1982). Judgment under uncertainty: Heuristics and biases. Cambridge University Press.
  • Landy, D., & Sigall, H. (1974). Beauty is talent: Task evaluation as a function of the performer’s physical attractiveness. Journal of Personality and Social Psychology, 29 (3), 299.
  • MacNell, L., Driscoll, A., & Hunt, A. N. (2014). What’s in a name: Exposing gender bias in student ratings of teaching. Innovative Higher Education, 40 (4), 291–303.
  • Mitchell, K. M., & Martin, J. (2018). Gender bias in student evaluations. PS: Political Science & Politics, 51 (3), 648–652.
  • Mlodinow, L. (2009). The drunkard’s walk: How randomness rules our lives. Vintage.
  • Mogoaşe, C., David, D., & Koster, E. H. (2014). Clinical efficacy of attentional bias modification procedures: An updated meta‐analysis. Journal of Clinical Psychology, 70 (12), 1133–1157.
  • Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2 (2), 175–220.
  • Pettigrew, T. F. (1979). The ultimate attribution error: Extending Allport’s cognitive analysis of prejudice. Personality and Social Psychology Bulletin, 5 (4), 461–476.
  • Soares, R. (2010). 2010 Catalyst census: Fortune 500 women board directors . Catalyst.
  • Taleb, N. N. (2005). Fooled by randomness: The hidden role of chance in life and in the markets (vol. 1). Random House.
  • Taleb, N. N. (2007). The black swan: The impact of the highly improbable (vol. 2). Random House.
  • Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185 , 1124–1131.
  • Zhang, M., Ying, J., Song, G., Fung, D. S., & Smith, H. (2018). Attention and cognitive bias modification apps: Review of the literature and of commercially available apps. JMIR mHealth and uHealth , 6 (5).




When Is It Right to Be Right? How to Avoid 4 Cognitive Biases in Business



You like being right. It’s OK—everyone does. Being right has psychological benefits, as well as practical ones: it makes us feel like we understand the world, like we know what we’re doing. It makes us feel grounded in reality, successful, even talented.

But as Neil deGrasse Tyson warns, “The urge to feel special knows no bounds.” No one is completely free of that urge, but if we allow ourselves to sink into it, we invite narcissism, egocentrism, and cognitive bias.

Wanting to be right, in other words, can make us more likely to be wrong.

In business, cognitive bias can lead companies down destructive and costly paths. Leaders need weapons to fight back against bias. The most powerful weapon in that fight? Critical thinking.


What is cognitive bias?

Cognitive bias happens when the human brain tries to interpret reality without using logic. It’s a tricky animal because 1) it takes many forms, and 2) it operates largely in the background, creeping in through your subconscious. Even if you’re just making a random guess, cognitive bias can make you believe you’re showcasing hard-won expertise.

The result of cognitive bias is your very own subjective version of reality.

The Good Side of Cognitive Bias

Mental biases aren’t always a bad thing. As humans, we all have a core need to feel that we are right, good, and worthy. Cognitive bias actually serves that need in some healthy ways.

Bias plays a role in how we build self-esteem, a moral compass, and a sense of identity. Some forms of cognitive bias, like apophenia (more on this below), spur creativity and imagination. Others, like person-positivity bias, help us see the best in people (because we want to project our own best qualities).

But too often, cognitive biases do us more harm than good.

The Bad Side of Cognitive Bias

Negative outcomes of cognitive bias can include discrimination, bigotry, and aggressive ignorance. Perception bias is a prime example, as it leads us to equate our assumptions with fact. That’s obviously a pretty slippery slope.

“There are all manner of cognitive biases. There are some that are particularly insidious if you’re trying to understand what is objectively true.” – Neil deGrasse Tyson

Here are four of the most common forms of cognitive bias that businesspeople, in particular, need to watch out for—and some critical thinking steps to thwart them:

  • Apophenia
  • Confirmation bias
  • Survivorship bias
  • Sampling bias


Apophenia

See if you can solve this riddle: A shepherd living in Northern Ireland has two sheepdogs. It’s been raining for three days. How many sheep does he have?

Did you solve it? Of course you didn’t. This “riddle” is unsolvable. It’s less a puzzle than it is a jumble of random information. Still, our brains assume there must be hidden connections: Why three days? Why Northern Ireland?

The tendency to look for patterns that aren’t there is a kind of bias. It’s called apophenia, and you do it all the time.

If you’ve ever looked at a cloud and thought, “Hey, that looks like a fluffy bunny!” or heard a story of someone seeing the face of Jesus in their toast, that’s apophenia. In the context of riddles, clouds, and toast, of course, it’s pretty harmless.

But apophenia is also where conspiracy theories come from.

In business, we need to take counteractive measures against tendencies like apophenia. Imagine setting up a whole new marketing campaign based on what turns out to be random data. That could cost your company a lot of money—and cost you your job.

3 Critical Thinking Steps to Avoid Apophenia

  • Don’t get carried away by false positives. If you’re basing a decision on data, make sure you gather that data systematically. Check benchmarks. Test and retest your findings before acting on them. If something seems too good to be true, it very well might be.
  • Praise teams with inconsistent results. Supporting a team only when it does well may result in phony smiles, a hesitation to share bad news, and a false impression that praise leads to better work performance. Help your team assess their ups and downs.
  • Beware big data hubris. It’s hard to wrap your head around how big big data actually is. Your brain will naturally (even desperately) start searching for connections anywhere and everywhere. Make use of trained, data-savvy managers to ensure your big data isn’t bad for business.

Confirmation Bias

Similar to perception bias, confirmation bias is the tendency to look for information that supports something you already believe, even if it means ignoring data that disproves that belief.

It’s easier than ever to fall for confirmation bias in the digital age.

Google will happily help you find evidence that the earth is round. Or flat. Or all kinds of other shapes . And social media algorithms don’t help—most people know Facebook curates content based on user preferences, essentially feeding the confirmation bias beast.

Every businessperson needs a learning mindset. It’s human to learn, be wrong, and relearn. It’s superhuman to be proven right at every turn. An important part of critical thinking is rethinking. Even scientists resist taking facts at face value, and they’re the ones coming up with the facts!

“Much like Santa Claus and unicorns, facts don’t actually exist. At least not in the way we commonly think of them.” – Julia Shaw, author of The Memory Illusion

3 Critical Thinking Steps to Combat Confirmation Bias

  • Accept that your opinions are not facts. If you find yourself talking to someone—anyone—who believes you’re mistaken, listen to what they have to say. Then check some credible sources. Be ready to be wrong.
  • Entertain the possibility that you’re dumb. Smart people are more susceptible to confirmation bias because they’re used to being right. If you feel like you’re the smartest person in the room, try to assume the opposite. Listen to the diverse opinions around you and ask questions to explore new perspectives.
  • Understand that your past success has nothing to do with tomorrow. In a world of constant change, old lessons lose their reliability by the second. And yet, it’s the oldest lessons we tend to rely on. Remember that experience is one thing—patterns are another.


Survivorship Bias

Imagine you’re feeling stuck in your career, so you head to the bookstore and browse the business section for some advice. A few names catch your eye: Elon Musk, Mark Zuckerberg, Sheryl Sandberg. You hardly even notice the hundreds of other biographies. Why? Because these are the men and women who ran the gauntlet and succeeded, like you want to. They can surely teach you something the others can’t.

You’ve just had your first lesson in survivorship bias.

Survivorship bias is the tendency to draw conclusions based on the people or things that “survived” some kind of trial, like weathering the tough world of business. But let’s face it: Musk, Zuckerberg, and Sandberg are one in a million. You’d have to beat astronomical odds to replicate their success, even if you studied every move they ever made. And you’ll never know how many would-be titans did use those moves but failed anyway.
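A small simulation makes the distortion easy to see. In this illustrative Python sketch (all numbers are our hypothetical choices), the true average return across all ventures is roughly zero, but an estimate based only on the big winners looks spectacular:

```python
import random

# Many ventures are attempted; we mostly read about the ones that succeeded.
random.seed(42)
ventures = [random.gauss(0, 1) for _ in range(100_000)]  # true mean return ~0
survivors = [r for r in ventures if r > 2.0]             # only big winners get books

print(f"Mean return, all ventures:     {sum(ventures) / len(ventures):+.2f}")
print(f"Mean return, 'survivors' only: {sum(survivors) / len(survivors):+.2f}")
# The survivor-only estimate is wildly optimistic about your own odds.
```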

There are loads of lesser-known men and women who can give you advice better scaled to your needs, but you won’t find them if you’ve got your survivorship bias goggles on.

If your team or company chooses to do something just because a big name did it, you may be headed for trouble.

3 Critical Thinking Steps to Survive Survivorship Bias

  • Use benchmarks wisely. If your startup is a budding fashion boutique, don’t try to copy Chanel. By all means, study their early success for inspiration. But don’t forget Chanel was founded in 1910—a lot has changed since then!
  • Look beyond results to holistic circumstances. Bill Gates, one of the richest men in the world, never graduated from university. Does that mean education is useless? Gates had two years at Harvard before dropping out, and he likely learned something in that time. Consider everything that went into a “survivor’s” success, not just the flashy headlines.
  • Challenge your data. Survivorship bias thrives on shaky sources and rushed data analysis. Make sure you have real tools in your hands when you make decisions, not random Wikipedia pages from a quick Google search.

Sampling Bias

Imagine there’s an outbreak of a virus that makes dogs insatiably playful. Desperate for a cure (and some sleep), a team of scientists collects blood samples from every purebred dog in every dog park in New York City. Within a week, they have a hypothesis about which dogs are and aren’t carriers of the virus.

But that hypothesis is inevitably flawed due to sampling bias.

Sampling bias is the tendency to base conclusions on the information in plain sight. It also involves placing more weight on some members of a population than others. In the case of the playful pup virus, dogs that don’t go to dog parks had no representation. Neither did mixed breeds.

In business, sampling bias can hopelessly skew your data and give you a false sense of security. Defective products, for example, may only be reported by customers who can afford to mail them back.
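Returning to the playful pup example, here is an illustrative Python sketch (all rates are our hypothetical choices) showing how a convenience sample of park-going purebreds misestimates the population carrier rate, while a random sample of the same size does not:

```python
import random

random.seed(0)

def dog():
    """One simulated dog: (goes_to_park, purebred, is_carrier)."""
    goes_to_park = random.random() < 0.3
    purebred = random.random() < 0.4
    # Hypothetical: carrier rates differ between the sampled and unsampled groups.
    carrier_rate = 0.10 if (goes_to_park and purebred) else 0.40
    return goes_to_park, purebred, random.random() < carrier_rate

population = [dog() for _ in range(100_000)]
true_rate = sum(c for _, _, c in population) / len(population)

convenience = [c for park, pure, c in population if park and pure]
biased_rate = sum(convenience) / len(convenience)

random_sample = random.sample(population, len(convenience))
sampled_rate = sum(c for _, _, c in random_sample) / len(random_sample)

print(f"True carrier rate:            {true_rate:.2f}")
print(f"Park purebreds only (biased): {biased_rate:.2f}")
print(f"Random sample, same size:     {sampled_rate:.2f}")
```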

3 Critical Thinking Steps to Sidestep Sampling Bias

  • Start with a plan—and don’t forget it. Sampling bias happens during the action stage, so before you act, clarify your objectives, needs, and strategy. Then use those to measure each action before you take it.
  • Remember that words have power. Surveys are a magnet for sampling bias. Careful wording and psychology should guide your way. Even too many options (a 1–10 scale instead of a 1–5 scale, for example) may have respondents listing toward a happy middle because they’d rather not commit to “strongly agree” or “strongly disagree.”
  • Think random. Random sampling is a popular method for avoiding sampling bias because it equalizes the odds. If you’re forced to go with the pool of respondents most readily available—a method known as convenience sampling—be aware that your results will almost certainly be tainted by sampling bias.

Betting against Cognitive Bias

Critical thinking skills are the best way to safeguard yourself against cognitive bias in all its forms (not just these four) and stop relying on mental shortcuts. They can even help you harness the power of bias for good: Apophenia can nourish creativity, for example, and confirmation bias can boost self-esteem.

You’re not a superhuman with all the answers—and that’s OK. It’s also OK to enjoy the buzz of being right. But eventually, you’ll need to shake it off and apply a little logic as a critical thinker. Find out why you were right this time so you can be right next time, too.


Maximize Success Academy

Overcoming Biases For Developing Critical Thinking

  • Post author: Prashant Kumar
  • Post published: September 7, 2020

What Is Critical Thinking?

Critical thinking is the skill of analyzing an issue, situation, fact, or idea in order to make a logical and informed decision to the best of your ability. A critical thinker looks for logic in everything and decides only after considering every aspect of the situation. Critical thinking is about being an active learner rather than a passive recipient of information.

Critical thinkers question situations and ideas rather than accepting them. They identify, analyze, and solve problems logically rather than by intuition or instinct.

Basic steps for critical thinking:

  • Identification: Examine the situation or issue closely. Then figure out who influences it or is affected by it. Once you have a precise picture of the case, you can begin diving deeper into the issue.
  • Research: Research the topic properly before taking any step or decision.
  • Identifying biases: Before making any decision or taking any action, make sure you are not being biased. Only after considering all these points should you step forward and decide.

Importance Of Removing Biases:

When one decides based on some selfish motive, rather than evaluating the pros and cons of everything to make an informed decision, one is biased. In other words, being biased means making a partial, one-sided decision. For example, John is a judge of a talent show. His sister’s friend, Kristy, is participating in the talent show. John votes for Kristy as the winner because he knows her, not because she best met the guidelines and criteria. He has therefore made a biased decision.

Biases affect the judgments and decisions you make.  It’s important to ensure all facts and opinions involved are considered to avoid any rifts or feelings of inferiority amongst the involved parties.  

Types of biases

Following are some different kinds of biases that we experience in our day-to-day lives:

  • Self-serving bias: Have you ever failed a paper and blamed your teacher for not teaching it the right way, or scored good marks and credited your own studying? This is the self-serving bias: you take credit for doing well and blame others for your bad results.
  • Curse of knowledge (or curse of expertise): This bias occurs when you assume that the person you are sharing an idea with will understand you because he or she has the same background as you. This can result in a misunderstanding between two individuals.
  • Optimism/pessimism bias: Optimism bias is the belief that one’s chances of experiencing positive events are higher than one’s chances of experiencing negative ones. This can lead to unbalanced decisions.
  • In-group bias: In-group bias refers to favouring members of your own group (your team, nationality, or gender, for example) over others. Supporting your favourite team in a cricket match is an everyday instance of in-group thinking.
  • The backfire effect: As the name suggests, the backfire effect strengthens people’s beliefs even after they encounter challenging evidence against their ideas. For example, when you bring up a negative controversy about a celebrity in the presence of a dedicated fan, that fan may solidify his or her support for that celebrity even further after hearing the information.

Ways to overcome different biases:

You can overcome these biases by taking care of some essential points:

  • Accept that you sometimes act in a biased way when making decisions or treating people.
  • Stay informed about the discrimination and prejudice going on in society.
  • Try to spend time with people who have a better attitude than you, to help you think more positively of others.
  • Make an effort to be friendlier and less threatened when interacting with new people. We all know the saying, “don’t judge a book by its cover,” and we really must make a point of following it. Perhaps your first impression was that a person looked untidy, so you automatically didn’t want to interact with them any further. However, it’s important to get past your first impression and give the person a real chance to show their true personality.
  • Make a list of the situations in which your biases affect your behaviour, then try to handle those situations more thoughtfully and creatively.


