Cognitive bias is a systematic error in thinking, affecting how we process information, perceive others, and make decisions. It can lead to irrational thoughts or judgments and is often based on our perceptions, memories, or individual and societal beliefs.
Instead of performing exhaustive logical analyses for every decision, humans utilize heuristics.
Heuristics are mental shortcuts or “rules of thumb” that simplify complex problem-solving.
While these shortcuts are often efficient and adaptive for survival, they frequently produce predictable errors.
These errors occur because heuristics prioritize speed over accuracy, leading to skewed perceptions of reality.
Dual Process Theory: System 1 and System 2
The dual process model explains cognitive biases through the interaction of two distinct modes of thinking.
Nobel laureate Daniel Kahneman defined these systems based on their speed and effort requirements. System 1 constitutes the primary source of biased judgment due to its reflexive nature.
- System 1 (Intuitive): This system is fast, instinctive, and emotional. It operates automatically with little or no sense of voluntary control.
- System 2 (Deliberate): This system is slow, effortful, and logical. It handles complex computations and provides the necessary oversight to correct System 1 errors.
Conflict arises because System 2 is inherently “lazy.”
It requires significant metabolic energy to function.
Consequently, System 2 often accepts the intuitive suggestions of System 1 without critical evaluation.
This failure of oversight allows biases to influence high-stakes decisions in law, medicine, and finance.

Confirmation Bias: The Preservation of Preexisting Beliefs
Confirmation bias is the tendency to selectively process information that validates one’s current worldview while disregarding contradictory data.
This bias acts as a filter that reinforces “echo chambers,” particularly in digital environments where algorithms prioritize content based on user preference.
This mechanism provides a motivational benefit by protecting self-esteem.
Admitting error creates psychological discomfort, so the brain prioritizes “desired conclusions” to maintain a sense of security and intellectual competence.
From a cognitive perspective, confirmation bias occurs because the mind struggles with parallel processing.
Parallel processing is the ability of the brain to simultaneously evaluate multiple, competing hypotheses.
Because this is cognitively taxing, the brain defaults to a single, consistent narrative.
This can be particularly dangerous in criminal investigations, where a detective may focus only on evidence that implicates a specific suspect while ignoring exonerating facts.
Empirical Validation: The Wason Rule Discovery Test
Wason (1960) provided the foundational evidence for this bias through a numerical reasoning task.
- Aim: To investigate whether people seek to confirm or falsify their hypotheses when testing a rule.
- Procedure: Participants were given the sequence “2-4-6” and told it followed a specific rule. They had to generate their own triples to discover the rule, receiving “yes” or “no” feedback.
- Findings: Most participants assumed the rule was “even numbers increasing by two” and only tested sequences that fit this narrow theory (e.g., 8-10-12). They rarely tested sequences that could falsify their theory (e.g., 2-4-7).
- Conclusions: People have a natural tendency to seek confirming evidence rather than attempting to disprove their own assumptions. The actual rule was simply “any three ascending numbers.”
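The logic of the task can be sketched in a few lines of code. The triples below are illustrative examples of the two testing strategies, assuming the true rule “any three ascending numbers” and the typical narrow guess from the study.

```python
# Sketch of the Wason 2-4-6 task: the hidden rule is simply
# "any three ascending numbers," but a tester who only probes
# triples consistent with a narrow hypothesis ("even numbers
# increasing by two") never discovers this.

def true_rule(triple):
    """The experimenter's actual rule: strictly ascending."""
    a, b, c = triple
    return a < b < c

def narrow_hypothesis(triple):
    """A participant's typical guess: even numbers increasing by two."""
    a, b, c = triple
    return a % 2 == 0 and b - a == 2 and c - b == 2

# Confirmatory strategy: only test triples that fit the guess.
confirming_tests = [(8, 10, 12), (20, 22, 24), (100, 102, 104)]
# Falsifying strategy: test triples the guess says should fail.
falsifying_tests = [(2, 4, 7), (1, 2, 3), (3, 2, 1)]

for t in confirming_tests:
    # Every confirming test earns "yes" feedback under both rules,
    # so the narrow hypothesis is never challenged.
    assert true_rule(t) and narrow_hypothesis(t)

# Only tests outside the guess reveal the mismatch: (2, 4, 7) and
# (1, 2, 3) get "yes" even though the narrow hypothesis rejects them.
informative = [t for t in falsifying_tests
               if true_rule(t) != narrow_hypothesis(t)]
print(informative)  # → [(2, 4, 7), (1, 2, 3)]
```

Only the disconfirming tests carry information about where the hypothesis is wrong, which is exactly what most participants failed to seek.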
Hindsight Bias: The Illusion of Predictability
Hindsight bias, or the “I-knew-it-all-along” effect, is the tendency to perceive past events as having been more predictable than they actually were.
Once an outcome is known, the brain reorganizes its memory of the event to make the result seem inevitable.
This distortion occurs because current knowledge is highly “accessible” in the mind, making it difficult to recall the state of uncertainty that existed before the event occurred.
This bias serves a motivational function by making the world feel orderly and predictable.
When unexpected events occur, they violate our expectations and cause anxiety.
By convincing ourselves we “saw it coming,” we regain a sense of control over our environment.
However, this overconfidence can lead to risky future decisions, as individuals overestimate their ability to forecast complex outcomes in sports, politics, or finance.
Empirical Validation: The Nixon Visit Study
Fischhoff and Beyth (1975) conducted the first direct investigation into this phenomenon using real-world political events.
- Aim: To determine if knowing an outcome changes a person’s memory of their initial predictions.
- Procedure: Before President Nixon’s historic 1972 trips to China and the USSR, participants assigned probabilities to various outcomes. After the trips, they were asked to recall their original predictions.
- Findings: Participants consistently remembered giving higher probabilities to the events that actually happened and lower probabilities to the events that did not.
- Conclusions: Knowledge of the present outcome automatically and unconsciously contaminates our memory of the past.

Self-Serving Bias: Attribution and Ego Defense
The self-serving bias is a social-cognitive distortion where individuals attribute success to internal factors and failure to external factors.
This bias differs from the Fundamental Attribution Error because it specifically focuses on “valence.” Valence refers to the intrinsic goodness or badness of an event.
By taking credit for wins and blaming the environment for losses, individuals maintain a positive self-image and high levels of self-esteem.
In the workplace, this manifests as employees attributing promotions to their talent while blaming a “difficult boss” for a poor performance review.
While this bias protects mental health and prevents depression, it can hinder personal growth.
Overcoming this requires “self-compassion.” Self-compassion is the practice of treating oneself with kindness and objective understanding during failures, reducing the need for defensive externalization.
Anchoring Bias: The Power of First Impressions
Anchoring bias occurs when an individual relies too heavily on the first piece of information offered—the “anchor”—during decision-making.
Once an anchor is set, all subsequent negotiations or estimates are adjusted relative to that initial value.
This is a common tactic in retail and real-world negotiations, where an initial high price makes all lower prices seem like bargains, regardless of the item’s actual market value.
The phenomenon is explained by “selective accessibility.”
This theory suggests that when we see an anchor, our brain automatically searches for information that is consistent with that value.
This makes the anchor more mentally prominent, biasing our final judgment.
Susceptibility to anchoring increases under high “cognitive load,” which is the total amount of mental effort being used in the working memory at one time.
Availability Bias: Recency and Vividness in Judgment
The availability bias (or availability heuristic) involves estimating the frequency of an event based on how easily examples can be recalled.
Information that is “available”—meaning it is recent, vivid, or emotionally charged—exerts a disproportionate influence on our perception of risk.
This explains why people often fear rare, sensationalized events like shark attacks or plane crashes more than statistically common dangers like heart disease or car accidents.
This bias allows the brain to bypass the difficult task of calculating “statistical probabilities.”
Statistical probability is the objective likelihood of an event based on data.
Instead, the brain uses the ease of memory retrieval as a proxy for frequency. If you can think of it easily, your brain assumes it must happen often.
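The contrast between the two estimates can be made concrete with a minimal sketch. The counts and recall figures below are hypothetical placeholders, not real risk statistics.

```python
# Minimal sketch contrasting statistical probability (objective
# frequency computed from data) with an availability-style estimate
# (ease of memory retrieval). All numbers are hypothetical.

def statistical_probability(event_count, total_count):
    """Objective likelihood: observed occurrences / total observations."""
    return event_count / total_count

# Hypothetical data: a rare but vivid event vs. a common, mundane one.
vivid_rare_count = 10          # e.g., a sensationalized danger
mundane_common_count = 10_000  # e.g., an everyday danger
population = 1_000_000

p_vivid = statistical_probability(vivid_rare_count, population)
p_mundane = statistical_probability(mundane_common_count, population)

# An availability-style judgment substitutes "how many examples come
# to mind" for frequency; vivid events are over-represented in recall.
recalled_examples = {"vivid_rare": 5, "mundane_common": 1}  # hypothetical
total_recalled = sum(recalled_examples.values())
availability_estimate = {event: count / total_recalled
                         for event, count in recalled_examples.items()}

print(p_vivid < p_mundane)  # objectively rarer → True
print(availability_estimate["vivid_rare"]
      > availability_estimate["mundane_common"])  # yet judged likelier → True
```

The sketch shows the proxy failing in exactly the direction the bias predicts: retrieval ease and objective frequency point opposite ways.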
Inattentional Blindness: The Limits of Focused Attention
Inattentional blindness occurs when a person fails to perceive an unexpected stimulus that is in plain sight because their attention is focused elsewhere.
This is not a visual deficit but a cognitive one. It results from “attentional capacity” limits.
This concept describes the finite amount of mental energy available for processing sensory information.
When we focus intensely on one task, the brain “filters out” irrelevant information to prevent sensory overload.
Empirical Validation: The Invisible Gorilla Study
Simons and Chabris (1999) famously demonstrated the severity of this selective attention.
- Aim: To test if intense focus on a task causes people to miss highly visible but unexpected stimuli.
- Procedure: Participants watched a video of people passing a basketball and were told to count the passes made by the team wearing white. Midway through, a person in a gorilla suit walked through the scene.
- Findings: Approximately 50% of participants failed to notice the gorilla entirely.
- Conclusions: Conscious perception requires attention; if the mind is fully occupied, even a large, distinct stimulus can remain “invisible” to the observer.
Preventing Cognitive Bias
Recognizing these biases is the first step toward overcoming them, but there are also small strategies we can follow to train our unconscious minds to think in different ways.
From strengthening our memory and minimizing distractions to slowing down our decision-making and improving our reasoning skills, we can work toward mitigating these cognitive biases.
Individuals can also evaluate their own thought processes, a practice known as metacognition (“thinking about thinking”), which provides an opportunity to combat bias (Flavell, 1979).
This multifactorial process involves (Croskerry, 2003):
(a) acknowledging the limitations of memory,
(b) seeking perspective while making decisions,
(c) being able to self-critique,
(d) choosing strategies to prevent cognitive error.
Many of the bias-avoidance strategies described here are also known as cognitive forcing strategies: mental tools used to force unbiased decision-making.
History of Cognitive Bias
The term “cognitive bias” was coined in the 1970s by Israeli psychologists Amos Tversky and Daniel Kahneman, who used it to describe people’s flawed thinking patterns in judgment and decision problems (Tversky & Kahneman, 1974).
Tversky and Kahneman’s research program, the heuristics and biases program, investigated how people make decisions given limited resources (for example, limited time to decide which food to eat or limited information to decide which house to buy).
As a result of these limited resources, people are forced to rely on heuristics or quick mental shortcuts to help make their decisions.
Tversky and Kahneman wanted to understand the biases associated with this judgment and decision-making process.
To do so, the two researchers relied on a research paradigm that presented participants with reasoning problems that had a computed normative answer, derived from probability theory and statistics.
Participants’ responses were then compared with the predetermined solution to reveal the systematic deviations in the mind.
After running several experiments with countless reasoning problems, the researchers were able to identify numerous norm violations that result when our minds rely on these cognitive biases to make decisions and judgments (Wilke & Mata, 2012).
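A concrete example of such a normative benchmark is the conjunction rule from probability theory, which Tversky and Kahneman (1983) used in their conjunction-fallacy studies. The probabilities below are illustrative placeholders, not figures from the original experiments.

```python
# Sketch of a normative benchmark from probability theory: the
# conjunction rule, P(A and B) <= P(A). In Tversky and Kahneman's
# (1983) conjunction-fallacy problems, participants often rated a
# conjunction as MORE probable than one of its parts, violating
# this rule. All probabilities here are illustrative.

p_a = 0.30          # P(A): probability of a single attribute
p_b_given_a = 0.50  # P(B | A): illustrative conditional probability

# P(A and B) = P(A) * P(B | A)
p_a_and_b = p_a * p_b_given_a

# Normative constraint: a conjunction can never exceed its parts.
assert p_a_and_b <= p_a

# A hypothetical intuitive judgment that rates the conjunction above
# the single event deviates from the computed norm; the gap is the
# kind of systematic "norm violation" the paradigm measures.
intuitive_judgment = 0.40  # hypothetical participant response
deviation = intuitive_judgment - p_a_and_b
print(round(deviation, 2))  # → 0.25
```

Comparing each response against a benchmark like this is what lets the paradigm call a deviation “systematic” rather than random noise.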
Key Takeaways
- Cognitive biases are unconscious errors in thinking that arise from problems related to memory, attention, and other mental mistakes.
- These biases result from our brain’s efforts to simplify the incredibly complex world in which we live.
- Confirmation bias, hindsight bias, the mere exposure effect, self-serving bias, the base rate fallacy, anchoring bias, availability bias, the framing effect, inattentional blindness, the ecological fallacy, and the false consensus effect are some of the most common examples of cognitive bias.
- Cognitive biases directly affect our safety, interactions with others, and how we make judgments and decisions in our daily lives.
- Although these biases are unconscious, there are small steps we can take to train our minds to adopt a new pattern of thinking and mitigate the effects of these biases.
References
Allen, M. S., Robson, D. A., Martin, L. J., & Laborde, S. (2020). Systematic review and meta-analysis of self-serving attribution biases in the competitive context of organized sport. Personality and Social Psychology Bulletin, 46 (7), 1027-1043.
Casad, B. (2019). Confirmation bias. Retrieved from https://www.britannica.com/science/confirmation-bias
Cherry, K. (2019). How the availability heuristic affects your decision-making. Retrieved from https://www.verywellmind.com/availability-heuristic-2794824
Cherry, K. (2020). Inattentional blindness can cause you to miss things in front of you. Retrieved from https://www.verywellmind.com/what-is-inattentional-blindness-2795020
Croskerry, P. (2003). The importance of cognitive errors in diagnosis and strategies to minimize them. Academic Medicine, 78(8), 775-780.
Dietrich, D., & Olson, M. (1993). A demonstration of hindsight bias using the Thomas confirmation vote. Psychological Reports, 72 (2), 377-378.
Fischhoff, B. (1975). Hindsight is not equal to foresight: The effect of outcome knowledge on judgment under uncertainty. Journal of Experimental Psychology: Human Perception and Performance, 1 (3), 288.
Fischhoff, B., & Beyth, R. (1975). I knew it would happen: Remembered probabilities of once—future things. Organizational Behavior and Human Performance, 13 (1), 1-16.
Flavell, J. H. (1979). Metacognition and cognitive monitoring: A new area of cognitive-developmental inquiry. American Psychologist, 34(10), 906-911.
Furnham, A. (1982). Explanations for unemployment in Britain. European Journal of Social Psychology, 12(4), 335-352.
Heider, F. (1982). The psychology of interpersonal relations. Psychology Press.
Inman, M. (2016). Hindsight bias. Retrieved from https://www.britannica.com/topic/hindsight-bias
Lang, R. (2019). What is the difference between conscious and unconscious bias? : Faqs. Retrieved from https://engageinlearning.com/faq/compliance/unconscious-bias/what-is-the-difference-between-conscious-and-unconscious-bias/
Luippold, B., Perreault, S., & Wainberg, J. (2015). Auditor’s pitfall: Five ways to overcome confirmation bias. Retrieved from https://www.babson.edu/academics/executive-education/babson-insight/finance-and-accounting/auditors-pitfall-five-ways-to-overcome-confirmation-bias/
Mezulis, A. H., Abramson, L. Y., Hyde, J. S., & Hankin, B. L. (2004). Is there a universal positivity bias in attributions? A meta-analytic review of individual, developmental, and cultural differences in the self-serving attributional bias. Psychological Bulletin, 130 (5), 711.
Miller, D. T., & Ross, M. (1975). Self-serving biases in the attribution of causality: Fact or fiction?. Psychological Bulletin, 82 (2), 213.
Most, S. B., Simons, D. J., Scholl, B. J., Jimenez, R., Clifford, E., & Chabris, C. F. (2001). How not to be seen: The contribution of similarity and selective ignoring to sustained inattentional blindness. Psychological Science, 12 (1), 9-17.
Mussweiler, T., & Strack, F. (1999). Hypothesis-consistent testing and semantic priming in the anchoring paradigm: A selective accessibility model. Journal of Experimental Social Psychology, 35 (2), 136-164.
Neff, K. (2003). Self-compassion: An alternative conceptualization of a healthy attitude toward oneself. Self and Identity, 2 (2), 85-101.
Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2 (2), 175-220.
Norris, S., Salgado, F., Murray, S., Amen, D., & Keator, D. B. (2025). The Role of Negativity Bias in Emotional and Cognitive Dysregulation: A Neuroimaging Study in Anxiety Disorders. Depression and Anxiety, 2025(1), 2739947.
Orzan, G., Zara, I. A., & Purcarea, V. L. (2012). Neuromarketing techniques in pharmaceutical drugs advertising. A discussion and agenda for future research. Journal of Medicine and Life, 5 (4), 428.
Pickel, K. L. (2015). Eyewitness memory. The handbook of attention, 485-502.
Pohl, R. F., & Hell, W. (1996). No reduction in hindsight bias after complete information and repeated testing. Organizational Behavior and Human Decision Processes, 67 (1), 49-58.
Roese, N. J., & Vohs, K. D. (2012). Hindsight bias. Perspectives on Psychological Science, 7 (5), 411-426.
Ross, L. (1977). The intuitive psychologist and his shortcomings: Distortions in the attribution process. In Advances in experimental social psychology (Vol. 10, pp. 173-220). Academic Press.
Simons, D. J., & Chabris, C. F. (1999). Gorillas in our midst: Sustained inattentional blindness for dynamic events. Perception, 28(9), 1059-1074.
Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5 (2), 207-232.
Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185 (4157), 1124-1131.
Tversky, A., & Kahneman, D. (1983). Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment. Psychological Review, 90(4), 293.
Tversky, A., & Kahneman, D. (1992). Advances in prospect theory: Cumulative representation of uncertainty. Journal of Risk and Uncertainty, 5 (4), 297-323.
Walther, J. B., & Bazarova, N. N. (2007). Misattribution in virtual groups: The effects of member distribution on self-serving bias and partner blame. Human Communication Research, 33 (1), 1-26.
Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12(3), 129-140.
Wegener, D. T., Petty, R. E., Detweiler-Bedell, B. T., & Jarvis, W. B. G. (2001). Implications of attitude change theories for numerical anchoring: Anchor plausibility and the limits of anchor effectiveness. Journal of Experimental Social Psychology, 37 (1), 62-69.
Wilke, A., & Mata, R. (2012). Cognitive bias. In Encyclopedia of human behavior (pp. 531-535). Academic Press.
Further Information
Test Yourself for Bias
- Project Implicit (IAT Test) From Harvard University
- Implicit Association Test From the Social Psychology Network
- Test Yourself for Hidden Bias From Teaching Tolerance
Listen
- How the Concept of Implicit Bias Came Into Being, with Dr. Mahzarin Banaji, Harvard University, author of Blindspot: Hidden Biases of Good People (5:28 minutes; includes transcript)
- Understanding Your Racial Biases, with John Dovidio, PhD, Yale University, from the American Psychological Association (11:09 minutes; includes transcript)
- Talking Implicit Bias in Policing, with Jack Glaser, Goldman School of Public Policy, University of California Berkeley (21:59 minutes)
- Implicit Bias: A Factor in Health Communication, with Dr. Winston Wong, Kaiser Permanente (19:58 minutes)
- Bias, Black Lives and Academic Medicine, with Dr. David Ansell on Your Health Radio, August 1, 2015 (21:42 minutes)
Videos
- Uncovering Hidden Biases, a Google talk with Dr. Mahzarin Banaji, Harvard University
- Impact of Implicit Bias on the Justice System (9:14 minutes)
- Students Speak Up: What Bias Means to Them (2:17 minutes)
- Weight Bias in Health Care, from Yale University (16:56 minutes)
- Gender and Racial Bias in Facial Recognition Technology (4:43 minutes)
Journal Articles
- Mitchell, G. (2018). An implicit bias primer. Virginia Journal of Social Policy & the Law, 25, 27-59.
- Nosek, B. A., Greenwald, A. G., & Banaji, M. R. (2007). The Implicit Association Test at age 7: A methodological and conceptual review. Automatic Processes in Social Thinking and Behavior, 4, 265-292.
- Hall, W. J., Chapman, M. V., Lee, K. M., Merino, Y. M., Thomas, T. W., Payne, B. K., … & Coyne-Beasley, T. (2015). Implicit racial/ethnic bias among health care professionals and its influence on health care outcomes: A systematic review. American Journal of Public Health, 105(12), e60-e76.
- Burgess, D., Van Ryn, M., Dovidio, J., & Saha, S. (2007). Reducing racial bias among health care providers: Lessons from social-cognitive psychology. Journal of General Internal Medicine, 22(6), 882-887.
- Boysen, G. A. (2010). Integrating implicit bias into counselor education. Counselor Education & Supervision, 49(4), 210-227.
- Christian, S. (2013). Cognitive biases and errors as cause—and journalistic best practices as effect. Journal of Mass Media Ethics, 28(3), 160-174.
- Whitford, D. K., & Emerson, A. M. (2019). Empathy intervention to reduce implicit bias in pre-service teachers. Psychological Reports, 122(2), 670-688.