You are running hard in a race. From third place, you pass the person in second place, so now you are in first place.
This issue is about bias. Bias is a term we normally associate with ill deeds and try to avoid, but I am going to start this article with another way of looking at bias. Ultimately, all cognition is bias. Anytime we encounter new information, make a decision, or engage in an action, we bring up mental models to give meaning to what we perceive and use the affordances and emotional valences in those models to steer our reaction. Those built-in models are based on a lifetime of experiences. They are the “truth” of us, whether prejudiced, demeaning, leaning one way or another, or not. If X happens, there are only so many ways I can interpret it or act in response, and past experience tells me these one or two choices are probably the best. Bias merely represents our brain’s fundamental way of deciding. (To read the research on this perspective, look at the Harvard Bundle on Bias and Heuristics.)
Since we assume our past experiences with the world represent the most likely future ones, we build models of meaning from those experiences that allow us to understand new ones. To understand, we stereotype. At the basic level, prejudice, stereotyping, and bias are not tendencies we must get rid of, or even can get rid of. They guide us through a complex and difficult world, especially the social one. We just need to fine-tune them.
Unfortunately, we err. We try to be “logical,” “objective,” or “smart” in our decision-making, and yet these descriptors really just reflect the values of the larger group, not some greater truth; in other words, they represent the collective bias of a culture. I might rage at a waiter because something in my own experience has deemed raging[1] the best thing to do, but in others’ eyes, that might be unfair. Which bias should I adhere to? The collective bias or my own? That question is not easy, as I discovered in a recent personal incident, explained in this footnote.[2]
This is not to say that personal bias is equal to the collective. Were it so, we wouldn’t need laws, treaties, or religions. And in the case above, the group knows better than I do that the probable harm caused to me or to the waiter exceeds the probable gain. My biases are usually hammered together in minutes, but the collective biases of any group are deep and centuries old (though their age can be a problem too). To upgrade my own biases, I turn to literature, media, research, others’ expertise, and careful thinking. Feeding on this collective decision-making ability to gain skills is the transition from child to adult, from ignorant to educated, and we spend more brain hours on this endeavor than anything else. Think about it. Watching TV, surfing the net, playing games, reading novels, computing, talking to others are all attempts to feast on collective knowledge and improve our decision-making.
[1] New finding in neuroscience: aggression is deliberate, not a loss of self-control.
[2] I am facing such a dilemma now. A close friend is a bit large, and I think this has been troubling her for most of her life. A few years ago, she was diagnosed with uterine cancer, which, fortunately, surgery took care of. But before that, just after the doctor told her she had cancer, he also told her that this illness was common in overweight women. She told us, in a rather heated way, that she was being “fat shamed.” I replied, in rather strong terms, that I thought she was overreacting: the doctor was just doing his job. But she is still adamant that he was insensitive—to the point of cruelty—in putting that information forward just after delivering the upsetting news of cancer.
I thought her anger and wording showed a terrible bias in her thinking, but now I wonder about my own. What bias made me shut her down so quickly instead of listening? I should have followed the advice my psychologist-friend Gerry Yokota gave on a similar issue: when a person is in pain, the pain is real, no matter how they construct the cause, so focus on the pain. Indeed, biases are like an onion, with so many layers. If you have some thoughts on this, mail me.
In fact, some of the tools we have developed to improve our biases come from explorations of bias itself, such as Daniel Kahneman’s bestseller, Thinking, Fast and Slow. (And unfortunately, from here on I have to revert to the common use of bias as faulty thinking, rather than stick to the neuroscientific view that all thinking is bias.) In Kahneman’s words:
We’re so self-confident in our rationality that we think all our decisions are well-considered. When we choose a job, decide how to spend our time, or buy something, we think we’ve considered all the relevant factors and are making the optimal choice. In reality, our minds are riddled with biases leading to poor decision making. We ignore data that we don’t see, and we weigh evidence inappropriately. (source)
Kahneman outlined two modes of thought, which he called Systems 1 and 2. Here are examples (reduced, since Kahneman tends to be wordy):
- System 1: Fast, automatic, frequent, emotional, stereotypic, unconscious. Examples (in order of complexity) of things system 1 can do:
- determine that an object is at a greater distance than another
- complete the phrase “war and …”
- solve 2 + 2 = ?
- drive a car on an empty road
- think of a good chess move (if you’re a chess master)
- associate the description “quiet and structured person with an eye for details” with a specific job
- System 2: Slow, effortful, infrequent, logical, calculating, conscious. Examples of things system 2 can do:
- prepare yourself for the start of a sprint
- sustain a faster-than-normal walking rate
- determine the appropriateness of a particular behavior in a social setting
- park in a tight parking space
- determine the price/quality ratio of two washing machines
- determine the validity of a complex logical argument
- solve 17 × 24, worked through below (source)
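To feel the effort involved, try that last one yourself. Multiplying 17 × 24 in your head takes deliberate, stepwise work, something like: 17 × 24 = (17 × 20) + (17 × 4) = 340 + 68 = 408. Compare that with the instant, automatic answer System 1 hands you for 2 + 2.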
Some of the best ideas, however, lie in his examination of tendencies we have in relation to fast thinking—anchoring, framing, sunk costs, etc.—and how we can use them to our advantage. That is why you should watch the Main video we provided. It gives some great information in 10 minutes.
(In fact, we hope you always explore the multimedia resources we showcase at the beginning of each issue. So many of the videos or podcasts we suggest are superb, sometimes the best part of the issue.)
Slow thinking might lead to better decisions than fast thinking, but if you have ever been around someone who takes forever to decide the simplest things, then you can see the advantages of fast thinking. The key is to know when you should shift from fast to slow thinking. For example, stereotyping people by race, gender, culture, age, political party, or occupation (in other words, fast thinking) has both disadvantages and advantages. I might automatically assume a taxi driver is less well-read than I am and get in trouble as a result, but it is okay to instantly assume he knows the roads better than I do.
In short, fast thinking is the norm. It makes us active and productive. But once in a while it can lead to poor decisions. The key is knowing when we should slow our thinking down, and coming to that realization is not easy. We can’t afford to examine and evaluate every decision, but maybe becoming aware of common brain heuristics, the mental shortcuts we use to make decisions, can help. And this brings us to the most beautiful brain diagram I have ever seen, the Cognitive Bias Codex, assembled by John Manoogian III. Click on the image to get a better view.
Lovely, isn’t it? But it is a bit daunting too. So, start with the outside ring and work your way inwards. The outer ring begins with these four problems that arise when we get new information:
- What Should We Remember?
- Too Much Information
- Not Enough Meaning
- Need To Act Fast
The next ring shows how we deal with these problems. For example, if there is too much information, we focus on patterns, changes from before, or details that confirm our own beliefs. After contemplating these general tendencies, you can go on to the 188 cognitive biases caused by them, such as confirmation bias.
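If it helps to see that nesting laid out explicitly, here is a toy sketch in Python. It is purely illustrative: only the “Too Much Information” branch is filled in, using the tendencies and the single bias named above, and the other three branches are left empty as placeholders.

```python
# A toy sketch of the Cognitive Bias Codex's three rings:
# problem -> general tendency -> individual biases.
# Only "Too Much Information" is filled in, from the examples in the text;
# the real Codex maps all 188 biases across the four branches.
codex = {
    "What Should We Remember?": {},
    "Too Much Information": {
        "we focus on details that confirm our own beliefs": ["confirmation bias"],
        "we focus on patterns": [],
        "we focus on changes from before": [],
    },
    "Not Enough Meaning": {},
    "Need To Act Fast": {},
}

# Walk the rings from the outside in, as the article suggests.
for problem, tendencies in codex.items():
    print(problem)
    for tendency, biases in tendencies.items():
        print("  " + tendency)
        for bias in biases:
            print("    " + bias)
```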
This amazing Codex has helped me gain a better picture of heuristics, but, to tell the truth, I am not sure how to use it. Maybe Buster Benson, in his Cognitive Bias Cheat Sheet, can help.[3] He suggests looking for cognitive biases when you are analyzing opinions from others that seem flawed, and getting a grasp on the four risks of bias:
- We don’t see everything. Some of the information we filter out is actually useful and important.
- Our search for meaning can conjure illusions. We sometimes imagine details that were filled in by our assumptions, and construct meaning and stories that aren’t really there.
- Quick decisions can be seriously flawed. Some of the quick reactions and decisions we jump to are unfair, self-serving, and counter-productive.
- Our memory reinforces errors. Some of the stuff we remember for later just makes all of the above systems more biased, and more damaging to our thought processes.
That’s better. For example, look at risks 2 and 3 above. I should have kept them in mind before I disparaged the waiter.
So, to summarize: you now know that even though we normally use the word “bias” to mean flawed thinking,[4] bias is really the basis of all thinking. We have also broken this process down into two forms, fast and slow thinking, and gone to the bottom of the cognitive ocean by organizing biases into 188 types and identifying four big risks in fast thinking. You should also know that you are in second place, not first.
[3] Actually, I put the cart before the horse. Manoogian’s Cognitive Bias Codex was inspired by this very article, not the other way around!
[4] It makes me wonder: if so, what do we call something that is free from bias? Objective? Humbug! There is no such thing.

Curtis Kelly (EdD.) is a professor emeritus of Kansai University, a founder of the JALT Mind, Brain, and Education SIG, and producer of the MindBrainEd Think Tanks. He has written over 30 books and given over 500 presentations. His life mission is “to relieve the suffering of the classroom.” Not a bad bias, is it?