The most useful post you'll read this year, and I'm not biased

thinking is hard

Decision-making happens in the prefrontal cortex, at the front of your brain, and any time this area lights up, it's burning a lot of energy. It's kind of like your center for manual override: it handles executive function, which means it requires conscious effort. What is it overriding? The amygdala, your emotional, reactive center that drives you to fight, flee, or curl up into a ball and wish that it would all go away.

The brain loves efficiency, so it doesn't use this executive function any more than it has to. We wouldn't be able to function if we treated every moment as a completely new situation. Instead, we learn, leveraging our experience to move a lot of the "noise" into subconscious (low-effort) thought. The brain relies on a lot of shortcuts to minimize the strain, so we can respond in real time.

It's a great and useful system, except for one thing that most of us don't recognize.

we're not making rational decisions

When you're making a decision, however small, your brain is trying to predict the outcome. It uses all sorts of emotional shortcuts and impressions to try and settle the issue as quickly as possible. 

Although you think you're making a rational choice, it's more likely that your brain is making an approximation of rationality, leveraging your amygdala, the emotional part of your brain. There are plenty of studies out there trying to identify how our brains work when making decisions. One on the "framing effect," for example, shows the emotional part of the brain lighting up and drawing on a bias to make the decision, rather than the executive function analyzing the data and then deciding. This suggests that our emotions and biases play a more significant part, even when we think we're making a "rational" decision.

These shortcuts have their purpose and value, but the downside is that they can also lead to poor choices, ineffective habits, and damaging biases. It's so much easier for your brain to stay on the easy-train of "being right," but ultimately that's just an illusion that carries its own risk. 

we're all biased

Here are twelve common cognitive biases that get in the way of your rational mind.

  1. Confirmation Bias (Only ideas that reinforce my thinking are valid.)
  2. Ingroup Bias ("We" are better than "them," obviously.)
  3. Gambler's Fallacy (The odds of the next flip being heads are greater, because the last three flips were heads; see the sketch after this list.)
  4. Post-Purchase Rationalization (Not what I planned to buy, but what a bargain!)
  5. Neglecting Probability (Flying in planes seems more dangerous than driving in cars.)
  6. Observational Selection Bias (You buy a VW, and now you notice VWs everywhere.)
  7. Status-Quo Bias (If it ain't broke, don't fix it...and change is uncomfortable so it ain't broke.)
  8. Negativity Bias (No news is good news.)
  9. Bandwagon Effect (Brian: "We are all individuals!" Crowd: "Yes! We are all individuals!")
  10. Projection Bias (You know what I mean with this one.)
  11. Current Moment Bias (I'll take the marshmallow now, thanks.)
  12. Anchoring Effect (Relatively speaking, it seems like better value.)
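
To see why the gambler's fallacy is a fallacy, here's a minimal Python sketch (purely illustrative, not from the original post) that simulates a fair coin and looks only at the flips that come right after a streak of three heads:

```python
import random

def flip():
    """Flip a fair coin: True for heads, False for tails."""
    return random.random() < 0.5

# Simulate a long run of independent flips.
flips = [flip() for _ in range(1_000_000)]

# Collect every flip that immediately follows three heads in a row.
after_three_heads = [
    flips[i]
    for i in range(3, len(flips))
    if flips[i - 3] and flips[i - 2] and flips[i - 1]
]

print(f"Overall P(heads):             {sum(flips) / len(flips):.3f}")
print(f"P(heads | three heads before): {sum(after_three_heads) / len(after_three_heads):.3f}")
```

Run it and both numbers come out around 0.500: each flip is independent, so a streak of heads tells you nothing about the next one, no matter what your gut insists.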

challenge your own ideas

While your "gut" may be right, sometimes it's worth giving it a reality check. It helps to start building awareness of the biases slipping into your own decision-making, or your team's. Even noticing them is a great start.

Once you're aware, it's time to challenge the thinking. There's a system called "Six Thinking Hats" that offers an approach to minimizing bias in decision-making. The name sounds like an inauthentic corporate program, but the general idea is a good one. In a nutshell, the approach is to turn dilemmas over in your own mind using different thinking styles: data-driven, intuitive, cautious/defensive, optimistic, creative, and process-oriented.

One more good resource on the sources and impact of bias: