“We usually think of ourselves as sitting in the driver’s seat, with ultimate control over the decisions we make and the direction our life takes; but alas, this perception has more to do with our desires – with how we want to view ourselves – than with reality.”
Dan Ariely
We like to think we act rationally, considering the elements of a situation objectively and arriving at logical conclusions.
But we’re not as smart as we think.
Our brains, powerful and complex as they may be, are not without their limitations. We are prone to many innate biases that affect our thinking, cause us to make bad decisions, and lead us to judge others and ourselves unfairly.
Psychologists call these inherent thinking errors cognitive biases, and they affect us all – the educated and uneducated, the young and old, the expert and the novice.
It turns out we’re also quite good at deluding ourselves about these biases. We’re not very good at noticing what we’re not very good at. As Harvard psychologist Dan Gilbert writes, “people are better at playing these sorts of tricks on themselves than at catching themselves in the act.”
“People realize that humans deceive themselves, of course, but they don’t seem to realize that they too are human.”
Dan Gilbert
Understanding these cognitive biases doesn’t make us immune to them. They still act on us predictably, causing us to think and act irrationally. But knowing how we may be affected can keep us in check. We can consider the influences on our thinking, take others’ perspectives, be slower to judge, and ultimately act with more empathy and compassion.
1. Fundamental Attribution Error
Many times a day, we try to interpret and explain our own behaviour and that of others.
The fundamental attribution error describes our tendency, when we see someone do something, to attribute their behaviour to their personality rather than to their situation.
For example, let’s say you wave hello to an old friend in the street, and they ignore you. You might conclude that they deliberately ignored you, that they didn’t want to see you, or that they don’t like you. But maybe they were simply lost in thought and didn’t see you.
We see the behaviour, jump to a conclusion about the cause, feel wronged, take it personally, and judge that person as mean or inconsiderate. Worse, we could retaliate against an imagined slight and feel justified in being mean in return.
Conversely, when analysing our own behaviour, we tend to overemphasise the role of the situation and downplay the role of our personality:
- I acted this way because I was forced to by the situation. You acted this way because you’re mean.
- I was late because I was stuck in traffic I couldn’t have predicted. You were late because you’re bad at time management.
- I didn’t clean up because I’ve had a long day and I’m really tired. You didn’t clean up because you’re so lazy.
“Treat everyone as if they might be going through something rough that you don’t know about.”
Chris Brogan
As I’ve written about before, most people, most of the time, are good and decent and are simply trying their best to make their way in life. If we consider the fundamental attribution error, we’ll cut other people some slack, and instead of responding with rage, maybe even offer our sympathy.
2. Planning Fallacy
In 1957, Jørn Utzon won an international competition to design a new opera house for Sydney. Construction began in March 1959, and was forecast to finish in January 1963, at a cost of around $7 million.
In 1973, the scaled-down Sydney Opera House finally opened at a cost of $102 million – a decade late and, at roughly 14.6 times the original $7 million estimate, an astonishing 1,357% over budget.
“Hofstadter’s Law: It always takes longer than you expect, even when you take into account Hofstadter’s Law.”
Douglas Hofstadter
The planning fallacy describes our tendency to underestimate the time, cost and risk of doing anything (and to overestimate the benefits). Research suggests that, on average, we underestimate completion times by about 40%.
We’ve all been there – late because of unforeseen traffic, missing a deadline, a project going over budget. When we make plans, we often imagine the best case, assuming things will go exactly as expected, with no unforeseen delays or interference. But reality is rarely the best case, and is often worse than our worst-case predictions. One 1995 study found that even when participants made very conservative predictions – ones they rated as 99% probable – only 45% actually finished within those estimates.
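For the numerically inclined, here is a minimal sketch of what correcting for that shortfall might look like. The padded_estimate helper and the flat 40% multiplier are my own illustration of the idea, not a rule from the research:

```python
# A rough correction for the planning fallacy: inflate a raw estimate
# to offset our ~40% average underestimate. The flat multiplier is an
# assumption for illustration, not a precise figure from the research.

def padded_estimate(raw_estimate_hours: float, shortfall: float = 0.40) -> float:
    """Pad a raw time estimate to counter our tendency to underestimate."""
    return raw_estimate_hours * (1 + shortfall)

# If we think a task will take 10 hours, plan for about 14.
print(padded_estimate(10))  # 14.0
```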
We all have at least one chronically late friend whose behaviour seems frustratingly selfish and inconsiderate. But the planning fallacy can help us understand that rather than being selfish and inconsiderate, our friend may simply be bad at predicting how long a task will take.
There’s a big difference in how we judge others when we think their actions are deliberate rather than accidental, or their attitude disrespectful rather than careless. Understanding the planning fallacy might be the difference between cutting someone some slack and cutting them out of your life.
3. Illusory Superiority and the Dunning-Kruger Effect
93% of Americans say they are better than average drivers. 90% of Stanford students say they are more intelligent than their peers. 94% of university professors rate themselves as above average.
Of course, they can’t all be right.
We tend to overestimate how good we are compared to others across a range of abilities and behaviours – intelligence, fairness, healthy eating, driving, punctuality and, ironically, how free of bias we are. This phenomenon is known as illusory superiority.
Not only do the vast majority of us think we are better than average, but the worst performers overestimate their abilities the most.
In 1999, David Dunning and Justin Kruger conducted a study in which participants completed a series of tests of grammar, logical reasoning and humour. Participants rated how well they had done compared to the others, and the researchers then compared these self-ratings with the actual results. The worst-performing 25% were by far the least accurate at judging themselves against others, rating themselves above average – better, they believed, than about 60% of participants.
This insight became known as the Dunning-Kruger effect – that the least competent performers think they’re much better than they really are. As Dunning writes, “if you’re incompetent, you can’t know you’re incompetent…the skills you need to produce a right answer are exactly the skills you need to recognize what a right answer is.”
Overestimating ourselves in these ways breeds overconfidence, and overconfidence leads us to think and act irrationally. It has been blamed for everything from lawsuits and reckless policies to stock market crashes and even war.
“Overconfident professionals sincerely believe they have expertise, act as experts and look like experts. You will have to struggle to remind yourself that they may be in the grip of an illusion.”
Daniel Kahneman
“Ignorance more frequently begets confidence than does knowledge.”
Charles Darwin
Awareness of these biases can help keep our own overconfidence in check. It is also a reminder, when we consider others’ actions and opinions – and choose how to act ourselves – that we rarely have the full picture.
“One of the painful things about our time is that those who feel certainty are stupid, and those with any imagination and understanding are filled with doubt and indecision.”
Bertrand Russell
Another method that has worked for me is to assume I am average or below average. It helps me stay in a student mindset, not dismiss the wisdom of others, and be more receptive to what I can learn from them. What’s more, it dampens the fear of failure and the paralysis of perfectionism – someone who is still learning can hardly be expected to be perfect, so mistakes are expected and welcome.
4. Confirmation Bias
Unconsciously and consciously, we seek out opinions and information that confirm what we already believe. Moreover, we tend to avoid, dismiss or downplay opinions and information that challenge what we already believe.
We’re more willing to accept evidence we agree with at face value, and we hold evidence we don’t agree with to a much higher standard.
This is confirmation bias, and it’s a problem.
“When our bathroom scale delivers bad news, we hop off and then on again, just to make sure we didn’t misread the display or put too much pressure on one foot. When our scale delivers good news, we smile and head for the shower. By uncritically accepting evidence when it pleases us, and insisting on more when it doesn’t, we subtly tip the scales in our favor.”
Dan Gilbert
As Tufts University Psychology Professor Raymond Nickerson writes, “If one were to attempt to identify a single problematic aspect of human reasoning that deserves attention above all others, the confirmation bias would have to be among the candidates for consideration.” He goes on to suggest that confirmation bias alone, “might account for a significant fraction of the disputes, altercations, and misunderstandings that occur among individuals, groups, and nations.”
As with most cognitive biases, we’re all susceptible to confirmation bias. We tend to read, watch and listen to sources that agree with our political, social and moral opinions. We mostly spend time with people who hold views similar to ours. And we avoid people and sources that challenge our views and make us feel uncomfortable or insecure about them.
If we’re not careful, this can lead to divisiveness, tribalism, bigotry and worse:
“Few things have done more harm than the belief on the part of individuals or groups (or tribes or states or nations or churches) that he or she or they are in sole possession of the truth, especially about how to live, what to be and do – that those who differ from them are not merely mistaken, but wicked or mad: and need restraining or suppressing. It is terrible and dangerous arrogance to believe that you alone are right, have a magical eye which sees the truth, and that others cannot be right if they disagree.”
Isaiah Berlin
Seek out other views. Read widely.
Lean into the cognitive dissonance of exposing yourself to the controversial or unpopular opinions of people you respect intellectually and morally. Billionaire investor and PayPal cofounder Peter Thiel likes to ask interviewees, “What important truth do very few people agree with you on?”
We don’t like being wrong, and as Kathryn Schulz writes, “It is not in our human nature to imagine that we are wrong.” Considering how confirmation bias is affecting us can help keep arrogance in check. We don’t always have all the answers, and people who disagree with us aren’t always idiots.
As I’ve written about before, assuming we’re always right and dismissing opposing views in this way can lead to dehumanising and discriminating against those who hold them.
Confirmation bias nudges us to avoid people with opposing views, but engaging with them can make us more compassionate.
“Often when we get to know someone whose words and deeds were off-putting, once we get a better sense of how that person is understanding events, our dislike dissipates.”
Thomas Gilovich
Further Reading
Thinking, Fast and Slow – Daniel Kahneman
Predictably Irrational: The Hidden Forces That Shape Our Decisions – Dan Ariely
The Upside of Irrationality – Dan Ariely
Being Wrong: Adventures in the Margin of Error – Kathryn Schulz
You Are Not So Smart – David McRaney (and his excellent blog)
Stumbling on Happiness – Daniel Gilbert
The Wisest One in the Room: How You Can Benefit from Social Psychology’s Most Powerful Insights – Thomas Gilovich
Deceit and Self-Deception: Fooling Yourself The Better to Fool Others – Robert Trivers
Confirmation Bias: A Ubiquitous Phenomenon in Many Guises – Raymond Nickerson
The Triumph of Stupidity – Bertrand Russell