You're Biased | Just Reflections - Issue #7
Hi friends,
Great to have you here for another week. I hope yours went well.
This week I wanted to write on the topic “Do I want to have kids?”. I was really excited about it, but I spent so much time thinking and reading about cognitive biases that I ended up with more to say about those than about the kids question. So we’ll talk about kids next week. For now, let’s take some time to understand why we’re all biased.
Every day, we make decisions and judgements. At every moment, our brains have a tonne of information coming in, way more than they can handle. Sometimes we have enough time to weigh everything before we decide, but other times we decide on the fly with little information. Because of that, we take many unconscious mental shortcuts or — to say it in a fancier way — we use heuristics to decide quickly.
Heuristics are straightforward rules of thumb that we develop based on our experiences. They allow us to simplify complex scenarios and make quick decisions. We do this because life would be really exhausting if we had to deliberate on each of the countless decisions we make every day. The image below shows a heuristic that helps drivers decide whether they can fit under a bridge. They don’t have to remember the height of their vehicle or know the exact clearance of the bridge: if their vehicle hits the hanging sign, they know it won’t fit under the bridge.
Heuristic for deciding whether your vehicle is low enough to drive under a bridge
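For the programmers among us, here’s a toy Python sketch of the same idea. The numbers and function names are made up purely for illustration: the point is that the heuristic swaps a calculation needing data you rarely have at hand for one cheap observation.

```python
# Toy illustration (hypothetical numbers): the "exact" decision needs
# precise measurements, while the heuristic replaces them with a
# single easy-to-observe fact.

def fits_exact(vehicle_height_m: float, bridge_clearance_m: float) -> bool:
    """Requires knowing both heights precisely."""
    return vehicle_height_m < bridge_clearance_m

def fits_heuristic(hit_warning_bar: bool) -> bool:
    """The hanging bar is set at the bridge's clearance height,
    so touching it stands in for the whole calculation."""
    return not hit_warning_bar

print(fits_exact(4.1, 4.3))    # True, but only if you know both numbers
print(fits_heuristic(False))   # True: didn't hit the bar, so drive on
```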
Heuristics are not about making the best decision but about making one quickly. If you’re driving and a child runs into the road, you don’t start calculating their speed relative to the car’s and assessing the traffic behind you. With barely a thought, you’re likely to step on the brakes immediately. You have a heuristic that says stepping on the brakes is likely to be the correct decision.
Normally, heuristics help us dismiss unimportant details and reach quick conclusions. However, they can also cause biases that misguide our logic. Salespeople, marketers, commercial clairvoyants, psychics, astrologers and — dare I say it — pastors use our biases against us all the time. If you want to learn more about how psychics exploit our cognitive biases, check out this video by SciShow Psych on YouTube.
There are literally hundreds of cognitive biases that most of us are unaware of. And we’re all vulnerable to them even when we’re well informed about them. Let’s look at a few.
Heuristics and biases
Cognitive biases were the primary subject of Amos Tversky and Daniel Kahneman’s 1974 paper, “Judgment under Uncertainty: Heuristics and Biases”. The ideas were further expanded in Kahneman’s book “Thinking, Fast and Slow”. In it, he talks about how our brains like to find patterns and relationships to help process and store information. Unfortunately, because we’re constantly looking for these, we’re prone to finding relationships and causality where none exist, and hence we assign greater meaning to things that occur purely by chance.
Here are the three primary heuristics discussed in the paper:
The Representativeness Heuristic — This is mistaking plausibility for probability. Kahneman illustrates this with an experiment called the “Linda Problem”. It goes like this: Linda is a social activist. Is she more likely to be a ‘bank teller’ or a ‘feminist bank teller’? Statistically, she has to be more likely to be a ‘bank teller’, because all ‘feminist bank tellers’ are ‘bank tellers’; choosing ‘feminist bank teller’ can only decrease our odds of being right (the short simulation after this list makes that concrete). Yet many people choose ‘feminist bank teller’ simply because she’s an activist, so it feels more plausible.
The Availability Heuristic — This operates on the notion that “if I can think of it, it must be important”. It’s a mental shortcut where people judge the probability of something based solely on how easily examples come to mind. We assume our memories are a representative sample of reality and discount events outside our immediate recollection. If we can quickly think of multiple examples of something, we believe it’s quite common. This heuristic biases how we assess risk: trends and media coverage cause people to fear some risks more than others, even when there is no legitimate reason to. For example, because of its wide media coverage, most Americans assess the likelihood of a terrorist attack as much higher than it actually is. In reality, studies estimate that you’re 130 times more likely to be killed by the police in America than by terrorism.
The Anchoring Effect — This occurs when we overweight the first piece of information we receive, and it disproportionately affects any subsequent judgements we make. If you ask a random group of people how old Gandhi was when he died, you’ll get much higher estimates if you add the anchoring question, “Do you think he became older than 114?” than if your anchoring question was, “Do you think he died before the age of 35?”. You will certainly encounter anchors when you walk into a store and see a price tag that reads “Was £59.99, now £29.99”. You’re more likely to buy, thinking it’s a good discount, than if the tag simply read “£29.99”, regardless of the actual value of the item.
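Here’s the sketch promised above: a minimal Python simulation of the logic behind the Linda problem. The population and probabilities are invented purely for illustration; the point is that however the traits are distributed, ‘feminist bank tellers’ are a subset of ‘bank tellers’, so their share can never be larger.

```python
import random

random.seed(42)

# Hypothetical population: each person is (is_bank_teller, is_feminist).
# The probabilities below are invented purely for illustration.
population = [
    (random.random() < 0.05, random.random() < 0.30)
    for _ in range(100_000)
]

bank_tellers = sum(1 for teller, _ in population if teller)
feminist_bank_tellers = sum(1 for teller, feminist in population
                            if teller and feminist)

print(f"P(bank teller)          ~ {bank_tellers / len(population):.4f}")
print(f"P(feminist bank teller) ~ {feminist_bank_tellers / len(population):.4f}")

# The conjunction rule: P(A and B) <= P(A), always.
assert feminist_bank_tellers <= bank_tellers
```

No matter how you change the numbers, the assertion at the end never fails: adding a condition can only shrink the set, which is exactly why ‘feminist bank teller’ is the worse bet.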
There are many other interesting heuristics and biases I’d love to talk about — like how we have a tendency to trust people who are good looking — but this would become too long. While some of these examples might seem quite particular, the point is to illustrate the prevalence of heuristics in our lives and the errors they lead to when we aren’t careful.
How much can you really trust your brain?
Some people accept that cognitive biases occur but don’t believe that they have any. First, thinking you have no bias is itself a cognitive bias; look up the “Bias Blind Spot” or, more confusingly, the “Bias Bias”. Second, our brains lie to us all the time about many things. One way to illustrate this is with optical illusions. Take this image, for example.
These lines appear to be angled up or down, but the horizontal lines are actually all parallel. Need proof? Try covering the top and bottom of one line of squares with a piece of paper. No slants to be found!
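Incidentally, the pattern described here looks like the classic “café wall” illusion. If you’d like to generate a version yourself, here’s a minimal Python sketch using matplotlib; the tile sizes, colours and mortar width are my own choices, not the exact image above.

```python
import matplotlib.pyplot as plt
from matplotlib.patches import Rectangle

TILE = 1.0        # tile size
MORTAR = 0.07     # thin grey gap between rows, like mortar in a wall
ROWS, COLS = 8, 12

fig, ax = plt.subplots(figsize=(9, 6))

# Grey backdrop that shows through the gaps between rows as "mortar" lines.
height = ROWS * TILE + (ROWS - 1) * MORTAR
ax.add_patch(Rectangle((0, 0), COLS * TILE, height,
                       facecolor="grey", edgecolor="none"))

for row in range(ROWS):
    # Alternate rows are shifted by half a tile; that offset, together
    # with the grey mortar, is what creates the apparent slant.
    offset = (row % 2) * TILE / 2
    y = row * (TILE + MORTAR)
    for col in range(-2, COLS + 1):
        colour = "black" if col % 2 == 0 else "white"
        ax.add_patch(Rectangle((col * TILE + offset, y), TILE, TILE,
                               facecolor=colour, edgecolor="none"))

ax.set_xlim(0, COLS * TILE)
ax.set_ylim(0, height)
ax.set_aspect("equal")
ax.axis("off")
plt.show()
```

Hold a ruler against the output and you’ll confirm that every row is perfectly horizontal, even though your eyes insist otherwise.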
A nice thing about optical illusions is that we can easily demonstrate our mistakes. With this one, we can use a straight-edge to show that the lines are parallel. Clearly, our eyes were deceiving us. The interesting thing is that when we look at the image again, knowing that the lines are parallel, we still see them as slanting. It’s as if we’ve learnt nothing in the last few seconds; our brains remain convinced that the lines slant, regardless.
You still can’t look at it and see reality as it is; it feels impossible to overcome the illusion even after disproving it. Our intuition fools us in a repeatable, predictable and consistent way, and there seems to be nothing we can do about it except measure every time we want to see the truth.
Vision is one thing we do really well. A big part of our brain is dedicated to it, and we’re well practised because we use it for many hours every day. Yet we still have these predictable, consistent deceptions in something we’re very good at. So what are the chances that we make even more mistakes in the things we’re not as good at, the things we have no specialised part of the brain for and don’t practise for hours every day?
Chances are that we make many more mistakes, and we don’t have easy ways to see them because, unlike optical illusions, we can’t just pick up a ruler and measure them; it’s much harder. And just like visual illusions, the moment we stop measuring, we snap right back to our deception.
Now, with this perspective, think about all the things that you’re convinced are true. How sure can you really be that you haven’t fallen prey to your cognitive biases in your reasoning? We have a fantastic understanding of our physical limitations; we build stairs and cars and hammers and scissors and rulers because of it. Unfortunately, we don’t have the same understanding of our cognitive limitations, so we have few tools to assist us with them.
What then can we do?
Because our cognitive biases so often lead us to faulty conclusions, it’s important to be humble about our views. We rarely notice when we’re being biased or when someone is using our biases to take advantage of us.
Considering our fallibility, we have to do something that doesn’t come easily. We must recognise that the world is an uncertain place and that our judgements about it are often wrong. We should listen to opinions that we might initially consider wrong or even offensive.
Use the scientific method on yourself. When you have an idea or a firm belief about something, intentionally and sincerely try to disprove it. If you think that eating meat is bad for you — for example — don’t Google “reasons eating meat is bad”; instead, Google “reasons eating meat is good” and attentively learn the other side of the argument. You might discover that you were wrong. And if you don’t, you’ll have a stronger, more balanced case for why you’re right.
We should surround ourselves with people who continuously challenge our thinking and force us to critically examine the facts behind our perspectives. If you have a great idea and surround yourself only with people who already believe in it, so that you never hear criticism, that’s a sure recipe for confirmation bias, and it can lead you down the wrong path.
If your friends always agree with every idea you have, then maybe you need to expand your friendship circle. Try to find friends who will challenge your ideas in constructive ways, who won’t just stop at telling you it’s a great idea but will also tell you what’s wrong with it so that you can work on making it better.
Our intuitions are useful and even necessary for making quick judgements about the world, but that doesn’t mean they are correct all the time. Recognising the flawed nature of your thinking is a bold first step to challenging it. A better understanding of the heuristics we use regularly can help us predict our errors and make better decisions under uncertainty.
That’s all I have for you this week. If you like the newsletter, consider sharing it with others on Twitter, WhatsApp or Facebook. Hit the thumbs up or thumbs down below to let me know what you think about the issue.
I hope I’ve given you something to think about this week and I wish you ever-increasing curiosity.
Until next week.
BK
Impactful ideas that challenged my thinking.
I have a lot of interests, so I’m always learning all kinds of things, some of which really challenge my thinking. In the Just Reflections newsletter, I share a summary of the ideas that challenged my thinking recently. Hopefully they’ll challenge yours too and we’ll grow together.