26 November 2020 08:18
We've debated the proper ways to analyse the world around us since the dawn of civilisation. Formal systems of logic are ingrained in the programming and technology that we use every day, and they date back even further than the syllogisms of Aristotle. But that doesn't mean that the human mind is flawless; far from it.
Here are some of the most common ways that even the most rational of us can be led astray:
Confirmation bias is arguably the most significant bias for the world at the moment, and one that is only accelerated by the many algorithms that dictate our internet search results and browsing behaviour.
We’re instinctively drawn to those who share our opinions. It’s no surprise that people with right-leaning political viewpoints read more right-wing literature and that people with left-leaning political viewpoints read more left-wing literature. It’s natural to want to seek like-minded company, but this can become a problem when it distorts our perceptions of reality and leads us to decisions based on social groupings rather than policy efficacy.
It’s perhaps natural to lead on from this to social proof. We’ll see later on that we have an entirely different set of standards for the opinions of others than for our own. However, there is one sense in which we really do value the opinion of others.
The more people endorse an opinion, the more credibility it intuitively has with us. When people see that an opinion has already been approved by others, they are more likely to approve it themselves - especially if those others share their views on other subjects as well. This isn’t necessarily a bad thing, as group opinions can often be correct, but it’s worth remembering that ‘1,000 people can’t be wrong’ is not in and of itself a concrete logical argument.
It’s more important to see that a claim coheres with the truth rather than with a group opinion - ideally, that there is a specific causal link involved.
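The weakness of head-counting as evidence can be sketched with a small simulation (the numbers here are hypothetical, purely for illustration). If 1,000 people each judge a question independently and are each right 60% of the time, the majority is almost always right - but if they all defer to one shared source, the ‘crowd’ is no more accurate than that single source.

```python
import random

random.seed(1)

def majority_correct(n_voters, p_right, trials=2000):
    """Fraction of trials in which a simple majority of independent
    voters, each correct with probability p_right, reaches the right answer."""
    wins = 0
    for _ in range(trials):
        right_votes = sum(random.random() < p_right for _ in range(n_voters))
        wins += right_votes > n_voters / 2
    return wins / trials

# 1,000 independent voters, each right 60% of the time: the majority
# is right in essentially every trial.
print(f"independent crowd: {majority_correct(1000, 0.6):.2f}")

# If everyone simply copies one shared source, the crowd collapses to a
# single voter: its accuracy is just that source's own 60%.
print(f"herded crowd:      {majority_correct(1, 0.6):.2f}")
```

The contrast shows why social proof only counts for something when the endorsements are independent of one another.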
One of the more infamous biases concerns our tendency to establish patterns in seemingly random data. That’s not necessarily a bad thing; in fact, it’s an incredibly useful skill to be able to discern a causal relationship in the dissonant noise of the phenomena around us. The problem is that this faculty can easily misfire in ways that are hard to detect.
It’s intuitive to believe that, if we see two things occurring at roughly the same time and with a similar distribution, they might well be linked. However, there are countless examples of entirely spurious causal attributions: anything from a decline in piracy ‘causing’ global warming, to ice cream consumption and Internet Explorer use ‘causing’ an increase in murders.
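How easily such spurious links arise can be shown in a few lines of code (a minimal sketch with made-up numbers): any two quantities that merely trend in the same direction over time will correlate strongly, whether or not either causes the other.

```python
import random

random.seed(0)

# Two hypothetical yearly series that both happen to trend upward over
# 20 years; neither has any causal connection to the other.
years = range(20)
ice_cream_sales = [100 + 5 * t + random.gauss(0, 3) for t in years]
browser_installs = [50 + 4 * t + random.gauss(0, 3) for t in years]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Close to 1.0 despite there being no causal link at all - the shared
# time trend is doing all the work.
print(f"correlation: {pearson(ice_cream_sales, browser_installs):.2f}")
```

A lurking common factor (here, simply the passage of time) is enough to manufacture a near-perfect correlation out of two unrelated processes.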
Humans also have a natural tendency to over-ascribe explanatory power to the more recent of two otherwise equally valid causes. But this ignores the fact that the complex systems of the world around us cause accumulations and, importantly, delays that can mask the real causes of the behaviour we see around us.
It’s often said that ‘hindsight is 20/20’, but there’s perhaps a surprising amount to unpack in this simple phrase. When others make a mistake, particularly a highly publicised one, there’s often a great deal of judgement, or at least punditry, that weighs in after the fact. Often, these opinions ignore, or at the very least insufficiently account for, the lack of information that people had at the time.
In the heat of the moment, a coach may tell their team to play a certain way, with disastrous results for a match. But the fact is that they do not know for certain how well their instructions will be interpreted, how the opposing team may react, how a referee will see things, or a myriad of other factors.
That information is usually only available after the decision is rendered, when the results have already come in. Hindsight bias tends to forget this and place an unduly high expectation on the information available to people at a given time. It’s obvious to us after the game that the coach’s decision was wrong, because we can see the results of the decision with absolute certainty. The future is usually far less easy to interpret.
Sometimes called ‘illusory superiority’, overconfidence bias in short entails a distorted belief in our own abilities. We often overestimate our capabilities in comparison to others. This is especially so when we compare our hypothetical abilities with the actual performances of others - itself partly a form of hindsight bias.
This also has links to conditions such as narcissism and the Dunning-Kruger effect. The latter is a bias whereby we overestimate our own degree of knowledge on a given subject. It is related to overconfidence bias in that it is partially contingent on our perceived abilities being higher than our actual ones.
Because we are not well enough acquainted with a subject, we lack the ability to accurately gauge our expertise relative to other people - people who may well be far more knowledgeable than us but less sure of themselves, because they’re aware of complexities that we are not.