A lot of what we do during a typical day is done on ‘autopilot’, accomplished by our ‘system 1’ thinking – for example, you can get up, get washed, dressed and ready to go out, almost without noticing. Navigating your usual route to work (if you aren’t working from home) requires little mental effort. If there is some disruption – a cancelled train, say – you might have to think about an alternative route and would be engaging ‘system 2’ to do so, but the actions of walking, riding or driving along that new route are routine and easy for you – that’s system 1 again. I discussed these types of thinking (described by Daniel Kahneman in ‘Thinking, Fast and Slow’) in a previous article.
When it comes to making decisions, our system 1 carries on providing us with instant judgments, almost without us noticing. The potential problem here is that many of these judgments are enormously skewed and subject to bias – we are effectively deceiving ourselves. A handful of biases have crept into common language – optimism bias for example, which I mentioned in an earlier article. There are claimed to be hundreds of types of bias, but there is a lot of overlap and similarity. I’ll discuss a handful of key biases here to start you thinking.
This weekend, for the first time, I was pushing my five-month-old granddaughter around John Lewis in a pushchair (at this point you are supposed to say, Stephen, you don’t look old enough!). I was struck by how many other people were pushing kids around in pushchairs. Do you suppose there was a sudden rush of new (grand)parents to John Lewis over the Easter weekend? – of course not. What had changed was my perception. There will always be a roughly similar number of children being pushed around a shop in pushchairs, but I hadn’t paid much attention to this since my own kids were little. Because this was something that was very readily available to me (ie my attention was more focused on pushchairs because I was pushing one), I noticed others doing the same.
Daniel Kahneman and Amos Tversky called this the “availability heuristic”. Something can be drawn to our attention in a number of ways – personal exposure, press coverage, frequency of contact etc. Any of these can raise our awareness of a particular phenomenon and cause us to think of it more. Thus news articles about a plane crash might make us nervous about flying, despite flying being about the safest way to travel. When making decisions, or even just thinking about assumptions on which to base decisions, we need to be aware that we are being influenced by recent events, news articles or by what we have recently encountered. Brainstorming ideas in groups can help, provided the people involved aren’t all operating in a very similar environment all the time. Greater diversity of team members can, to an extent, help to temper the effect of this bias.
Research has shown that we care roughly twice as much about losing something as we do about gaining an equivalent thing. This means we are roughly twice as motivated to avoid a loss as we are to achieve a similar gain. This is known as “loss aversion”. You will have noticed that marketing makes effective use of this bias through the ‘fear of missing out’ or by using time limits to encourage us to act now (and not ‘lose’ the opportunity). When we’re selling something, we naturally value it more than if we were buying the same thing. We are more motivated to avoid a £100 penalty than we are by a £100 discount. Trial periods are used because we start to treat the trial as the status quo, so that we are less inclined to ‘lose’ the benefit when the trial ends. This loss aversion can influence our decision making, reducing our rationality.
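For readers who like to see the asymmetry in numbers, the “twice as much” finding can be sketched as a simple value function, in the spirit of Kahneman and Tversky’s work on prospect theory. The coefficient of 2 below is an illustrative round figure for this article’s “roughly twice” claim, not a precise empirical constant:

```python
# Illustrative sketch of loss aversion: losses are weighted roughly
# twice as heavily as equivalent gains. LAMBDA = 2.0 is an assumed
# round figure, not a measured value.

LAMBDA = 2.0  # how much more a loss 'hurts' than an equal gain pleases

def felt_value(change: float) -> float:
    """Subjective value of a monetary change (positive = gain, negative = loss)."""
    if change >= 0:
        return change          # gains counted at face value
    return LAMBDA * change     # losses loom larger

# A £100 discount vs a £100 penalty:
print(felt_value(100.0))   # the gain, at face value: 100.0
print(felt_value(-100.0))  # the loss feels about twice as bad: -200.0
```

On this toy model, avoiding the £100 penalty (felt as −200) motivates us about twice as strongly as earning the £100 discount (felt as +100), which is why penalty framing and deadlines are such effective marketing levers.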
Another type of bias is evident when I’m interviewing candidates for financial modelling roles. I have, in the past, asked a question along the lines of “What percentage of Excel do you think you know?”. Surprisingly often, inexperienced candidates rate their knowledge considerably higher than people with decades of experience, including myself. The more I learn about Excel, the lower I tend to rate myself. This effect was identified by two researchers who gave their names to this form of cognitive bias. The Dunning-Kruger effect describes the phenomenon where inexperienced people tend to overestimate their abilities and highly experienced people tend to underestimate theirs. This occurs in many fields – for example, most drivers think they are better than average. When recruiting, we should take measures to overcome this by objectively testing the skills we need, rather than relying on the candidate’s self-assessment.
These are just a few examples of the types of biases we are all subject to. Even when we are aware of them, we can still be influenced by them, but at least by having an awareness, we might double-check ourselves, engage system 2 thinking and question whether we are truly making rational choices.
If you’d like notifications when the other articles are published, you can subscribe here.