Somewhere in a Psychology Laboratory
What you believe affects your ability to do maths, apparently. If you make up a question about a medical drug and ask people whether the numbers you present mean that the drug is effective, most get the correct answer. But if you rephrase the study to be about gun control instead, those with strong views on the topic will be much worse at seeing what a few numbers really mean. If you’re intrigued by the details of the questions, see the study itself, which is from Yale [1]. What’s surprising is that being good at maths didn’t protect people from the effect. It’s as though the more skilled someone is, the more quickly they ‘see’ an interpretation they want to see, or the more they strive to find one. That’s a problem if you think that having good information is all that’s needed to improve our decisions. This isn’t new. Shakespeare or the ancients could have told us that what you believe strongly affects what you see. Over the decades other big studies have shown that political experts do worse than chance at predicting the future – and again, the more expert they are, the more confident they are in their own opinions, and the worse their chance of actually being right. Numerous studies of doctors show similar patterns [2].
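For a flavour of the sort of question involved, here is an illustrative reconstruction of the study’s 2×2 ‘skin cream’ problem – the counts below are the ones commonly quoted from the paper, but the wording and variable names are mine, so treat this as a sketch of the problem type rather than a quotation of it. The trap is that the raw counts point one way while the proportions point the other:

```python
# Illustrative reconstruction of the 2x2 covariance problem in [1].
# Counts are as commonly quoted from the paper; names are illustrative.
used_improved, used_worsened = 223, 75        # patients who used the treatment
unused_improved, unused_worsened = 107, 21    # patients who did not

# The tempting (wrong) comparison: raw counts of people who improved.
print(used_improved > unused_improved)        # True  -- looks effective

# The correct comparison: the proportion improving in each group.
p_used = used_improved / (used_improved + used_worsened)          # ~0.75
p_unused = unused_improved / (unused_improved + unused_worsened)  # ~0.84
print(p_used > p_unused)                      # False -- it isn't
```

In the politically charged version, the same table is simply relabelled – cities that did or didn’t ban concealed handguns, and crime that rose or fell – and it is on that version that people with strong views stumble.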

What’s going on under the surface? When we see something unfamiliar, the only way we have of making sense of it is by relating it to what we already know – that’s what “understanding” means. The more certain you are of what you know, the more likely it is that you’ll quickly snap to seeing the new in those terms. If you strongly believe gun control is bad, you’ll have difficulty even seeing evidence to the contrary. If you’d like to find your own examples, just listen to politicians debating in parliament. Any actual facts presented are likely to be either ignored, or waved away as irrelevant, while the party line is repeated ad nauseam. We’re all a bit like that. Darwin wrote that if he heard something contradicting a cherished belief, he had to write it down within 30 minutes or his mind would lose it – rather as, we would now say, the body rejects a transplant.

Why this is Important
Strong beliefs and convictions distort, but they’re valuable too. People like a strong leader, we’re told, and history is scattered with those who have succeeded through an unshakeable conviction in their beliefs, which has given them strength. Conviction is valuable for motivating and leading others, and for drumming up fanaticism. But useful though that is, when we’re less concerned with persuading others and really want to think clearly and effectively, we need to accept that strong beliefs will cause poor thinking.

The problem of our beliefs affecting how we see is important because we believe so many strange things, and we see the world through those beliefs. Teenagers believe that driving fast is good; some believe that having expensive cars or job titles is central to life’s purpose, or that you have to pay executives huge salaries to “get the best” (despite any evidence to the contrary). Those are my examples of strange beliefs; I’m sure you have better ones. There’s nothing wrong with beliefs – we have to have some – the problem is that we seem to forget that we have them, or that we’re seeing a distorted world through those lenses. As our opening study suggests, facts aren’t going to help us if our convictions mean we can’t take them in, and we only use our intelligence as a tool for explaining them away. Recently, to take a charitably plausible version of events, the TEPCO people at Fukushima believed radiation levels were safe, and that is presumably what they wanted to see. Remarkably, they didn’t perceive that their meters had hit maximum and that the actual levels were lethal [3]. Presumably “lessons will be learnt”, and new procedures will be added to a handbook, but I’m not confident that will have much real effect.

Why haven’t Lessons been Learnt?
It’s astonishing: we’ve known about these limitations in how we think for centuries. Researchers come up with new examples every few years, papers are published, popular books of advice are written. But there’s scant evidence that we’ve got any better at thinking as a result. We shrug our shoulders and carry on arguing our corner, proving what we ‘know’ to be true – even in life-and-death politics – as though nothing has been learnt. (If you go to a lot of meetings, you’ll surely know what I mean.) Why has all this knowledge and repeated research had so little effect?

Will Science and Reason help?
What can we do? You’d hope that reason and science might rescue us – the triumph of reason over emotion – though the maths example we started with should make you feel uneasy about that.

“Science” (just a word for ‘knowledge’), fabulous though it can be, suffers from the same problem. That’s why politicians can cite convenient scientific “truths” – “scientists say CJD cannot cross into humans” – or why, not so many decades ago, all the scientific experts knew that continents could not move… choose the examples you like best. Why don’t reason and science help us? Logically, it’s because all our reasoned arguments have to start somewhere, and that starting point ends up being perceptions, or beliefs, or values that we hold true – you’ve got to start your argument with something for reason and logic to get their teeth into. As Hulme put it, “Reason is powerless in the face of some of the most elementary realities”. This is especially true, as our maths example shows, if we start off holding strong views in the first place – as of course people do – because reason easily becomes just another tool to argue for what we want to believe. If you’re clever enough, you can prove you are right – to yourself at least. There’s nothing wrong with reason; it’s an excellent invention, just less powerful than you might have thought.

Making a Start
So what can we do to think more effectively? Should we accept how we think now as a natural limit? People may not be able to fly, but we invented the aeroplane. We may have biases, inadequacies and partial knowledge, but we can develop tools to overcome them – much as we invented writing, reason, maths, and language to overcome other “fundamental” human limitations. Looked at that way, there’s every reason to hope we can learn to think more effectively, particularly in the face of our own distorting certainties.

To begin with the issue of strong beliefs distorting our thinking, here are a few practical exercises to get started. They don’t exercise your reason; they exercise how you see:

  • Practise being wrong. Through school and life we’ve learnt that being wrong is the worst thing you can be, and we’ve learnt to hedge and to defend ourselves in order to avoid appearing wrong. So try pronouncing confidently on topics you actually are pretty certain about, in a way where you can easily be shown up if you’re wrong. Then frankly admit it. Experience the feeling of being wrong, and learn to love it. It may open up new worlds.
  • When you find you’re holding a strong conviction of some kind, ask yourself what evidence it would take for you to see that you’re wrong. The exercise itself can be enlightening. Remember, we do need convictions; the problem comes when we don’t realise we’re seeing the world through them, as though they were the only possibility.
  • Make use of marketeers’ efforts to create new convictions for you. When you see an advert that attracts you, ignore the product and ask yourself what’s really of value to you there. For example, it’s not the drink bottle (of course); it’s the idea of friendship and fun in the picture that attracts me. It’s a great way to make use of the work put into adverts, and to discover a little more each time about which convictions really lie behind your responses.
  • Read newspapers or articles about politics or beliefs you’d normally find unattractive – just to become more familiar with the discomfort of having your own beliefs challenged.

If you practise those for a while, you’ll find that when you want to think clearly about something, your mind is far more able to see what’s actually there, rather than just showing you what it wants to see.

Adrian West

References
[1] “Motivated Numeracy and Enlightened Self-Government”, D. M. Kahan et al., The Cultural Cognition Project, Working Paper No. 116, Yale Law School, September 2013.

[2] See “Errornomics: Why We Make Mistakes and What We Can Do to Avoid Them”, Joseph T. Hallinan, 2009; or search for the “Case of Joseph Kidd”, which was used in one such medical study; or see the work of Philip E. Tetlock for more detail.

[3] “Fukushima radiation levels ‘18 times higher than thought’”, BBC News, 1 September 2013.