
ON CONFIRMATION BIAS

13 November 2017


When you teach Theory of Knowledge, the IB’s course in epistemology, you usually include a module on cognitive biases.  A famous example is confirmation bias:  the tendency (unwitting, subconscious) to look for information that confirms our preconceived ideas, prejudices or beliefs.  Such a bias, when it goes unrecognised, can seriously limit the knowledge we acquire during our quest in life.


For example:  if you enjoy a glass of wine with your dinner, you tend to focus on articles in the media that talk of the benefits of moderate drinking, and you ignore medical evidence of alcohol-related diseases.  If you are a teetotaller, you only notice the articles that warn about the health risks associated with consuming even a small amount of alcohol, and you dismiss the research that has gone into its benefits.


If you have a stereotypical view of another nation, for example if you see the stereotypical German as a Lederhosen-wearing, yodelling, beer-drinking, Bratwurst-eating and completely humourless party-pooper, you will look for evidence of this during your trip to Germany.  You will take photos of the guy in Lederhosen, the beer gardens and the barbecues, not of the wine festivals or your hosts’ vegetarian feasts.  If you meet 20 German citizens and one of them really is a boring old spoilsport, you will feel confirmed in your bias about the stereotypical German and dismiss the 19 fun-loving guys as ‘exceptions’ to the rule.


The same is true for evaluating the past.  If, for example, a relationship has gone wrong (a marriage, a friendship, a work relationship), both people in this relationship have the same evidence available to them:  past positive experiences, past negative experiences.  It is normal to have a mixture of both in any type of relationship.  Nonetheless, depending on where these people are on the spectrum of confirmation bias, they will evaluate this evidence differently or selectively. 


One person may focus solely on the negatives and dismiss the evidence that there were good things worth salvaging; another may focus only on the happy memories and dismiss the negative things as insignificant.  This is also called ‘selective memory’, and such a polarisation of attitudes can have a highly antagonising effect.  For most people, especially those with high expectations of themselves, it is not easy to admit that they may have made mistakes, and so, subconsciously, we search the chambers of our memory for evidence that confirms that we are right and that the other person is wrong.


Confirmation bias can also mean that, oblivious to our own, we accuse others of theirs.  It prevents us from moving forward and from avoiding mistakes in the future.  We may, for example, try to avoid a difficult conversation for fear that it might challenge our own rationalisation of past events, or for fear of showing vulnerability and risking the possible repercussions.  Conversely, we may deliberately seek out that conversation purely to justify our own rationalisation of those events, and not out of a genuine desire to listen and learn.  Recognising these potential pitfalls may be a first step towards an honest and robust discussion.


Whenever I have taught Theory of Knowledge to my IB students, I have always tried to ensure that they end up being aware of the pitfalls of confirmation bias in their acquisition of knowledge and in their debate of ethical questions.  As it is so ingrained in our subconscious, we may never be able to overcome it completely but we can certainly take steps to guard against it.


The Theory of Knowledge syllabus distinguishes eight different ways of knowing, but in all my discussions with students we keep coming back to the two most prevalent ones:  reason and emotion.  When we use reason to arrive at knowledge, we think with our head:  we use arguments, counter-arguments, logic.  When we use emotion as a way of knowing, we follow our heart, our instinct, our gut feeling.  It is the old distinction between IQ and EQ, between rational and emotional intelligence.  It is very much to the IB's credit that it pays so much attention to developing both.


I always tell my pupils that the main objective of the course is to make them into self-aware learners:  learners who reflect on the ways in which they arrive at knowledge, who are aware of the different ways of knowing, of the overlap of reason and emotion in so many areas of knowledge, and of the cognitive biases that may hinder their progress, and who maintain an open mind and heart in their research and debate.


To return to the three examples above:  in scientific subjects, try to examine as objectively as possible the research into the benefits and dangers of alcohol, without any preconceived intention of finding the evidence that suits your lifestyle best.  If you are travelling, try to find evidence that contradicts as well as confirms the stereotypical ideas you may have formed about a nation.  Be open-eyed and open-eared.  Use your sense perception (a third way of knowing) to form your own judgements.


In ethical debates or discussions:  do not enter into them with preconceived ideas.  Do not assume that you already know what the other party will say and try not to deny someone a voice based on such a preconception. You may just be guilty of the same confirmation bias that you suspect in others.  Be open-minded, kind, generous and curious.  Give yourself the chance to listen, to review your opinions and to reflect critically about your own cognitive bias.  If you recognise it in yourself, it will make you more tolerant of it in others.


The English language has a wonderful word for someone who stubbornly refuses to learn:  a ‘mumpsimus’.  The word dates back to the 16th century and is believed to derive from the story of an illiterate priest who, during the Eucharist, repeatedly said ‘mumpsimus’ instead of the correct Latin ‘sumpsimus’.  He stubbornly continued to do so even after the mistake had been pointed out to him.


We can all be reluctant to revise our opinions sometimes.  But I hope that those students I have taught over the years will continue to work at defeating their inner mumpsimus and that, at the very least, they will not condemn it in others whilst condoning it in themselves.  Confirmation bias is an integral part of human nature, and the challenge for us all is not to deny it but to acknowledge its existence in ourselves and in others, so that we can work, hard and with humility, at mitigating its effects.
