I’ve been thinking about free will, which has in turn got me thinking about the flaws we all have in our ability to reason. We all have them to a greater or lesser extent – and they’re often surprisingly subtle. That subtlety disturbs me: it leaves the door wide open for bad reasoning, and it smashes at the notion of free will.
An Example: Confirmation Bias
The errors in our reasoning that I’m talking about here are not mistakes we make while doing math in our head, but rather errors we make when supposedly reasoning towards a truth.
These errors in our reasoning abilities are often called human biases. A favourite of mine is the (motivated) confirmation bias – it’s ubiquitous, subtle, and scares the living hell out of me as a result. Here’s a nice definition from Wikipedia:
Confirmation bias is a tendency of people to favor information that confirms their beliefs or hypotheses. People display this bias when they gather or remember information selectively, or when they interpret it in a biased way. The effect is stronger for emotionally charged issues and for deeply entrenched beliefs.
This is unconscious, which makes it all the more pernicious. It’s “motivated” because people are motivated to defend a belief or hypothesis they already hold, unwittingly selecting information to support it.
For example, if you believe in “alternative medicine”, your belief may be unconsciously bolstered whenever you read about some poor child recovering from a dreadful disease after being dosed with a sugary homeopathic remedy.
Hypothesis-Determined Information Seeking and Interpretation
But what about the other facts: the number of people who died from the disease, the number who recovered without any medication whatsoever, and so on?
These omissions hint at some of the mechanisms scientists (see the reference below) propose to explain why we have a confirmation bias.
- “Restriction of attention to a favoured hypothesis” (it wasn’t chance, or a different medicine, that saved them – it was the homeopathic medicine)
- “Preferential treatment of evidence supporting existing beliefs” (the people who recovered without the homeopathic medicine probably had some homeopathic trace elements in their food – and look, the ones that took it recovered. Amazing.)
- “Overweighting positive confirmatory instances” (ZOMG, look: 50 cases of recovery! It must be true. (Ignoring the 1,000 cases where people didn’t recover – not even seeking them out.))
See the reference below for the science behind these, and for experiments that appear to show these mechanisms at work behind confirmation bias.
Thoughts: Consistency and Religion and Free Will
- What scares me about the confirmation bias is that we’re all susceptible, it’s unconscious, and it leads us to draw false conclusions (such as earnestly believing in something that’s demonstrably false, like homeopathy). I wonder to what extent I am biased in this way.
- The scientific method is one way in which we try to ascertain truths without bias. That’s not to say scientists are free of confirmation bias. But at least science has mechanisms to guard against it.
- This is very much related to consistency as well – as I wrote in Being Aware of Rationalising. I wonder if we have particular confirmation biases to maintain a consistent experience.
- I wonder to what extent the confirmation bias leads someone to continue believing in a religion. Restriction of attention is evident in many believers, blithely turning a blind eye to contradictions (or to other religions), as is preferential treatment of evidence, and so on. I guess we can be lenient here about what counts as evidence.
Finally, having a bias such as the confirmation bias severely undermines our notion of free will. I’m not the conscious author of my opinions or reasons here – and it’s even more of an illusion if my own brain is filtering information behind my back, so to speak. Where’s the free will in that?
This paper is awesome, and the source for my second section title: Nickerson, R. S. (1998). “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises”, Review of General Psychology (Educational Publishing Foundation) 2 (2): 175–220 (PDF)