The confirmation bias in us

Jack Kruf

Some things are remarkable yet, for most leaders and managers, hidden or simply unnoticed. One of them is confirmation bias. In science and literature it has been described, researched and elaborated over and over again, throughout history in fact, beginning with the Greek historian Thucydides (Dent, 1910). He wrote:

“… for it is a habit of mankind to entrust to careless hope what they long for, and to use sovereign reason to thrust aside what they do not fancy.”

Haselton et al. (2005) define it as “a systematic pattern of deviation from norm or rationality in judgment”, and add to this the interpretation:

Individuals create their own “subjective reality” from their perception of the input. An individual’s construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality.

Generally speaking we, humans, have some ‘embedded heuristics’, which Daniel Kahneman and Amos Tversky described as ‘highly economical and usually effective’ but leading to ‘systematic and predictable errors’. It is an aspect to be aware of when chairing or participating in a meeting, and it is mentioned as a key driver in risk management theories. It can creep into the discussion and decision-making process, staying unnoticed and dressed as humor, good spirits, support, collegiality and optimism.

This steering mechanism in our brains can be defined, according to the Cambridge Dictionary, as “the way a particular person understands events, facts, and other people, which is based on their own particular set of beliefs and experiences and may not be reasonable or accurate.”

Confirmation bias is also about the internal “yes man”, echoing back a person’s beliefs like Charles Dickens’ character Uriah Heep in his book David Copperfield. Uriah, the clerk, is notable for his “cloying humility, obsequiousness, and insincerity, making frequent references to his own ‘humbleness’” (Wikipedia).

In a short but excellent video, complementing his article, Jason Zweig (2009) explains how this mechanism relates to the handling of our own stocks and investments. Confirmation bias is all over the place in the stock market, and it has been studied there extensively. Zweig gives advice on how to ignore the yes-man in our heads.

Social psychologist Richard Stanley Crutchfield discovered that about one-third of people ignored what they saw and went with the consensus. In his experiments (repeated over and over again since), participants all agreed on a certain question when they were not exposed to the answers of others. But when they were told, before giving their own judgment, that everyone else in their group disagreed, 31 to 37 percent said they did not agree either!

Solomon Asch (Gardner, 2009) also concluded, later, from new experiments that “overall, people conformed to an obviously false group consensus one-third of the time”, in line with what Crutchfield had found. Gardner concludes: “We are social animals and what others think matters deeply to us… we want to agree with the group” and “it certainly is disturbing to see people set aside what they clearly know to be true and say what they know to be false.”

From an evolutionary perspective, the human tendency to conform is not so strange. We are gregarious after all; from a survival perspective it is best to follow the herd. Gardner: “We also remain social animals who care about what other people think. And if we aren’t sure whether we should worry about a certain risk, whether other people are worried seems to make a huge difference.”

The other way around, though, government itself is sensitive to anchoring. Governments anchor on popular opinions, and this influences the way they respond. This mechanism could be at the basis of the rising populist wave on which political parties surf their campaigns and on which public leaders base their decisions. Reading the news and following the political debates, it becomes clear that scientists have found and described realistic and fundamental mechanisms in us. We know how predictable we are, but often do not want to know it.

Cass Sunstein (Gardner, 2009) elaborated the consequence of this mechanism on an individual level when information comes in. He concluded that belief causes confirmation bias, and therefore incoming information is screened thoroughly. If it supports one’s own conviction, the incoming information is readily accepted; if not, it is ignored, scrutinised carefully or flatly rejected. Isn’t this recognisable in the debates we share, attend and see in the media, led by some leaders (in fact the wrong word)? And isn’t this a mechanism we recognise in ourselves? Let us be honest.

Being aware of this bias, as Kahneman and Tversky (1974) stated, could contribute to improved public governance:

“… understanding of these heuristics and of the biases to which they lead could improve judgments and decisions in situations of uncertainty.” For me, knowing that one-third of the people in a meeting could have an interesting view which is not shared due to confirmation bias (or possible group consensus, which can actually develop in every meeting) is a true eye-opener. It is also a personal conviction to find a way to get the best out of each meeting by creating an open mindset and safety within the group. It cannot and may not be that precious knowledge, enriching experiences or clear views get lost in the melee of the group’s dynamic.

So, creating space for each individual in the battle of the group process is crucial. It is a challenging task. More than that, it is a renaissance for the individual.

Cambridge Online Dictionary.

Dent, J. M. (1910) Thucydides: The Peloponnesian War. London: J. M. Dent; New York: E. P. Dutton.

Dickens, C. (1850) The Personal History, Adventures, Experience and Observation of David Copperfield the Younger of Blunderstone Rookery. London: Bradbury & Evans.

Gardner, D. (2009) Risk: The Science and Politics of Fear. London: Virgin Books.

Haselton, M., Nettle, D. & Andrews, P. (2005) The evolution of cognitive bias. In Buss, D. M. (ed.) The Handbook of Evolutionary Psychology. Hoboken, NJ: John Wiley & Sons.

Kahneman, D. & Tversky, A. (1974) Judgment under Uncertainty: Heuristics and Biases. Science, 185, pp. 1124–1131.

Sunstein, C. (2005) Laws of Fear: Beyond the Precautionary Principle. Cambridge: Cambridge University Press.

Wikipedia, Uriah Heep.

Wikipedia. Cognitive bias.

Zweig, J. (2009) How to Ignore the Yes-Man in Your Head. Wall Street Journal. New York: Dow Jones & Company.

Picture: Kruf, L. (2005) Twist. Breda: private collection.