Managing Confirmation Bias in Stakeholder Engagement

The Challenge of Confirmation Bias

People tend to confirm their beliefs by seeking out information that aligns with how they see the world while disregarding and devaluing information that contradicts their view.
This cognitive phenomenon, called the confirmation bias1, is an information processing error that causes the mind to become narrow and rigid in its awareness, creativity, and foresight. Combined with our inherent tendency to be overconfident in our estimates2 and excessively optimistic about outcomes3, this narrow frame of mind is especially troublesome for strategic decision making, the assessment of risk and, importantly, the awareness of possibilities and opportunities. These limitations can leave an organization vulnerable to volatility in both its internal and external environments, and bound to short-sighted aspirations.

Freeing Your Organization from Confirmation Bias: Implications for Stakeholder Engagement

One approach for organizations – businesses, NGOs, multilateral agencies, and governments – to look beyond their own biases is to take a broad and inclusive approach to stakeholder engagement. Seeking the advice of your stakeholders, especially those that tend to oppose your organization or that have considerable experience within your industry, is fundamental to overcoming the bias. In fact, the more a narrow frame of mind is challenged by disconfirming evidence and alternative ways of approaching a situation, the more valuable the process will be in the end. People within structured organizations, like the mind itself, are remarkably impressionable. The simple act of being exposed to other ways of thinking restructures the way we see the world, the way we calculate risk, and the opportunities we can realize.
So, consult your stakeholders. But be careful not to create an anchor when you solicit their advice. Do not open the discussion by setting the frame in which the organization is already thinking; doing so will limit stakeholders' ability to move beyond the very thought process the discussion is meant to break down. And push yourself (and urge your stakeholders) to generate as many alternative solutions to the situation as possible. With every option on the table at the end of the consultation process, including ideas that both confirm and challenge the organization's pre-existing views, you will be best positioned to identify the most fruitful route.
Finally, make sure that the evaluation of options (and the final decision on the strategy going forward) happens without excessive time pressure or stress, and that it is carried out before the evening hours and not immediately after lunch. These factors can activate the mechanisms that encourage cognitive rigidity and drive the confirmation bias, including the devaluation and disregard of disconfirming evidence, and can leave the mind inflexible toward new and innovative ideas. The result would be a superficial evaluation process bound to the very biases it was meant to avoid.
In his seminal book “Thinking, Fast and Slow,” Daniel Kahneman argues that exposure to disconfirming evidence, evidence that fundamentally contradicts an individual’s mental model of the world, is one of the most promising ways to change behavior4. At GlobeScan, we believe that exposure to alternative (and perhaps disconfirming) perspectives from stakeholders is one of the most promising ways for an organization to think outside the box, adapt in a volatile world, and aspire to bigger and better goals.


References
1 Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175-220.
2 West, R. F., & Stanovich, K. E. (1997). The domain specificity and generality of overconfidence: Individual differences in performance estimation bias. Psychonomic Bulletin & Review, 4(3), 387-392.
3 Klein, C. T., & Helweg-Larsen, M. (2002). Perceived control and the optimistic bias: A meta-analytic review. Psychology and Health, 17(4), 437-446.
4 Kahneman, D. (2011). Thinking, fast and slow. Macmillan.
Image credit: Clay Bennett

This post was written by former GlobeScan Senior Research Analyst, Dr. Melaina Vinski.