Have you ever wondered why companies keep investing in a part of business or a project that has not achieved promising results, rather than diverting the investment or abandoning the project? Or why so often we are more ready to receive information that confirms our preconceptions, rather than information that challenges us? Or why we tend to overvalue our assets, regardless of their objective market value? These are just a few examples of how our decision making is regularly impacted by biases.
The ‘normative approach’ in decision science assumes that people are fully rational, or at least can be induced to behave rationally when making decisions. However, as we have become increasingly aware, the actual behaviour of decision makers is affected by more cognitive biases, emotions, ideologies and social pressures than we care to admit. Notwithstanding the growing use of Artificial Intelligence and automated processing, humans are usually the ultimate decision makers, and our decisions touch every part of business, policy making and the consumption of goods and services. Indeed, even AI and automated processing are not immune from bias, for it is humans who decide how systems are designed and algorithms formulated. So the opportunity for biases to infiltrate decision making is, by any measure, enormous.
It’s obviously easier to remediate biases which are overt, but what about those which are subtle – i.e. embedded within organisational processes and practices? What about those which manifest in our own personal mindsets, processes and practices (so obvious to others, yet much less so to ourselves)? Over the past decade, organisations have increasingly embraced the field of behavioural economics as a way to reduce, if not eliminate, biases in the design of financial products and services, policy outcomes and buying patterns. Given its success, this field has broadened into exploring behavioural insights more generally, for example in relation to strategy development, supply chain management and project management.
Identifying the root causes of decision biases and their mitigation approaches is complex, but there are well-established debiasing methods that can be applied to reduce their impacts, or even remove them. Embracing diversity of thinking, both in a team context and in individual decision making, is among the strategies most widely adopted by high-performing organisations. Deloitte has done extensive work on diversity of thinking and its use in stimulating smarter decision making, led by Juliet Bourke, lead partner in Deloitte Australia’s Diversity and Inclusion Consulting practice. Juliet’s rigorous work on the effective adoption of diversity of thinking provides practical ideas to help teams see a wider range of scenarios and guide their decision making processes to mitigate social, informational and attentional biases effectively1.
But, as Juliet points out, creating a diverse thinking team and applying a structured process to guide conversations is just part of the debiasing process. Finding and fully appreciating diverse views is challenging – we are each prone to overweight our own pre-existing point of view (confirmation bias), assume that our view represents more of the world than it actually does (representation bias), and lean too heavily on an expressed starting point (anchoring bias). And it is even more difficult when these biases have been built into protocols for strategic and operational decision making (for instance, reasoning by analogy or single-outcome calculations). It’s time to review our decision making processes – as well as our personal mindsets and behaviours – to see if we can spot and reduce our blind spots.