Adopting a healthy scepticism of thinking fast with data
I’ve previously, briefly, reviewed the classic text “Thinking Fast & Slow” by Daniel Kahneman. This month, we are focussing on the mindsets (or sub-personalities) that analysts need to adopt to succeed in their work.
Guest blogger Tony Boobier has blended both those themes in this guest blog post. It is both a longer book review of “Thinking Fast & Slow” and a reflection on mindsets. From his experience as a mentor, author & consultant, Tony shares how this perspective helps us identify problematic mindsets.
If you thought it was as simple as persuading your executives to invest in being data-led, prepare to think more about hearts & minds. Building on those 10 rules in Tim Harford’s book, here are the biases that data leaders need to help their business avoid. Over to Tony…
Data democratisation in the pandemic?
Haven’t we all become data experts during the pandemic? After all, almost every night we have been peppered with data on the number of people in hospital, the number of confirmed cases and of course the now-infamous ‘R’ rate, amongst others. Breakdowns by country, area, gender, demography; we are reliably informed that the eventual solution to the pandemic will be ‘data driven’.
Those with greater experience of data and analytics can only hope that important judgements about the pandemic are being made using data with a greater degree of granularity than that shared publicly. The danger with data, and therefore with the analysis of data, is that it often allows us to jump to conclusions.
Sometimes we even use data and analytics to ratify and justify intuitive decisions that we have already made. If the data and analysis support these personal viewpoints, then surely everything stacks up, doesn’t it?
Thinking Fast & Thinking Slow
In his book ‘Thinking Fast and Slow’, author Daniel Kahneman, the Nobel Prize-winning psychologist famous for his work on judgement and decision-making, throws some serious spanners in the works.
He writes that data (which might in itself be incomplete) assumes a greater degree of credibility when supported by what we might consider to be simple explanations, especially as we review the past. In other words, we run the risk of letting our interpretation of information be distorted by how the past is viewed from a subjective point of view, rather than by what actually happened. He describes this concept as ‘hindsight bias’.
If his approach is right, then maybe our assessment and understanding of past performance of organisations, which is the core of so-called ‘performance management’, might not be all that it seems. Perhaps we should be equally wary of predicting the future.
The illusion of predictability
Kahneman suggests that any confidence that we have about understanding the past also encourages us to believe that we can understand the future as well. He describes this as no more than ‘an illusion’ especially as luck (which itself is unpredictable) plays such a major part in the likely future outcomes.
So what is it that makes us so confident in our ability to predict what might happen going forward? He describes us as no better than ‘willing participants’ in this ‘illusion’. If we think that we are able to predict the future, Kahneman suggests, then it helps us reduce the inevitable anxiety which would occur if “we allowed ourselves to fully acknowledge the uncertainties of existence“.
It’s not all negative, and in his defence, he does concede that algorithms can provide more reliable predictions than human judgement for what he describes as ‘low impact’ predictions. The test, I suppose, is where we place the boundary between low and high impact predictions.
Use pre-mortems and post-mortems
It’s a book full of interesting insights, not least when he introduces the concept of the pre-mortem, as opposed to the post-mortem. A pre-mortem relates principally to decisions based on intuition, as opposed to algorithmically based decisions. It’s probably not unreasonable for us to stretch the concept into an analytical context.
The pre-mortem concept invites decision-makers to consider this relatively straightforward question: “Imagine that we are a year into the future. We implemented the plan as it now exists. The outcome was a disaster. Please take 5 to 10 minutes to write a brief history of that disaster.”
The pre-mortem concept principally does two things:
- It overcomes the phenomenon of ‘groupthink’ that affects groups once they think a decision has been made (perhaps if it is based on data and analytics, which offers a degree of certainty);
- It unleashes the imagination of knowledgeable people, who can consider other key aspects which might affect the outcome.
The perils of overconfidence
Drawing on his research, Kahneman warns that organisations that rely on overconfident experts can ‘expect costly consequences’. Those who are overconfident and optimistic by nature tend to take the greatest risks.
Despite this, the mantra of current times seems to be one of ‘having confidence in the data and analytics’, as if both provide some form of infallibility. Compounding this, confidence remains a valuable (and valued) commodity even as it creates a “collective blindness to risk and uncertainty“. According to PwC (and there are surely others singing from the same hymn sheet), “Organisations which feel confident about the underlying data used by their systems are able to rely upon it to make decisions, highlight opportunities and identify and manage risks.”
Excessive confidence is also an issue. Kahneman notes that the problem of overconfidence is especially ‘endemic’ in medicine. In a particular survey, he discovered that clinicians who were ‘completely confident’ of their diagnosis were found to be wrong 40% of the time. Perhaps data scientists by default also have the characteristic of excessive confidence, especially in the numbers?
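The gap Kahneman highlights between stated confidence and actual accuracy is what forecasters call calibration, and it can be measured. As a minimal sketch (not from the book, and using entirely made-up data), here is how an analyst might compare confidence levels with observed hit rates:

```python
# Illustrative sketch with hypothetical data: comparing stated
# confidence against observed accuracy, in the spirit of Kahneman's
# clinician example. All figures below are invented for illustration.

def calibration(records):
    """Group (stated_confidence, was_correct) pairs by confidence
    level and return the observed accuracy for each level."""
    buckets = {}
    for confidence, correct in records:
        hits, total = buckets.get(confidence, (0, 0))
        buckets[confidence] = (hits + int(correct), total + 1)
    return {c: hits / total for c, (hits, total) in buckets.items()}

# Hypothetical judgements: some made with 'complete confidence' (1.0),
# others made while only 'fairly confident' (0.7).
records = [
    (1.0, True), (1.0, False), (1.0, True), (1.0, False), (1.0, True),
    (0.7, True), (0.7, True), (0.7, False), (0.7, True),
]

print(calibration(records))  # {1.0: 0.6, 0.7: 0.75}
```

A well-calibrated forecaster would show accuracy close to each stated confidence level; in this invented sample, the ‘completely confident’ judgements are right only 60% of the time, echoing the pattern Kahneman describes.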
But leaders & analysts are expected to be confident
On the other side of the coin, a lack of confidence in individuals is still viewed by many organisations as a weakness. We don’t often see industry leaders saying that they don’t know the answer. Those Doubting Thomases who are prepared to reveal their uncertainty or lack of confidence in their decisions are likely to find themselves more quickly replaced, substituted by those who appear more clear-headed and certain, and who can win the trust of their stakeholders.
So what’s the implication for analysts, and what might be the ‘mindset’ issue?
Mindset issue 1
Firstly, we need to consider whether organisations and individuals have somehow been systematically lured into overconfidence through the use of data and analytics. If so, then what is the implication? After all, the message that ‘truth is in data’ is continually reinforced, to the point that we nowadays often express surprise when the phenomenon of a ‘rogue algorithm’ is discovered. Have we somehow become victims of effective marketing by technology companies?
Mindset issue 2
Secondly, do analysts need to be more cautious when looking backwards at historical data, especially when constructing some form of explanation? What has happened in the past may be a matter of fact. But usually, reasons still need to be found, especially for poor performance, where a scapegoat is often sought. When we look at historical data, is there a risk of viewing it through a distorted lens to make the data fit the narrative? ‘It’s now obvious from the data, looking backwards,’ they might say, ‘that this would have been the inevitable outcome‘.
Mindset issue 3
Finally, if we are able to explain away the past (sometimes in a distorted way, through ‘hindsight bias’), should analysts be even more cautious in their confidence to predict outcomes and implications for forward-looking decisions? Might misplaced confidence only translate into a greater likelihood of wrongly anticipating the future?
At the end of the day, maybe the mindset issue is that of being able to recognise the potential fallibility of data and analytics. Don’t analysts and data scientists need to retain healthy scepticism both in terms of reviewing data of the past and also their ability to accurately forecast the future?
Humility is key as an analyst mindset
Thanks for that advice, Tony. I am reminded of the attitude of humility encouraged by David Spiegelhalter in his book, and the curiosity that was the golden rule from Tim Harford. Both support the kind of healthy scepticism recommended by Tony above.
How do you balance the tension between retaining scepticism or doubt, whilst also communicating confidence to senior stakeholders? If you have wrestled with this challenge or have your own thoughts on the mindset that is needed by data leaders, I’d love to hear from you. Please share your views in the comments box below or on social media.