A common mistake analysts make: failing to recognise your own bias
Could you be failing to recognise your own bias in your data & analytics work? A few weeks ago, I challenged our guest bloggers to write on common mistakes made by analysts. Tony Boobier has chosen to write on the topic of bias and the need for us all to consider our own biases.
Regular readers will know that Tony is an Analytics & AI mentor, consultant and author. He has written for us previously on leadership topics including Shakespeare & Silent Dinners. I’ve also reviewed his books on AI & the future of work, plus AI & the future of banking.
In his latest post, Tony reflects on behavioural biases and their relevance to analysts and data scientists. How might we, as analysts, be blind to the common mistake of missing our own biases?
Common Mistakes that Analysts Make: Bias
It almost sounds like a terrible thing to say, but the reality is that we are all biased in one way or another. Bias can be either conscious or unconscious. It can be positive or negative. Either way, it often eats away at our decision making. Bias starts to develop at a young age. It is usually a tendency to hold a viewpoint for or against something or someone, often based on stereotypes. In many cases we don't really understand or know the underlying cause.
It’s an area which fascinates behavioural experts and has led to a whole new branch of science. One type of bias is ‘cognitive bias’, a sort of repeated mental shortcut which ultimately leads us to jump to conclusions. It’s suggested that cognitive bias might be due to the limits of our brain’s ability to process information, but it could also be influenced by emotional and social triggers, including peer pressure.
Other types of bias include:
- ‘Anchoring bias’ – where we jump to a conclusion based on the first information we see.
- ‘Confirmation bias’ – where we look for information to support what we already think.
- ‘Hindsight bias’ – where we look at results in the light of what we already know about an outcome.
How our biases can lead the analytics witness
In the context of analysis and the mistakes that might be made, perhaps the answers we find are in part a consequence of our bias rather than an impartial picture. In other words, we consciously or subconsciously seek validation in data and analytics to prove what we already believe, or what we want or expect to discover.
A biased outcome may to some degree be accidental. But it can also arise because the data itself is biased by the way it was collected. Equally, we might discount outliers in the data because they don’t fit with what we expect to happen, or with the model we have chosen to use. Yet the deepest insights could actually rest in those outliers.
It’s a problem which perhaps particularly affects those who have worked for a long time in a given sector or function. Over time, they have absorbed what might euphemistically be described as ‘accepted wisdom’, so the answers that they find in the data reinforce that viewpoint rather than contradict it.
Are data scientists immune?
Pure data scientists might be less likely to make that mistake. They approach data in an agnostic way, without any preconception as to the outcome. But couldn’t that approach also be flawed? They may want to investigate issues but look in the wrong place for information, or use the wrong data set. In doing so, they run the risk of reaching incorrect conclusions. After all, the fact that two particular metrics correlate may be coincidental, not causal. There may be other untapped data forces which need to be considered.
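To make the correlation point concrete, here is a minimal sketch (not from Tony's post; the 'temperature', 'ice cream sales' and 'drownings' series are invented for illustration) in which two metrics correlate strongly only because a hidden third factor drives them both:

```python
import random

random.seed(42)

# A hidden confounder (daily temperature) drives both metrics;
# neither metric causes the other.
temperature = [random.gauss(20, 5) for _ in range(500)]
ice_cream_sales = [t * 3 + random.gauss(0, 2) for t in temperature]
drownings = [t * 0.5 + random.gauss(0, 1) for t in temperature]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# A strong correlation appears, yet ice cream does not cause drowning.
print(f"correlation: {pearson(ice_cream_sales, drownings):.2f}")
```

An analyst who stops at the correlation, rather than asking what might drive both metrics, reaches exactly the kind of incorrect conclusion described above.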
It’s tempting to say that these are just human failings. In a future world of advanced analytics and AI, won’t the likelihood of bias be reduced or even removed entirely? Will we be better off with impartial, insensitive and depersonalised algorithms created by the computer itself, without human intervention? It may not be that simple. Flawed machine learning can result from biased data sets or from the wrong algorithmic model. Some might even argue that all models are wrong, in that they are only statistical approximations. Whatever the cause, these mistakes ultimately lead to unsatisfactory and often unfair results.
The zeitgeist of this data-driven age is that we are trained to believe the truth is in the data, and that effective analysis inevitably leads us to correct data-driven outcomes. But maybe it’s not such an easy thing. In the same way that experts study bias, there are also studies into the best ways of preventing it.
How can we avoid bias in our analysis?
For example, data techniques such as random sampling can help ensure a balanced dataset. Healthy scepticism is also a useful tool: use a framework that challenges the outcome of derived insights – in a positive way. At an individual level, self-awareness of personal bias is also important.
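As a rough sketch of the random-sampling idea (the class labels and record counts here are hypothetical), randomly down-sampling each class to the same size is one simple way to balance a dataset before analysis:

```python
import random

random.seed(0)

# Hypothetical imbalanced dataset: 90 'approved' records vs 10 'declined'.
records = [("approved", i) for i in range(90)] + [("declined", i) for i in range(10)]

def balanced_sample(records, per_class):
    """Randomly down-sample each class to at most per_class records."""
    by_class = {}
    for label, row in records:
        by_class.setdefault(label, []).append((label, row))
    sample = []
    for label, rows in by_class.items():
        sample.extend(random.sample(rows, min(per_class, len(rows))))
    random.shuffle(sample)
    return sample

balanced = balanced_sample(records, per_class=10)
labels = [label for label, _ in balanced]
print(labels.count("approved"), labels.count("declined"))  # 10 10
```

Down-sampling discards data, so in practice it is one option among several; the point is simply that deliberate, random selection counters the skew a convenience sample would bake in.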
In automated, and perhaps ultimately autonomous, systems there are also technical approaches to solving this problem. The overriding one is perhaps effective oversight. We tend to think of this oversight as some form of human intervention which reviews the final result. This becomes more problematic where the analysis is particularly complex.
There is also a risk that if we allow machines to make automated decisions on relatively simple matters, we humans may simply lose the ability to apply proper oversight to the more difficult ones. At the end of the day, we need to ask ourselves whether these are really ‘mistakes’. Maybe they are no more than operational and technical issues as we learn to use data in an ever more sophisticated way. Perhaps these are no more than teething problems as we increasingly understand and leverage the information world.
Recognition of the problem of bias, whether through human, non-human or hybrid processes, takes us more than half way towards resolving it. Removing it entirely could prove rather more challenging.
How are you tackling your own biases?
Many thanks to Tony for raising what can be an important pitfall for analysts. Whether we are thinking about discrimination against certain groups of people or simply having a clearer view of reality, bias matters.
How are you tackling your own biases? Do you have any tips or tricks to share with others? Are you focussing more on data, analytical processes, interpretation or communication? All can matter. I’d love to hear more about your experience and any lessons learnt.
Whenever we seek to give up bad habits, awareness is the first step. So, let’s all share more on this real-world challenge as we all seek to make fewer of those common mistakes.