5 common mistakes made by analysts and how to avoid them in future
This month, let’s indulge the facet of human nature that finds it easier to spot what is wrong and talk about common mistakes made by analysts.
I will be sharing my own experience and asking our guest bloggers to share theirs. To kick us off, I am delighted to welcome back Andy Sutton, a very experienced analytics and marketing leader from Endeavour Drinks Group in Australia. While we worked together, I had the pleasure of seeing how well Andy developed his teams.
Drawing on that experience, in this post Andy shares his top 5 mistakes made by analysts. Over to Andy to dish the dirt…
Choosing a top 5 mistakes made by analysts
I’ve been an analyst, or worked with analysts as a stakeholder or leader, for around 20 years across the UK and Australia. Over this time, the world has changed hugely, from data to big data and from statistics to machine learning and AI. But some of the challenges of working in the analytics space remain.
Paul asked me if I had any thoughts on the top mistakes made by analysts. I thought I could create a top 10 list, but as I started on it, I realised it was a lot harder than I expected! Analysts actually have a very tough job: they’re expected to be across the depths of data in their organisation, be knowledgeable about a vast array of analytical tools, work with multiple stakeholders from various backgrounds and understand the context of their business challenges.
So a few hints and tips were more appropriate than a laundry list of my pet hates. I’ve personally made all of the mistakes below to various degrees.
Andy’s top 5 mistakes to avoid (in reverse order of importance)
1) Analysts need to be led by analysts
It’s a fallacy that analytics teams need to be led by analysts. I’ve blogged before that data science teams don’t need to be led by data scientists and the point extends to analysts too. The analytics leader needs to be the link between the technical analytics solutions delivered by their teams and the commercial outcomes of the business. The corollary of this is that the analytics leader can be an analyst who talks the language of the business or a business leader who talks the language of analytics – or a hybrid of both. The most important aspect of the role is to be able to listen to both teams and translate between the two.
2) Analysis which confirms existing biases
The role of an analyst is to be a voice of logic and balance rather than to arrive with a pre-conceived answer to an analytics question. The pressure to deliver to a narrative constructed by a business stakeholder can be quite intense. No one really likes to hear that their latest campaign hasn’t worked, their customer segments are poorly defined or their sales are lower than expected. The role of the analyst is to shine a light on the truth and to suggest insights that remediate, rather than to align with a preordained outcome.
3) The more complex the analysis the better it is
It’s fairly trendy these days to solve everything with a machine learning model or even with deep learning approaches – everyone wants to use the latest technique they’ve learnt in their degree or data science training course. But data science models are inherently complex: they take a long time to build, train and productionise, and even longer to explain to business stakeholders.
If a simple piece of exploratory analysis, a two-way analysis or a decision tree can drive the same outcome (or better), then why waste time on something more complex? Data scientists and engineers are expensive resources and should be pointed at critical, complex business problems, which means simpler questions should be answered elsewhere. The ability to pick the right tool for the right job isn’t just for tradesmen!
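To make the “right tool for the job” point concrete, here is a minimal sketch of about the simplest model there is: a one-split decision stump, written in plain Python. The campaign data and the spend/conversion framing are entirely invented for illustration; the point is only that a single well-chosen split can sometimes answer a business question before anyone reaches for a deep learning pipeline.

```python
def best_stump(rows, threshold_candidates):
    """Find the single spend threshold that best separates converters.

    Tries each candidate threshold, majority-votes each side of the split,
    and returns the (threshold, correctly_classified) pair that scores best.
    """
    best = None
    for t in threshold_candidates:
        left = [r for r in rows if r["spend"] <= t]
        right = [r for r in rows if r["spend"] > t]
        if not left or not right:
            continue  # a split that puts everything on one side tells us nothing

        def majority(side):
            # Predict True for a side if at least half its rows converted
            return sum(r["converted"] for r in side) * 2 >= len(side)

        correct = sum(
            1
            for side in (left, right)
            for r in side
            if r["converted"] == majority(side)
        )
        if best is None or correct > best[1]:
            best = (t, correct)
    return best

# Hypothetical campaign rows: marketing spend per customer vs conversion
rows = [
    {"spend": 10, "converted": False},
    {"spend": 15, "converted": False},
    {"spend": 40, "converted": True},
    {"spend": 55, "converted": True},
    {"spend": 60, "converted": True},
]

threshold, correct = best_stump(rows, [10, 15, 40, 55])
print(threshold, correct)  # splitting at spend <= 15 classifies all 5 rows correctly
```

On this toy data the stump finds a clean split at a spend of 15 – a result a stakeholder can grasp in one sentence, which is exactly the kind of simplicity the point above argues for when it is enough to drive the outcome.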
4) It’s all about the analytics
Probably one of the biggest mistakes I used to make as an analyst, and something I still see today. The analytics is always secondary to the relationships you build with stakeholders to understand the business they operate in, the pressures of their roles and what success looks like for them. Those relationships also give you the opportunity to educate the business on the primary use cases for data and the limitations of any analytics produced. They underpin the two preceding points too, as they make it much easier to suggest the right tools for the job and enable you to deliver more challenging insights.
5) So what?
And the biggest mistake of them all is forgetting the “so what” of any analysis produced. Analysis without a “so what” is just a report, not an insight. Insights have to be more than knowledge: they must be actionable by the business. In reality the analytics is just an appendix, with the main point of any analysis being “what do I do next that I wouldn’t have done before?” Even in the world of data science the gap can be wide, with models that are built but never productionised to drive the changes identified.
Which would be your top 5?
So there are my 5 – some of them can be fatal, whereas others are minor day-to-day irritants. What are yours?
Many thanks to Andy. For myself, I recognise all the pitfalls that Andy identified above and guess quite a few would make it onto my top 5 too. However, when it comes to my post, I’ll choose some of the others to give them airtime.
What about you, dear reader? Any comments (or war stories) about the mistakes Andy identifies above? If you feel passionately about this topic, please send me your personal top 5 common mistakes made by analysts. I might publish them too. It certainly helps for us all to have the humility to share some of the mistakes that we have learnt from. So, thanks again to Andy for his candour. I hope it encourages you and your team.