3 all-too-common mistakes analysts make when delivering their work
Finalising our series focussing on the mistakes analysts make, let’s now move beyond the technical stage to the mistakes made when delivering work. This post builds on the previous list of mistakes 4 to 6, made during the technical stage.
These are shared in the hope of us all raising our game & staying aware of these risks. They are not intended as a criticism of hard-working analysts, who have a challenging enough role already. So, please hear them in the spirit intended: shared by someone who has made all of them himself.
I’ve thought about what goes wrong for analysts within the framework of my own Softer Skills model. That has prompted me to identify 3 mistakes made in the contracting stage, 3 during technical work & now 3 when delivering. So, without further ado, let me dish the dirt on the third set of 3 things I’ve seen go wrong, even when delivering to known stakeholders…
3 more mistakes that analysts make during the delivery stage
Let’s assume that robust contracting has already been completed. Analysts have an accurate understanding of the business need, have planned the time they need & secured buy-in. Following that, high-quality technical work has been completed, including identifying relevant insights & recommendations. What else could go wrong? Well, the next stage is often to communicate your findings back to business stakeholders. Without careful thought, all the good work completed so far can be undone, resulting in negative feedback or simply no action being taken on your analysis. So, let me unpick how failing to think about each stage of delivering your analysis can result in failure.
When I train analysts in the softer skills they need for success, I stress how much there is still to do once technical work is completed. At this stage, mistakes can be made which undermine the quality of what is delivered. There is so much that can go wrong, but let me just highlight one for each of the three sub-stages. These are mistakes I have made myself and seen all too often in practice.
(7) Publish and be damned
After generating some compelling insights, it’s easy to rush into communicating them to others ASAP. But wait! That can be a mistake if you don’t take time to reflect on the consequences of your findings. Given you have been working in the previous stage to identify clear actions to be taken, your analysis should drive change. But in any organisation, there are stakeholders who benefit from the status quo and so will feel threatened by such change. Without planning how to take your insights & recommendations back to the organisation, you may run into defensive resistance. You need to recognise & navigate office politics.
The clearest example of this that I still remember was sitting in a meeting listening to an analyst present their findings & recommendation. As we listened it dawned on all of us that this analysis basically showed that one of the marketing teams was wasting money & we would be better off cutting all their budget. The leader of that team was sitting in the audience & got visibly angry. Clearly, this was news to him & unsurprisingly he got aggressively defensive. He proceeded to verbally poke holes in the analysis & tear the analyst apart. Painful to watch.
Tactic to avoid: Stop taking the ‘publish & be damned’ mindset (from the world of journalism). First, be alert to & learn about the politics & power bases in your organisation. Know your stakeholders & their priorities (as covered in mistake 3). Then, after completing technical work, take time to think about the implications of your work before planning how best to take your findings out to stakeholders. Do you need private meetings? Is it better to consult with some stakeholders first, to get their input to help shape recommendations or wording?
(8) Cluttered graphs and ‘death by PowerPoint’
If you have (as advised in my post on the technical stage) completed robust analysis & generated insight, you will have plenty you could share. There will be a temptation both to share all that you consider interesting & to demonstrate the amount of work you have done. Please resist this temptation: it leads to information overload & makes it less likely your audience will be interested or act on your recommendations. Less is more. Think of the times you have been bored to death in meetings with far more slides than the point or audience needed.
Another related problem at this stage is giving too little consideration to the principles of data visualisation. Often this is caused by a lack of time. Akin to the way that testing is too often cut back in late-running IT projects, producing effective graphs too often looks like an afterthought. If analysts are short of time, they can fall back on the Excel chart wizard (or equivalent) and pump out poor-quality graphs. This ignores the reality that humans’ visual perception is strong (so use higher data density in fewer graphs, or small multiples for scanning) but our visual memory is poor. They end up with lots of slides of boring graphs that fail to engage & persuade.
Tactic to avoid: Plan for this stage from the start. Have a repeatable workflow that considers from the beginning the best way to communicate findings. Allow time after analysis to prepare your data for visualisation. This enables you to get data into a suitable format for a well-chosen, engaging graph. Have as few slides as possible (but be ready to answer more questions or share more detail if asked). Design graphs with high data density, declutter them & use colour sparingly to draw the eye to insights. Use annotation on graphs to add explanation in context. Avoid separate legends & don’t use radial charts. Beyond that, get trained in data visualisation; it can revolutionise your output.
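To make that concrete, here is a minimal sketch of a few of those tactics using Python’s matplotlib (the data & numbers are purely my own illustrative assumptions, not from any real analysis): one accent colour, direct labels instead of a separate legend, decluttered axes & an annotation that explains the insight in context.

```python
# A minimal sketch of decluttering tactics: no legend, direct labels,
# one highlight colour. All figures below are hypothetical.
import matplotlib.pyplot as plt

years = [2019, 2020, 2021, 2022, 2023]
channels = {
    "Branch": [78, 76, 75, 73, 72],
    "Online": [64, 68, 71, 75, 79],
    "Phone": [70, 69, 69, 68, 68],
}

fig, ax = plt.subplots(figsize=(7, 4))
for name, values in channels.items():
    # Grey for context lines; one accent colour draws the eye to the insight
    colour = "#d62728" if name == "Online" else "#b0b0b0"
    ax.plot(years, values, color=colour, linewidth=2)
    # A direct label at the line's end replaces a separate legend
    ax.text(years[-1] + 0.1, values[-1], name, color=colour, va="center")

# Declutter: remove the box, keep only the axes the reader needs
for spine in ["top", "right"]:
    ax.spines[spine].set_visible(False)
ax.set_ylabel("Customer retention (%)")

# Annotation adds the explanation in context, on the graph itself
ax.annotate("Online overtook Branch in 2022", xy=(2022, 75),
            xytext=(2019.5, 80), arrowprops=dict(arrowstyle="->"))
plt.tight_layout()
plt.show()
```

The point is not the tool but the habits: every element on the graph should either carry data or explain it.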
(9) Walking away after presenting analysis (no feedback loop)
During their analysis, most analysts with real-world messy data will need to make assumptions. Hypotheses will be generated and, when interpreting results, forecasts of impact will need to be estimated. Much of this relies on domain knowledge or experience of the uplift or improvement similar actions have caused in the past. But too often analysts are in a rush to move on to the next piece of work, so they fail to learn what really happens as a result of their analysis.
This is understandable given workloads, but it undermines analysts’ learning & the scientific method. Anyone with a science background will recognise the importance of the feedback loop. Without discovering what happens as a result of their analysis & the impact caused when recommended actions are taken, analysts cannot improve. They lack the learning from such feedback to improve future analysis & recommendations. A classic example of this in the world of insurance is the claims made for improvements to customer retention. Each analyst made recommendations which, if acted upon, promised an improvement. Yet the sum of all these separately promised uplifts added up to an impossibly high % improvement in customer retention.
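To illustrate the arithmetic (with purely hypothetical numbers of my own), here is a quick sketch of why separately promised uplifts cannot simply be added together:

```python
# Why separately promised uplifts cannot simply be summed.
# All numbers are hypothetical, purely for illustration.

baseline_retention = 0.80                     # 80% of customers retained today
promised_uplifts = [0.06, 0.05, 0.04, 0.05]   # four teams' separate claims

# Naive view: add every promised uplift to the baseline
naive_total = baseline_retention + sum(promised_uplifts)
print(f"Naive total: {naive_total:.0%}")      # 100% -- implausible

# A more defensible view: each action can only win back part of the
# remaining churn, so apply each uplift to what is left to improve
retention = baseline_retention
for uplift in promised_uplifts:
    retention += (1 - retention) * uplift
print(f"Compounded estimate: {retention:.0%}")  # ~84% -- still optimistic
```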
Tactic to avoid: The trick here is to use the analogy of a relay race. In that sport, runners don’t just pass on the baton to a stationary runner. They run together for a while until the new runner has reached optimal pace & then hand over the baton. Likewise, analysts need to stay curious & involved in the work to be done to act on their analysis. Keep talking to stakeholders & stick your nose into the team or project working to implement the change. Offer to help & explain that you need the feedback. Measure the result of the recommended action being taken. Then learn from that result. Adjust your thinking, assumptions, estimates. It will evidence your commerciality as well as improve your scientific rigour.
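As a minimal sketch of what that feedback loop can look like in practice (again with hypothetical numbers), compare the uplift you forecast with the uplift actually measured & keep the ratio to calibrate future estimates:

```python
# Closing the feedback loop: forecast vs measured uplift.
# All numbers are hypothetical, purely for illustration.

forecast_uplift = 0.05       # 5 percentage points promised
retention_before = 0.80      # measured before the action was taken
retention_after = 0.82       # measured once the change had bedded in

measured_uplift = retention_after - retention_before
calibration = measured_uplift / forecast_uplift
print(f"Forecast {forecast_uplift:.1%}, measured {measured_uplift:.1%}")
print(f"Calibration factor for future estimates: {calibration:.2f}")  # 0.40
```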
Which tactic could help you avoid mistakes when delivering?
I hope that series was helpful. Thank you to those who have contacted me personally to tell me about tactics that have helped them.
If you’ve not caught the whole series, I encourage you to also read the mistakes made at the contracting stage or during the technical work.
To finish though, I’d like to hear from you. Which mistake have you experienced most? Do you relate more to the 5 mistakes shared by analytics leader Andy Sutton? What other mistakes have you seen? Please let us know so we can all learn from our mistakes & keep improving. Plus, I hope you’re encouraged that it’s not just you!