How to avoid stakeholder disappointment when you deliver your analysis
Have you experienced senior stakeholder disappointment, or at least lack of enthusiasm, when you present your analysis?
I know it’s an all too common experience for analysts, data scientists and their leaders. Why is that? What has gone wrong?
To help us explore that, I’m delighted to welcome back guest blogger Harry Powell. Harry is the Director of Data & Analytics for Jaguar Land Rover. He has shared with us before, both on solving the Productivity Puzzle and what he learnt from an Alan Turing Society lecture.
So, over to Harry to share his ever-relevant experience and lessons learnt on how to avoid disappointment…
Do you recognise this scenario?
You and your analytics team have been working on an important project for your CEO. You’ve taken your time, but you’ve come up with something amazing, really insightful. You are confident that she will be impressed and that it will have a massive impact.
But when you present it, she seems unimpressed, underwhelmed, as if you have shown her something she could have thought of herself. She couldn’t have, by the way.
It is not that the scope has changed. You have given her what she asked for and something she still needs. She just looks disappointed, as if she was expecting so much more. As if your model could be so much better.
But it didn’t start out like that. She was engaged and enthusiastic at the outset.
Why disappointment? What went wrong?
It’s happened to most analytics teams. You have scoped the work, built the model, refined and tested it, and added a load of functionality. Every week the quality of the results improved, and now it’s as good as it can be; there is no more to go for.
The problem is that your stakeholder’s expectations tend to be linear in the time a project has been running. The way she thinks is that if you work another 40 hours, the results should improve by about as much as they did in the previous 40 hours.
But in fact, the value of an analytics project tends to exhibit diminishing marginal returns; as time goes on it gets harder to find additional value.
Disappointment from diminishing returns
Let’s imagine that value is measured by the accuracy of your predictive model. And let’s assume that the informational content of your data means the model is limited to 80% accuracy (however measured). And suppose that in the first week of the project you build a model that delivers 40% accuracy. Then, by definition, a similar increase in the second week is impossible (without new information), because it would take you to the theoretical ceiling itself.
So as the weeks pass, you add progressively less value. The result is a curve which starts steeply but which levels off over time. At some point, the straight line of expectations will cross the value curve. And beyond that point, you will disappoint, no matter what you do.
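This dynamic can be sketched numerically. The functional forms below are assumptions for illustration only: the post doesn’t specify a curve, so I’ve used an exponential approach to the 80% ceiling (calibrated to hit 40% after week 1, as in the example) and a linear expectation line with an assumed slope of 20 points per week.

```python
import math

CEILING = 80.0             # accuracy limit from the example above
EXPECTATION_SLOPE = 20.0   # assumed: stakeholder expects +20 points per week

def value(week: float) -> float:
    """Diminishing-returns value curve: 40% after week 1, levelling off
    towards the 80% ceiling. Exponential form is an assumption."""
    return CEILING * (1 - 0.5 ** week)   # week 1 -> 40, week 2 -> 60, ...

def expectation(week: float) -> float:
    """Linear expectations: the stakeholder extrapolates steady progress."""
    return EXPECTATION_SLOPE * week

# Scan weeks 0.01 .. 10 to find (a) where expectations overtake delivered
# value, and (b) where the surplus (value minus expectation) peaks.
weeks = [w / 100 for w in range(1, 1001)]
crossing = next(w for w in weeks if expectation(w) >= value(w))
best_week = max(weeks, key=lambda w: value(w) - expectation(w))

print(f"Expectations overtake value around week {crossing:.2f}")
print(f"Surplus over expectations peaks around week {best_week:.2f}")
```

With these assumed numbers, the surplus peaks well before the lines cross, and beyond the crossing point every extra week widens the disappointment gap, whatever the team does.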
But it gets worse. The value curve must necessarily be bounded by the actual opportunity. In the example above, you will never get more than 80% accuracy in your prediction. But expectations don’t work like that; your manager won’t know the limits of what is possible.
Her line keeps on going up forever. So no matter how hard you work, and how brilliant you are, sooner or later your stakeholder is going to be unimpressed by your work.
How should you avoid such disappointment?
So what should you do? Have a look at my unscientific chart. There is a conflict between the interests of the team and those of the CEO. The CEO gets the most value at the point where the two lines cross; beyond that, she will think you are wasting effort that could be better used elsewhere.
But you could argue that the team should seek to maximise the surplus, the vertical distance between results and expectations. This will be somewhere short of the crossing point.
After all, your team’s reputation is determined by whether you exceed the expectations set for you, not by the absolute results you deliver.
In any case, don’t spend too long perfecting your product. Even if the scope remains unchanged (itself unlikely – go and read any number of Agile textbooks), you are unlikely to reap the rewards. Just as Paul argued in his post praising imperfection.
How do you avoid stakeholder disappointment?
Thanks to Harry for sharing his analytical thinking on this people problem (expectation management). I hope you’ll agree that it’s a useful model to help you think about what you already know intuitively.
What’s your experience? I feel this issue is relevant to most Data & Analytics leaders. So, I’m interested to hear what has worked for you. How have you avoided disappointed stakeholders? How have you managed their expectations & determined when work is good enough?
Please share your wisdom in the comments box below & let’s start a conversation to help us all improve this leadership skill.