Could you be excited by automated pricing models for free?
How long does it take to build your insurance pricing models? At what cost, especially if this work is completed by highly paid actuaries?
In this post we take a break from our recent events & leadership focus and get ‘down & dirty’ with technical detail. In this case, it’s the challenge insurance businesses face in building and refining their pricing models on an ever more competitive, data-led battleground.
I’m not normally a fan of automated model building, given the importance of domain knowledge in guiding variable selection and the like. Done well, however, it can save time & money, as long as you can still be assured of model quality. There are also times when the speed trade-off is worth it, and markets where being able to rebuild models more often is itself an advantage.
Many years ago, when KXEN first launched on the market, my statisticians were sceptical of the model quality and of a mathematical optimisation technique that was too ‘black box’. In our trials it proved true that models built using their proprietary technique did degrade more quickly than those built using our existing process. However, they could also be built in half the time. So there can be a trade-off worth making (although at that time we found the same benefit by using the more visual Portrait Miner software).
Two things have persuaded me to share a new breakthrough with you. Firstly, it’s been made by a friend of mine, Tony Ward, who proved himself a very competent statistician & programmer when he worked for me. Secondly, he has taken a more open approach to automated/algorithmic model building by using ‘R’, which is both increasingly favoured by analytics & insight teams and makes the methodology transparent to any R user.
In the spirit of today’s content marketing, mobile/social & collaborative businesses, Tony is also giving away the R code to achieve this model building (along with a white paper on his research, method and the robust testing used). That really is a free offer that I can’t help but share with our statistical and analytics leaders. So, here is Tony’s site for access to that paper & code (just sign up at the bottom of the webpage, with no commitment or cost):
Algorithmic Pricing – Statcore
Algorithmic Pricing is our proprietary new framework for building Pricing Models using Machine Learning. At the core of this framework is Penalised Regression, which we supplement with 3 other learning algorithms to boost performance. To download the 2015 GIRO Paper and 2016 GI Pricing Seminar Presentation on Penalised Regression, please sign up at the bottom of this page.
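For readers who’d like a feel for what penalised regression looks like in R before signing up for Tony’s paper and code, here is a minimal sketch using the well-known glmnet package. To be clear, this is my own illustration under assumed names, not Tony’s framework: the `policies` data frame, its column names and the elastic-net settings are all hypothetical.

```r
# Minimal illustrative sketch of penalised (elastic-net) Poisson regression
# for claim frequency. The data frame `policies` and all column names here
# are hypothetical, purely for illustration.
library(glmnet)

# Expand rating factors into a numeric design matrix (model.matrix turns
# factors into dummy variables, as a GLM build would; drop the intercept
# column since glmnet adds its own).
x <- model.matrix(~ driver_age + vehicle_group + region + ncd_years,
                  data = policies)[, -1]
y <- policies$n_claims

# Cross-validated elastic-net Poisson fit. The penalty strength (lambda)
# is chosen by cross-validation, which is what automates the manual
# variable-selection step of a traditional GLM process. Exposure enters
# as the usual log offset.
pen_fit <- cv.glmnet(x, y,
                     family = "poisson",
                     offset = log(policies$exposure),
                     alpha  = 0.5)  # 0 = ridge, 1 = lasso, 0.5 = a blend

# Predicted claim frequencies at the lambda with minimum CV error.
freq <- predict(pen_fit, newx = x,
                newoffset = log(policies$exposure),
                s = "lambda.min", type = "response")
```

The appeal is that the penalty shrinks or drops weak rating factors automatically, so the statistician’s time shifts from stepwise variable selection to framing the problem and the data.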
As Tony shares in the published results, this algorithmic method outperformed the traditional actuarial method of building GLM models in 9 out of 10 cases. Although the average improvement in R² of 0.04% might seem small (albeit statistically significant), the projected business benefits are not. Tony & his team estimate that a move to this method could improve an insurance company’s loss ratio by 1%, as well as increasing average premiums and contribution. That is a benefit well worth exploring.
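If you want to sanity-check that kind of claim on your own book, a simple holdout comparison is easy to set up. Continuing the hypothetical `policies`, `x` and `y` objects from the sketch above (and again, this is only my illustration; Tony’s paper describes the robust testing he actually used), something like this compares out-of-sample R² for a hand-built GLM against the penalised fit:

```r
# Illustrative holdout comparison, not Tony's actual test design.
# Continues the hypothetical objects from the sketch above.
set.seed(42)
train <- sample(nrow(policies), floor(0.7 * nrow(policies)))

# Traditional GLM with hand-picked rating factors; offset() goes in the
# formula so that predict() honours exposure on new data.
glm_fit <- glm(n_claims ~ driver_age + vehicle_group + region +
                 offset(log(exposure)),
               family = poisson(), data = policies[train, ])

# Penalised fit on the same training rows.
pen_fit <- cv.glmnet(x[train, ], y[train],
                     family = "poisson",
                     offset = log(policies$exposure[train]),
                     alpha  = 0.5)

# Holdout predictions from both models.
hold <- policies[-train, ]
glm_pred <- predict(glm_fit, newdata = hold, type = "response")
pen_pred <- predict(pen_fit, newx = x[-train, ],
                    newoffset = log(hold$exposure),
                    s = "lambda.min", type = "response")

# Simple holdout R^2; even a small edge here can be worth real money
# at portfolio scale.
r2 <- function(actual, pred) {
  1 - sum((actual - pred)^2) / sum((actual - mean(actual))^2)
}
r2(hold$n_claims, glm_pred)
r2(hold$n_claims, pen_pred)
```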
I hope you found this dive into a technical opportunity an interesting change. The wider data science community often shares useful R code snippets; it will be interesting to see how this practice manifests itself within commercial businesses & their customer insight teams.
How are you building pricing models? Any tips to share on automated model building?
Loss ratio improvements are obviously important. Perhaps of more importance is the improvement you’d see in COR (combined operating ratio) from a drop in the huge sums paid to actuaries!
Very good, Andy, although I fear I’m encouraging an anti-actuary sentiment here. Anyone want to speak up for actuarial models or salaries?
Hi, please can I download the white paper and starter code?
Go ahead and follow the link, scroll to the bottom of the page & sign up there. You’ll then be sent a link & passcode.
Hi Paul,
Firstly, thanks for sharing this. I only just noticed the comment box, hence my late reply…
I think the role of the actuary is here to stay. I’m a statistician, so I have no vested interest in keeping those guys around! For me, there are two things a pricing actuary does: (1) lifetime value models, and (2) statistical modelling of the assumptions that feed into those lifetime value models. Actuaries will always have the upper hand with point 1, but you don’t need to be an actuary for point 2.
Regardless of who builds the models, having automated modelling techniques does not remove the need for “someone” to do the work. Most of the work is around understanding the problem, collecting the right data, creating novel features, communicating the results and implementing the model.
However, I do think that those actuaries who decide to skill up in areas such as machine learning and distributed computing/cloud technologies will prosper. This might be a bit painful for those who have spent the last 3-7 years studying already!
Thanks for that perspective & I agree, Tony. Although I’m not sure Actuarial students would relish studying Data Science as well.