Predictive and Prescriptive Analytics
Analyst Coverage: Philip Howard
Predictive analytics refers to the ability to predict events before they occur, allowing you to take remedial action in advance. Prescriptive analytics extends this concept by either suggesting the actions you should take or automating those actions.
Why is it important (hot)?
Historic analysis is just reporting. To be actionable, insight needs to alert you to what is likely to happen. That is the core of analytics: identifying future trends, whether these be threats or opportunities. Even prediction is often not going to be enough: you know that something is likely to happen, but what do you do about it? Prescriptive analytics provides the actions – both the things you must do, and the things you should not do – to go along with the likely outcome.
How does it work?
Predictive analytics goes beyond mere reporting (nowadays called “descriptive analytics”). Building on the same foundation of data management – the capture, cleansing, organising, classifying and presenting of data about what has happened – it applies statistical and mathematical modelling techniques to that data in order to predict outcomes.
Data scientists will lead that modelling, using both supervised training (led by the data scientist) and unsupervised training (machine led, as in machine learning and deep learning). These techniques provide a predicted outcome together with a reliability estimate.
The training is undertaken by providing a large number of examples so that the model can detect a pattern. Further data is then used to check that the model is generally applicable; this test for general applicability provides the statistical level of confidence that can be placed in the model.
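As a sketch, the train-then-validate step described above might look like the following. The deliberately trivial one-parameter threshold “model” and the synthetic temperature-versus-failure data are illustrative assumptions only, standing in for a real statistical technique: the point is that accuracy on held-out data, not training data, provides the confidence estimate.

```python
import random

def train_threshold(samples):
    """'Train' a one-parameter model: pick the temperature threshold
    that best separates failures from non-failures on the training data."""
    best_t, best_acc = None, 0.0
    for t in range(0, 101):
        acc = sum((temp > t) == failed for temp, failed in samples) / len(samples)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

random.seed(1)
# Synthetic history: units running hotter than roughly 70 degrees tend to fail.
data = []
for _ in range(400):
    temp = random.uniform(20, 100)
    data.append((temp, temp > 70 + random.gauss(0, 5)))

train, holdout = data[:300], data[300:]
threshold = train_threshold(train)

# Accuracy on unseen data estimates how generally applicable the model is.
confidence = sum((temp > threshold) == failed
                 for temp, failed in holdout) / len(holdout)
print(f"learned threshold: {threshold}, holdout accuracy: {confidence:.2f}")
```

The holdout score will be imperfect precisely because the training data is noisy – which is exactly the honest reliability estimate a predictive model should report.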
“Prescriptive analytics” then seeks to provide insight into which actions to take to give a better chance of achieving a desirable outcome, or to lessen the impact of an undesirable one. The prescriptive element will typically involve either the embedding of a process into a workflow (where the actions to be taken are automated) or the use of natural language processing to generate, for example, details of the work that an engineer will need to undertake.
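Embedding the prescriptive element into a workflow can be as simple as a rule that maps a predicted probability onto an action. A minimal sketch, in which the probability bands and the recommended actions are purely illustrative assumptions:

```python
def prescribe(failure_probability: float) -> str:
    """Map a predicted failure probability to a maintenance action.
    The thresholds and actions here are illustrative, not recommendations."""
    if failure_probability >= 0.8:
        return "schedule immediate inspection and order replacement parts"
    if failure_probability >= 0.4:
        return "add unit to next planned maintenance window"
    return "no action; continue monitoring"

# A workflow engine would call this for every scored unit.
print(prescribe(0.9))
```

In practice the output of such a rule would feed a work-order system or, via natural language generation, a written brief for the engineer.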
We live in a world of risk. That risk can be in the form of fraud, of mechanical failure, of missed opportunity, and so forth. Better insight into what lies ahead reduces exposure to that risk.
One of the most common use cases for predictive and prescriptive analytics at present is predictive maintenance, a monitoring and fact-based form of preventative maintenance. Predictive analytics can estimate the statistical likelihood of a failure, whilst prescriptive analytics can recommend adjustments to your maintenance programme to reduce the risk of downtime and unit failure. Such anomaly detection is usually pattern based, for instance vibration analysis: the model is trained to recognise the normal picture and the operating window, and can then highlight patterns that fall outside it. Its use has delivered clear benefits: reduced equipment costs, because critical failures are avoided and it is parts, not whole machines, that fail; reduced labour costs, because fewer, more focussed repairs are required; increased safety, since things are fixed before they fail; increased revenue, due to reduced downtime; and increased productivity.
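The “normal picture and operating window” idea can be sketched very simply: learn a band from healthy-operation vibration readings, then flag anything outside it. The readings and the three-standard-deviation window below are illustrative assumptions; real vibration analysis works on frequency spectra rather than single values.

```python
from statistics import mean, stdev

def operating_window(normal_readings, k=3.0):
    """Learn the 'normal picture' from healthy-operation data:
    a band of k standard deviations around the mean."""
    m, s = mean(normal_readings), stdev(normal_readings)
    return m - k * s, m + k * s

def anomalies(readings, window):
    """Highlight readings that fall outside the operating window."""
    lo, hi = window
    return [r for r in readings if not (lo <= r <= hi)]

healthy = [1.0, 1.1, 0.9, 1.05, 0.95, 1.02, 0.98, 1.07, 0.93, 1.0]
window = operating_window(healthy)
live = [1.01, 0.97, 2.4, 1.03]   # 2.4 falls outside the learned window
print(anomalies(live, window))
```

The anomalous reading is what would trigger the prescriptive step: a recommended adjustment to the maintenance programme before the part actually fails.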
Amongst other common use cases for predictive/prescriptive analytics are fraud detection, cybersecurity, recommendation engines (next best offer), churn management, and network management.
“Lenovo is just one manufacturer that has used predictive analytics to better understand warranty claims – an initiative that led to a 10 to 15 per cent reduction in warranty costs.”
“The predictive models created in our [data science platform] not only revealed where problems were likely to occur, but also identified the root cause. As a result, within the first two months of testing, total downtime was reduced by more than 20% and device failures and their subsequent costs were also reduced.”
The key is to ensure that the data that is captured is timely, accurate and comprehensive. There is no point in deploying any sort of analytics on poor data: so be clear about the objective, plan to get the right data, plan to ensure it is as accurate as possible, then deploy the tools and reap the benefits.
Beyond the accuracy of your data, a major consideration is the timeliness of your analytics. Organisations increasingly want real-time data, and this is particularly true within the Internet of Things. In the past, analytics was conducted on data at rest, which inevitably introduced latency; now data in motion can be analysed whilst it is streaming. This has given rise to edge analytics, where the data is analysed close to the sensors, giving near real-time results – which can be vital when tracking expensive mechanical assets. And it is no longer just those with machinery to monitor: as costs decrease, real-time monitoring becomes a reality for retailers, financial services, telecommunications and others.
Finally, decide where the analytics need to take place. The alternatives are at the edge, close to the sensors, working on limited data but delivering results as near to real time as possible; at a concentration point, so that a broader picture can be gathered; or at the centre, where there will be latency but the most complete picture is to be had.
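One common way of combining the edge and the centre is to run the cheap check next to the sensor and forward only the exceptions upstream. A minimal sketch, with an assumed operating window of 0.8–1.2 standing in for whatever band the central model has learned:

```python
def edge_filter(readings, lo, hi):
    """Run the cheap check at the edge, next to the sensor: pass through
    only readings outside the operating window, so the centre receives
    a small, relevant stream rather than everything."""
    for r in readings:
        if not (lo <= r <= hi):
            yield r

sensor_stream = [0.98, 1.02, 1.6, 0.99, 0.4, 1.01]
sent_to_centre = list(edge_filter(sensor_stream, 0.8, 1.2))
print(sent_to_centre)  # only the anomalies travel upstream
```

The edge gets near real-time reaction on limited data; the centre, receiving only the exceptions, can still build the broader and more complete picture, albeit with latency.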
The Bottom Line
We are moving to a world where informed decisions need to be made more frequently and more rapidly than ever before. Predictive and prescriptive analytics are cornerstones of a data-driven business, saving cost, maximising revenue and making the organisation adaptable to a changing world.