Actionable insight is a term that conflates two distinct trends within IT. The first we might call 'insight automation': the ability to analyse (large quantities of) historic data in order to infer the future behaviour of both people and things, and then to automate the business processes that leverage those inferences (the insight), replacing what were historically manual activities. The second trend is towards 'self-service insight visualisation', which provides the same analytic capability in environments where it is not practical to automate the decision process: here the human input is enhanced by high quality visualisation of data that users can derive for themselves (self-service), without reliance on IT.
To give a very simple example of insight automation, consider a speed limit sign. The problem with speed limits is that they are arbitrary: the limit is the same whether it is foggy or sunny, regardless of the time of day, whether the road is wet or dry, and however heavy the traffic. Now imagine a smart speed limit sign with built-in sensors that knew all of these things and was computer controlled, with algorithms that adjusted the limit to take account of all those factors. Wouldn't that be more sensible? Of course, you could also build in number plate recognition, speed cameras and an Internet connection, so that real-time traffic information could be collated centrally for further analysis and then passed on to satellite navigation systems as updated logistic information to further improve traffic flow. On the other side of the coin, telemetry-based insurance is already in use to help improve driver performance.
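To make the idea concrete, the smart speed limit sign could be sketched as a simple rule-based function. This is purely illustrative: the factors, thresholds and base limit below are hypothetical, not taken from any real traffic system.

```python
# Hypothetical sketch of a "smart" speed limit: the posted limit is
# reduced as conditions worsen. All thresholds are illustrative.

def smart_speed_limit(base_limit_mph, foggy, wet_road, traffic_density):
    """Adjust the posted limit for visibility, road surface and traffic."""
    limit = base_limit_mph
    if foggy:
        limit -= 20            # poor visibility: large reduction
    if wet_road:
        limit -= 10            # longer stopping distances
    if traffic_density > 0.8:  # 0.0 (empty road) .. 1.0 (congested)
        limit -= 10
    return max(limit, 20)      # never drop below a minimum floor

print(smart_speed_limit(70, foggy=False, wet_road=True, traffic_density=0.5))  # 60
```

A real deployment would of course derive those adjustments from analysis of historical accident and traffic data rather than hard-coding them, which is precisely the point made next.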
It is worth considering how you get to the point where insight can be automated. In the case of the smart speed limit, the algorithms react in real time to changing conditions, but those algorithms are derived from the analysis of historical information. For that analysis you need enough relevant data, and specifically you are looking for repeatable patterns of behaviour or activity. In the case of road traffic it is pretty obvious that wet roads and foggy conditions are more hazardous, and that speed limits should therefore be lower; in many other areas, however, such correlations are not obvious, and the relevant data may need to be investigated in depth. This is initially an activity carried out by what we now call a data scientist. Once a pattern has been identified you can develop algorithms that are automatically actioned when that pattern is recognised, or when an exception to the pattern is identified. However, there will also be instances where the process cannot be automated: where a decision has to be made by one or more people. For example, in brand management you may be monitoring social media for favourable or adverse comments about your brand, but deciding what to do about changes in that commentary will require human input.
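The "pattern plus exception" mechanism described above can be sketched very simply: derive a band of normal behaviour from historical readings, then flag any new reading that falls outside it. The data and the three-standard-deviation band are illustrative assumptions, not a prescription.

```python
import statistics

# Illustrative sketch: learn a "normal" band from historical data, then
# treat anything outside that band as an exception to the pattern that
# should trigger an automated action (or be escalated to a person).

def normal_band(history, k=3.0):
    """Return (low, high) bounds: mean +/- k standard deviations."""
    mean = statistics.mean(history)
    sd = statistics.stdev(history)
    return mean - k * sd, mean + k * sd

def is_exception(value, band):
    lo, hi = band
    return not (lo <= value <= hi)

history = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2]   # hypothetical readings
band = normal_band(history)
print(is_exception(10.1, band))   # False: fits the historical pattern
print(is_exception(25.0, band))   # True: an exception worth acting on
```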
So, why has actionable insight become such a significant trend? It is not as if the procedures involved are new: twenty years ago functions such as (customer) segmentation or market basket analysis were mainstream, but bespoke, data mining activities. Today, however, they come built into standard application packages. What has changed is:
- We can now combine information from multiple, diverse sources (for example, social media, clickstreams and machine-generated data, as well as traditional types of data) that together make up big data and/or the Internet of Things. This gives a broader perspective on business issues as they arise, supporting the insight required to take advantage of that information.
- We have the technology to cater to the volume, variety, velocity and veracity (not to mention value) of this diverse data in a relatively inexpensive manner, thanks to the development of NoSQL databases running on low cost commodity hardware platforms.
- We can do the analysis required in real time or near real time so that appropriate action can be taken. However, it is not always desirable for this process to be automated; in these instances the decision may be deferred for management review, but you still need the right information in order to make it.
- The information needed to enable actionable insight is now available to business users and executives without needing recourse to IT. This has been a major stumbling block in the past that has hindered attempts to take timely action as a result of information. Moreover, self-service capabilities are being augmented by new visualisation techniques that make it easier to understand the data.
- The tools available for analysing data are more advanced than before, with the take-up of languages such as R (a free software environment for statistical computing and graphics) and the introduction of new mathematical techniques, such as machine learning. You could also argue that developments in psychology make the application of insight to customer interactions (for example) more effective.
Ultimately, what actionable insight provides is the freedom to act within a timescale that is appropriate to the business, rather than one that is constrained by IT and other resources. However, there is still more to actionable insight that needs to be considered. In particular, the information from which you derive insight has to be accurate, complete and timely. Moreover, you have to know that it is all of those things. In effect, you have to be able to trust the data, and trust that it is secure and complies with relevant regulatory regimes. This is where governance comes into the equation.
Historically, applications such as credit card fraud detection, churn analysis and cross-sell and up-sell marketing have been siloed applications, reliant on single, rather than multiple, sources of data. For example, up-sell and cross-sell have largely been informed by CRM (customer relationship management) systems and other structured data about sales. With big data this picture changes: now you can include not just transactional data but also attitudinal data, using sources such as social media and clickstream data to enhance your understanding of customer behaviour. This makes the analysis richer and, as a result, (automated) retail recommendations become more accurate and more likely to result in sales.
Of course, not all areas in which insight automation is applicable involve multiple types of data. In preventative maintenance, for example, almost all of the relevant data is machine generated. The stumbling block here has been that there is so much data to collect and analyse in order to predict failures that the cost of processing, versus the value of the analysis, has been uneconomic for most enterprises. This is where low cost platforms such as Hadoop come into play, because they enable the storage and analysis of all of this data far more cost effectively.
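As a toy illustration of the preventative maintenance idea, one common approach is to fit a trend to machine-generated readings and estimate how long remains before a failure threshold is crossed, so that a service visit can be scheduled in advance. The readings and threshold below are invented for the sketch.

```python
# Hypothetical predictive-maintenance sketch: fit a least-squares linear
# trend to evenly spaced sensor readings (e.g. bearing temperatures) and
# estimate how many more readings remain before a failure threshold.

def linear_trend(readings):
    """Return (slope, intercept) of a least-squares fit over indices 0..n-1."""
    n = len(readings)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(readings) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, readings))
             / sum((x - x_mean) ** 2 for x in xs))
    return slope, y_mean - slope * x_mean

def readings_until(threshold, readings):
    """Predicted number of future readings before the threshold is crossed."""
    slope, intercept = linear_trend(readings)
    if slope <= 0:
        return None  # no upward trend: no failure predicted
    return (threshold - intercept) / slope - (len(readings) - 1)

temps = [70.0, 71.0, 72.0, 73.0, 74.0]   # steadily rising readings
print(readings_until(80.0, temps))       # 6.0 readings of headroom left
```

In practice the volume of such telemetry, rather than the mathematics, is the hard part, which is where the low cost platforms mentioned above earn their keep.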
When it comes to people-based decision making it is obvious that self-service enables more timely action than environments where you are dependent on the actions of others. When business intelligence and analytics are dependent on IT, you cannot get answers to questions that were not planned for in advance: such systems are neither agile nor responsive. The aim of advanced visualisation is to make it easier for business users to understand the data they use, and to explore it in the way that suits them best: simple and visual, not complex and tabular.
In practice, organisations across the board are implementing actionable insight. These range from telecommunications to ad-tech, and from financial services to retail, logistics, manufacturing, government and anyone supporting or analysing any sort of network (including internal company IT networks). Applications include smart meters, preventative maintenance, healthcare monitoring, crime prediction, fraud detection, retail recommendations, network optimisation, security monitoring, logistic optimisation... the list goes on.
Most large organisations will have many potential areas where actionable insight might be adopted. This is a challenge: which data, which analyses and which automated business processes will be most beneficial to the company? For obvious reasons the prioritisation of possibilities is an important consideration, and making such a decision requires in-depth knowledge of both the business and the IT resources involved. This is why leading companies are increasingly appointing a Chief Data Officer, not only to advise on such matters but also to be responsible for the provision of the data that will enable insight processes.
Typically, once a subject area has been selected, the first step will be to collect, prepare and analyse relevant and representative data from a wide range of sources (including sensors, web logs, social media, RFID and so on), from which data scientists can determine patterns that are relevant to the business. This process is time consuming and requires specialist skills, which is why discovery platforms and data preparation tools (preparation typically takes 80% of the elapsed time) are increasingly coming into play within the data warehousing arena. As far as analysis of the data is concerned, there are various techniques that may be used. Some of these are traditional: data mining, text mining, business intelligence and so forth. But the rise of statistical processing through R, the introduction of new NoSQL platforms for analysing real-time data, the increasing use of stream processing for analysing data while it is in motion rather than waiting for it to come to rest in a data store, the development of graph analytics and, potentially, the new discipline of cognitive computing all introduce new analytic options.
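The stream processing idea mentioned above can be sketched in a few lines: readings are analysed as they arrive, with only a small sliding window held in memory rather than the whole history at rest. The window size, limit and data are illustrative assumptions.

```python
from collections import deque

# Toy stream-processing sketch: analyse data while it is "in motion",
# keeping only a fixed-size sliding window rather than storing the
# full stream at rest. Alert whenever the window average exceeds a limit.

def rolling_alerts(stream, window=3, limit=100.0):
    buf = deque(maxlen=window)   # deque discards the oldest value itself
    alerts = []
    for value in stream:
        buf.append(value)
        if len(buf) == window and sum(buf) / window > limit:
            alerts.append(value)  # the reading that tipped the average
    return alerts

readings = [90, 95, 120, 130, 80]            # hypothetical sensor stream
print(rolling_alerts(readings, window=3, limit=100.0))
```

Dedicated streaming engines add windowing over time rather than counts, parallelism and fault tolerance, but the principle is the same.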
Once relevant patterns with sufficient predictive certainty have been agreed, they can be embedded directly within the relevant business process. In some cases this means embedding the patterns within other platforms, such as streaming engines or a database, using technologies such as PMML (Predictive Model Markup Language, an XML-based file format developed by the Data Mining Group), so that the software automatically raises an alert, or performs some other action, when a pattern or anomaly is identified.
Alternatively, results may be embedded directly within a business process. For example, council-owned boilers in the city of Birmingham (England) have sensors attached to them to raise an alert when the boilers are interfered with. A more general example would be to raise a service call when a smart meter becomes faulty. This can be done using standard business process management (BPM) techniques, with BPM tools that can discover, document, automate and continuously improve business processes (thus increasing efficiency and reducing costs). Typically, an appropriate business rule is embedded in the BPM framework and invoked by an automated process; such rules can be changed (potentially even by an end user in the business) without impacting the code of the wider automated process.
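The key property described above, namely that the rule can change without touching the process, can be illustrated by externalising the rule as data that the process merely looks up. The smart meter fields and messages are invented for the sketch, not drawn from any real BPM product.

```python
# Illustrative sketch of the BPM separation: the business rule lives
# outside the process logic (here, a dict entry a business user could
# edit), so changing the rule never changes the automated process itself.

RULES = {
    # hypothetical rule: any non-zero error code means the meter is faulty
    "smart_meter_fault": lambda reading: reading["error_code"] != 0,
}

def process_reading(reading, rules=RULES):
    """Automated process: raise a service call when the fault rule fires."""
    if rules["smart_meter_fault"](reading):
        return f"service call raised for meter {reading['meter_id']}"
    return "no action"

print(process_reading({"meter_id": "M42", "error_code": 3}))
print(process_reading({"meter_id": "M43", "error_code": 0}))
```

Swapping in a stricter or looser rule means replacing one dictionary entry; the `process_reading` logic is untouched, which is the efficiency claim BPM tools make at scale.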
Finally, of course, you can present results to users by means of the various new types of self-service visualisation that are available. In this context, self-service means that business users can access data directly, have it presented on a tablet or smartphone as well as on traditional devices, and update or customise that presentation themselves, without requiring assistance from IT. As far as visualisation is concerned, this means going beyond traditional dashboards and drill-down charts (though these remain valuable) to a number of new techniques for exploring data (see, especially, the data visualisation and information visualisation charts in the Periodic Table of Visualisation Methods). The design elements in some of the more advanced charting techniques are significant, not least because they are based on a better understanding of psychology that helps users engage with insight more easily; as a corollary, that insight is more likely to be actioned.
In addition, there is the possibility of using cognitive computing techniques (such as those implemented by IBM Watson) to provide advice to both users and customers.
An important subsidiary factor is that you must have confidence in the analyses you have conducted. This is true not only statistically (formal measures of confidence levels) but also presupposes that the data you have been analysing was correct in the first place. If organisations are going to rely on their data for insight, and are going to embed the results in automated processes, then they must trust that data: the data itself needs governance (technologies such as data quality and master data management, as well as data governance itself, plus both data security and compliance). And there is the reverse concern: customers want to know that they can trust their suppliers with the information they provide about themselves (so-called digital confidence): that the information will not be abused in any way and that it will be protected, using techniques such as data masking to prevent unauthorised access. Thus trust goes beyond considerations of the accuracy, reliability and timeliness of data to include security and related technology.
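Data masking, mentioned above, is conceptually simple: sensitive values are obscured before they reach less trusted environments, while retaining enough of the format to stay useful. The card-number example below is a minimal illustration, not a description of any particular masking product.

```python
# Minimal data-masking sketch: hide all but the last four digits of a
# payment card number before the record leaves a trusted zone. Real
# masking tools also handle names, addresses, referential consistency, etc.

def mask_pan(pan):
    """Mask a card number, preserving only the final four digits."""
    digits = pan.replace(" ", "")
    return "*" * (len(digits) - 4) + digits[-4:]

print(mask_pan("4111 1111 1111 1234"))  # ************1234
```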
In summary, gaining actionable insight is not a trivial task. There are many elements to consider, and they need to be thought about holistically rather than piecemeal. Even in particular parts of the organisation (for example, marketing) there is no single solution based on a single technology; instead, a range of technologies will need to work together in harness.