Content Copyright © 2016 Bloor. All Rights Reserved.
As readers of my somewhat irregular contributions to the Bloor website will know, Pentaho is one of the products I keep coming back to in order to explain its latest steps. I believe that BI, or business analytics as Pentaho prefer to call it, is an ever more important component of an enterprise architecture. Indeed, with some organisations I wonder how their executives actually manage when they have very limited insight into what has happened and almost no visibility of what might happen given current trends.

A comprehensive BI solution requires more than just reporting and analysis tools; it is as much about data management as it is about reporting, so the stack required to realise the potential is of necessity complex and varied in its nature and content. Many organisations create that stack by combining elements of several vendors' offerings, often with very clunky interfaces as a consequence. Whilst many vendors offer what at first sight appears to be a range of products that assemble into a comprehensive integrated whole, in reality these are often a variety of acquired elements with a GUI trying to mask what lies underneath.
Pentaho offer one of the most comprehensive tool kits and, in my opinion, one of the best integrated; but even then it can at times be difficult to achieve what is required simply, and as we know complexity drives cost and is likely to introduce error. I am therefore delighted to be able to report that with Pentaho 7 they are aiming to simplify and streamline without any loss of capability, by abstracting above the level of the complexity: the technology addresses the complexity, while BI professionals focus on providing functionality and insight, i.e. the high value-adding element of BI.
Pentaho, as I have said, is a very comprehensive solution: it handles the data pipeline from capture and integration through to the delivery of analytics. Now the user experience is being vastly improved, with the interfaces between the elements made seamless and the whole experience far smoother. At the same time, the arbitrary distinctions of the traditional pipeline, in which data engineering precedes preparation and analytics comes last, are replaced by a pipeline which recognises that analytics is required at all stages: not just when the data is at rest, but while it passes through any of the stages.
The presentation of results is also automated, to ensure that charts are rendered in ways that are intelligible and easy on the eye. This is really useful, because we all get fed up when presented with a chart which, instead of replacing a thousand words, requires two thousand to explain what it is attempting to represent! I think the key to what Pentaho are achieving is that they are moving the focus away from the tools and towards the data; the data is driving what is happening. There will of course be technicians who decry this and claim that such moves deskill them, but the truth is that we need to fundamentally change the productivity of the BI process, moving the focus away from the low-level technology and higher up the value chain, so that we get better results faster and concentrate more on using those results to change the business for the better. Actionable insight has to be timely as well as accurate to be of maximum value.
Other major enhancements include making Hadoop easier to deal with, again by abstracting above the level of the complexity. Pentaho have extended their Kerberos integration, bringing secure big data integration within reach of enterprise users far more readily, and there are further security enhancements for Hadoop beyond that. They have provided a drag-and-drop interface with which developers can coordinate and schedule Spark applications, making it easier to apply those applications to a wider variety of application types and environments. Everything from data preparation through to visualisation is catered for in one tool, enabling a broader audience to explore a broader set of data in a more meaningful way as the technical barriers are removed. A further enhancement which looks to be really useful is metadata injection: you define a transformation template, pass metadata against it at run time, and inject the result into downstream steps. So if you have disparate data sources and rules which you want to change dynamically, this enables you to do so without having to build and maintain a whole raft of transformations and jobs each time.
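To make the metadata injection idea concrete, here is a minimal sketch in plain Python of the general pattern: one reusable transformation template whose specifics are filled in from per-source metadata at run time. This is an illustration of the concept only, not Pentaho's actual PDI API; the step names, field names and sources below are all hypothetical.

```python
# Illustrative sketch of metadata injection as a pattern, not Pentaho's API.
# One template describes the shape of a transformation; the metadata for
# each disparate source is injected at run time to produce a concrete job.

TEMPLATE = {
    "read":   {"fields": None},   # injected: which fields to read
    "rename": {"mapping": None},  # injected: source-to-target field renames
    "load":   {"target": None},   # injected: destination table
}

def inject(template, metadata):
    """Return a concrete transformation by injecting metadata into a copy
    of the template, leaving the template itself reusable."""
    concrete = {step: dict(cfg) for step, cfg in template.items()}
    concrete["read"]["fields"] = metadata["source_fields"]
    concrete["rename"]["mapping"] = metadata["rename_map"]
    concrete["load"]["target"] = metadata["target_table"]
    return concrete

# Hypothetical metadata for one of many disparate sources; only this
# dictionary changes per source, not the transformation logic.
crm_metadata = {
    "source_fields": ["cust_id", "cust_nm"],
    "rename_map": {"cust_id": "customer_id", "cust_nm": "customer_name"},
    "target_table": "dim_customer",
}

concrete = inject(TEMPLATE, crm_metadata)
print(concrete["load"]["target"])
```

The point of the pattern is the last few lines: adding another source means writing another metadata dictionary, not building and maintaining another transformation.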
So, a whole hatful of the sorts of things that make this one of the most enterprise-ready BI stacks I am aware of. This looks to be a major release which really sets out the Pentaho stall as a BI vendor of choice at the enterprise level, with integrated capability that is easier to use and more powerful out of the box than comparable offerings in the marketplace, which still rely on skilled technicians to unite and enact their solutions.