Big Software and the 4th Industrial Revolution - Enabling the Mutable Enterprise

Content Copyright © 2018 Bloor. All Rights Reserved.

The Background

It was back in 2011 that Marc Andreessen said, “…software is eating the world.” His view was that, in 2011, “we were in the middle of a dramatic and broad technological and economic shift, in which software companies were poised to take over large swathes of the economy.”

Seven years on, there is plenty of evidence to show that he was right. The success and scalability of the Cloud has enabled the high-profile growth of companies that live almost exclusively in the virtual world, such as Amazon, Google, Airbnb and Uber, allowing them to disrupt existing players and change the face of many industry sectors. Now new technology advances are also ushering in a new era of potential productivity improvement, sometimes referred to as the 4th Industrial Revolution, or Industry 4.0. The opportunities for business are enormous, but failure to adapt could be fatal for many organisations.

While the focus has been on the growth of cloud computing, it is the development of new software technologies and capabilities that has increased speed and agility and reduced the cost of both application development and the underpinning hardware infrastructure. This gave digital-native businesses the ability to bring new applications to market faster and more often, running on an infrastructure that was potentially cheaper to deploy and run. But these developments also offer existing physical businesses, with their mix of legacy on-premise applications and new cloud-based systems, an opportunity to compete with, and perhaps out-compete, their digital-only challengers.

The growth of software-defined infrastructure

Server virtualisation was an important early enabler of cloud computing. The abstraction of control functions away from the hardware or firmware and into software enabled not only improved server utilisation, but also the ability to run what had been proprietary operating environments on cheaper, industry-standard processors. Similar approaches have been taken with storage and, more recently, networking, where again software abstraction of the control plane enables industry-standard hardware to be used where once expensive, proprietary storage arrays or network devices were required. In some cases, for example load-balancers, functionality previously deployed in hardware can now be delivered entirely in software.

The benefits were not only about cost reduction. Reconfiguring network switches to deal with changes in the business, or changing data storage functionality, can now be achieved with automated software updates rather than more complicated firmware upgrades or even completely new equipment. This allows businesses to react to changing needs in a far more agile manner.
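To make the software-defined idea concrete, here is a minimal sketch in Python. It is purely illustrative: the `Switch` and `Controller` names are hypothetical and do not represent any vendor's API. The point is that device behaviour lives as data in a central software controller, so one software update reconfigures the whole estate, with no firmware flashing or equipment swap.

```python
# Illustrative sketch of a software-defined control plane.
# All class and method names here are invented for the example.

class Switch:
    """Commodity hardware: it simply applies whatever rules it is given."""
    def __init__(self, name):
        self.name = name
        self.rules = []

    def apply(self, rules):
        self.rules = list(rules)


class Controller:
    """The control plane, abstracted out of the devices and into software."""
    def __init__(self):
        self.switches = {}
        self.policy = []

    def register(self, switch):
        self.switches[switch.name] = switch
        switch.apply(self.policy)

    def update_policy(self, rules):
        # A single software change reconfigures every registered device.
        self.policy = list(rules)
        for sw in self.switches.values():
            sw.apply(self.policy)


ctl = Controller()
ctl.register(Switch("edge-1"))
ctl.register(Switch("edge-2"))
ctl.update_policy(["allow 10.0.0.0/8", "deny all"])
```

One call to `update_policy` pushes the new configuration to every switch, which is the agility the paragraph above describes: the business change is expressed once, in software, rather than device by device.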

The changed face of application development

Application development has also undergone significant change. The concept of micro-services and service integration, allied with new approaches to application development that can perhaps be encapsulated in the term DevOps, has allowed businesses to bring new functionality and new services to market in a fraction of the time it once took. Open APIs (Application Programming Interfaces) now make it possible to add new services (functionality) much more quickly and simply than changing older, monolithic applications, while containers, such as Docker, simplify and speed up deployment. Further development of low-code and no-code technologies offers the possibility, in certain areas, of reducing or even removing the need for traditional IT programming.
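The micro-services point can be sketched in a few lines of Python. This is not any particular framework; the route names and the `service` decorator are invented for illustration. What it shows is the contrast with a monolith: each capability is a small, independent function behind a stable JSON contract, and new functionality is added by registering a new route without touching anything that already exists.

```python
import json

# Illustrative sketch of micro-services behind an open API.
# The routes and decorator below are hypothetical, not a real framework.

routes = {}

def service(path):
    """Register an independent function as a service endpoint."""
    def register(fn):
        routes[path] = fn
        return fn
    return register

@service("/price")
def price(request):
    return {"sku": request["sku"], "price": 9.99}

# Adding a new capability later modifies nothing that already exists:
@service("/stock")
def stock(request):
    return {"sku": request["sku"], "in_stock": True}

def handle(path, body):
    # The "open API": a stable JSON contract between services.
    return json.dumps(routes[path](json.loads(body)))
```

Because each service is isolated behind its contract, teams can deploy and update them independently, which is where the speed-to-market gain comes from.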

Platforms and automation rise to meet the challenge of complexity

This approach has thrown up additional challenges in managing the resulting plethora of new micro-services and ensuring performance and security. As more and more services were wrapped into containers, the need for orchestration became acute, and it has been met by systems such as Kubernetes. At the same time, new automated testing technologies and the integration of security into the early stages of software development have enabled the best DevOps teams to keep up with the demands of continuous development and continuous deployment without compromising performance or security.
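The core idea behind orchestrators such as Kubernetes is a reconciliation loop: the operator declares a desired state, and a control loop repeatedly compares it with the actual state and converges the two. The toy Python sketch below models only replica counts, and the function and variable names are ours, not Kubernetes APIs; real orchestrators handle scheduling, networking, health checks and much more.

```python
# Toy sketch of the declarative reconciliation pattern used by
# container orchestrators. Names are illustrative only.

desired = {"web": 3, "api": 2}   # what the operator asked for
actual = {"web": 1}              # e.g. after a node failure

def reconcile(desired, actual):
    """One pass of the control loop: start or stop containers as needed."""
    actions = []
    for svc, want in desired.items():
        have = actual.get(svc, 0)
        if have < want:
            actions.append(("start", svc, want - have))
        elif have > want:
            actions.append(("stop", svc, have - want))
        actual[svc] = want
    return actions

actions = reconcile(desired, actual)
```

The operator never issues imperative commands; the loop works out the difference and acts on it, which is why such systems recover automatically when containers or nodes fail.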

Big Software – much more than software abstraction and DevOps

Highly sophisticated software, driving increasing velocity and agility in IT at a lower cost, leads us to the concept of what we at Bloor, and others, call Big Software. The role that Big Software plays becomes even more important as we look at how the Internet of Things (IoT) and rapid developments in Artificial or Augmented Intelligence (AI) and Machine Learning (ML) are driving huge growth in storage, networking and processing needs.

For example, Network Function Virtualisation specifically, and software-defined networking in general, are critical in enabling networks to deal with the very different latency, coverage and security requirements that IoT will drive in each vertical industry. IoT itself spawns huge amounts of data, much of which will need to be stored and analysed. AI will become a more common component of business IT systems, automating manual processes both in the business and in IT, improving analytical decision making and potentially enhancing security systems.

Big Software has also had a profound impact on the make-up of the IT vendor landscape. Increasing commoditisation of hardware is forcing some vendors to look at mergers and acquisitions to maintain the scale necessary in low-margin products, while others, like Cisco, look to turn themselves into software companies. Around them, the hyperscale cloud companies, AWS, Google and Microsoft, continue to use developments in Big Software, often initiated by themselves, and burgeoning open standards in hardware design, to disrupt the business models of traditional hardware and software vendors.

Big Software is key in enabling all IT to be consumed "as-a-service": Infrastructure-as-a-Service (IaaS), where you provision compute and storage capacity where, when and for as long as you need it; Platform-as-a-Service (PaaS), which provides both the compute and storage capacity and a complete software development environment, reducing the complexity of using and managing multiple software development tools; and Software-as-a-Service (SaaS), where, to put it simply, you rent applications and all the infrastructure needed to run them.

The continuing challenge for CIOs and CTOs

The challenge facing CIOs and CTOs in traditional "physical" businesses has been how to adapt at pace while keeping their existing legacy systems running. Big Software offers the opportunity to break out of the restraints of old IT architectures, and certainly provides a wide range of solutions and capabilities designed to increase speed and agility while reducing costs. The "at pace" point is now critical. The speed of change is such that unless IT can support a business in a state of continual change, in our terms a Mutable Business, there is a real threat to that business's survival.

However, the question remains “how do I get there from here?” Firstly, the understanding and support of the Board is vital. Secondly, an IT vision and strategy are needed that closely align with the business vision and strategy of the organisation. And thirdly, the CIO and the CTO need to develop their IT workforce and provide additional skills and experience where necessary.

Taking some initial steps

The first and second requirements are not the same. It is our contention at Bloor that many Boards do not understand the business opportunities and risks that Big Software represents. Increasingly, the CIO and CTO will need to articulate the sort of new business models and opportunities that might be possible, and highlight how and where disruption of existing models has already taken place. We will be focusing on this at our CIO Watercooler Digital Boardroom at the end of October. Once the Board has understood the potential for business change, both IT and the business can concentrate on developing and aligning visions and strategies that take advantage of the opportunities and minimise the threats.

The third requirement covers both developing and repurposing existing IT resources and, given that skills in new technologies are both scarce and expensive, the judicious use of 3rd parties. I am not a fan of Gartner's bi-modal IT stance, so, on the former, I would point readers to Geoffrey Moore's book "Dealing with Darwin", which takes, as you might expect, an evolutionary approach to transitioning to new technologies that doesn't isolate and disempower those involved in maintaining the legacy systems.

As to the use of 3rd parties, the inherent scale and complexity of the infrastructure that now supports cloud at scale offers a real opportunity for existing systems integrators (SIs), Managed Service Providers (MSPs) and Value-Added Resellers (VARs) to build on their existing customer and vendor relationships by reskilling and repositioning themselves as service integrators and cloud orchestrators for this new Big Software world.

If you are a CIO or CTO and want to hear more about the challenges and opportunities posed by Big Software, sign up and take part in a CIO Watercooler Digital Boardroom discussion that takes place at 1PM on Tuesday 30th October.