Content Copyright © 2021 Bloor. All Rights Reserved.
Recently, it seems as though the marketing departments of IT vendors have been in a frenzy trying to find ways of positioning and branding their solutions as edge computing. Why the sudden feeding frenzy? After all, ever since minicomputers in the 1970s and PCs in the 1980s broke out of centralised mainframe data centres, businesses have been using computers at the edge of their operations: in retail branches, distribution depots, bank branches, manufacturing plants and so on.
At that point, communications between the edge and the centre were hampered by the cost, inflexibility and low bandwidth of networks. The advent of the internet, wireless communications and fast, global fibre connections has provided the basis for much cheaper and more effective communications between globally distributed locations and the centre. But even then, most IT departments were still only managing IT infrastructure consisting of thousands of endpoints on their networks. IoT (Internet of Things) devices and smartphones changed all that. Now, there are literally billions of devices that need to process instructions, collect data and communicate with the centre in real-time.
This is where the fun starts. If all these billions of devices try to communicate with the centre in real-time, they’ll break the internet, or for that matter any other network built on a centralised computing architecture. What we need to do is work out what data can be processed and stored locally and what needs to come back to the centre. Estimates vary, but the sense is that at least 60% of IoT data can be processed, and stored for a short period of time, locally. Up to now, the challenge has been delivering significant new compute and storage capability close, or at least closer, to the IoT sensors and other end-user computing devices. The challenges have been less about raw compute power and more about being able to locate, connect, manage and secure the required technology.
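To make that local-versus-central split concrete, here is a minimal sketch of the idea in Python. The thresholds, field names and the 60/40-style split are purely illustrative assumptions, not from any particular product: routine readings stay at the edge, and only a summary plus out-of-band alerts travel back to the centre.

```python
from statistics import mean

# Hypothetical rule: readings inside the normal band stay local;
# only summaries and out-of-band alerts go back to the centre.
NORMAL_RANGE = (15.0, 30.0)  # e.g. temperature in degrees C (illustrative)

def triage(readings):
    """Split raw readings into locally retained data and a central payload."""
    local_store = []   # short-term local retention at the edge
    alerts = []        # anomalies forwarded immediately
    for value in readings:
        if NORMAL_RANGE[0] <= value <= NORMAL_RANGE[1]:
            local_store.append(value)
        else:
            alerts.append(value)
    summary = {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "alerts": alerts,  # only exceptions travel to the centre
    }
    return local_store, summary

local, payload = triage([21.5, 22.0, 35.1, 20.9])
print(payload["alerts"])  # → [35.1]
```

The point of the sketch is the shape of the decision, not the rule itself: in practice the triage logic might be a learned model or a richer policy, but the principle of sending back far less than you collect is the same.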
Back in July 2018 I published a Hot Report on how IoT, 5G (5th Generation mobile networks) and Edge Computing would be the key enablers of the 4th Industrial Revolution. The report is still remarkably current. What is interesting, reading it now, is that the ideas and use cases were there, but adoption was slow. There are any number of possible reasons why edge computing wasn’t taking off in 2018; almost certainly one was the lack of suitable IT infrastructure. This is not a new problem. While we were discussing the edge computing phenomenon internally at Bloor, our Chairman, Brian Jones, related the views he heard in the 1990s from Mike Winch, then CIO at the large retail supermarket chain Safeway. Essentially, you had to balance the development of new applications with the ability of the IT infrastructure to run them. Too much focus on the latest infrastructure technology, at the expense of application development, would result in costly underused capacity. On the flip side, application development, or at least its implementation, would be hampered by a lack of focus on, and capability in, the underlying infrastructure. That was certainly an issue in 2018. But the picture is changing rapidly.
So, what has changed? 5G, with much higher speeds, bandwidth and greater flexibility, is being rolled out more widely. Improved wireless technology, with the advent of Wi-Fi 6 and the continued development of Low Power Wide Area Network (LPWAN) technologies, makes the deployment of large numbers of small IoT sensors and other connected devices viable. Server and storage footprints are becoming smaller and smaller. Allied to this are secure, resilient, modular micro-datacentres (think of something starting not much bigger than the BT Openreach fibre cabinets you see sprouting up on pavements in your town) that can be installed at the base of mobile operators’ masts, in hostile manufacturing environments, in hospitals, retail branches and so on, without the need for additional cooling.
These physical changes in IT infrastructure are an important factor in enabling the growth of edge computing, but the real game changers are in software. The key is the way in which control of all the physical network and compute devices has been abstracted into software and their functions virtualised. New servers, networks and storage devices are now all software defined. In practice this means the same physical device can be reconfigured remotely, in software, to perform a different role in, say, a network. This makes for greater agility and lower costs. It also makes it easier to control and manage devices remotely, which matters: just as IoT threatened to overwhelm existing networks, there simply aren’t enough IT staff to be physically on site to feed and water the equipment on a daily basis.
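The software-defined idea can be caricatured in a few lines of Python. This is a deliberately toy sketch, with invented role names and settings, but it captures the essence: the physical box stays put, and only its software-assigned function changes, with no site visit required.

```python
# Illustrative, hypothetical role catalogue for a software-defined device.
ROLES = {
    "router":   {"forwards_packets": True,  "stores_data": False},
    "firewall": {"forwards_packets": True,  "stores_data": False},
    "cache":    {"forwards_packets": False, "stores_data": True},
}

class SoftwareDefinedDevice:
    """One physical box whose function is assigned entirely in software."""
    def __init__(self, role):
        self.apply_role(role)

    def apply_role(self, role):
        # Remote reconfiguration: swap the device's job without touching hardware.
        self.role = role
        self.config = ROLES[role]

box = SoftwareDefinedDevice("router")
box.apply_role("cache")  # same hardware, new role, no engineer on site
print(box.role)  # → cache
```

Real software-defined networking and infrastructure stacks are of course far richer than this, but the management win is exactly the one sketched here: a fleet of identical boxes, differentiated and re-differentiated purely by configuration pushed from the centre.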
Going back to Mike Winch’s point, the infrastructure is now in place for the applications to take advantage of. It is now possible to develop and run applications for this edge computing environment in the same way you would run your existing cloud computing environments. Additionally, running Artificial Intelligence (AI) and Augmented Reality (AR) applications at the edge is now viable. Furthermore, streaming analytics delivers the ability to analyse the huge amounts of data generated by all these new connected devices locally, in real-time.
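The core of streaming analytics is computing results incrementally as each value arrives, rather than querying data at rest. The minimal sketch below, with an illustrative class name and window size of my own invention, shows a rolling average maintained locally over the most recent readings:

```python
from collections import deque

class SlidingWindowAverage:
    """Minimal streaming-analytics sketch: a rolling average over the
    last `size` readings, recomputed locally as each value arrives."""
    def __init__(self, size):
        self.window = deque(maxlen=size)  # old readings fall off automatically

    def update(self, value):
        self.window.append(value)
        return sum(self.window) / len(self.window)

avg = SlidingWindowAverage(size=3)
for reading in [10, 20, 30, 40]:
    latest = avg.update(reading)
print(latest)  # → 30.0, the average of the last three readings
```

Production streaming engines add windowing by time, out-of-order handling and fault tolerance on top, but the principle is the same: the edge device keeps only a bounded window of recent data and emits a continuously updated result.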
Why should you care? This new ability to capture, analyse and process data near to its source, nearer to your customers and partners, and to reconfigure and pivot your IT quickly to adapt to changes in consumer behaviour, supply chain needs and regulatory requirements, is a key element in making your business Mutable. Darwin’s theory of evolution is about being adaptable to changing environments, not necessarily being the fittest. Being at the edge is, almost by implication, “edgy”, and is the essence of being a Mutable enterprise and thriving in today’s world. Is your business ready for edge computing? If not, why not?