Mainframe modernisation is now very much part of OpenText

Content Copyright © 2023 Bloor. All Rights Reserved.
Also posted on: Bloor blogs


The OpenText World 2023 conference in Las Vegas was a chance to learn about OpenText’s attitude to mainframe modernisation, now that it has become a major player in this space following its acquisition of Micro Focus.

I was talking to Que Mangus (Product Marketing Manager, AMC, at OpenText) and it was clear that we were in general agreement. Mainframe modernisation matters because many mission-critical workloads still run on mainframes; running some of them on anything else will be difficult and/or expensive; many workloads can now run on other platforms; and moving workloads to the Cloud (i.e., using cloud technology abstraction and interfaces) makes sense. The first requirement is to know what mainframe technology you have, how well it works and whom it impacts; here, AI promises to help. Once you know exactly what you have, you are in a position to choose the most appropriate platform to run it on. Mangus’ practical experience is that, more often than not, this will turn out not to be the mainframe. My view is that once the technology has been fully abstracted, whether or not to use a mainframe is a fact-based choice, balancing effective cost (what you get charged for a mainframe is seldom list price, and you have to factor in lifecycle support costs too) against quality of service.

I think that once you take a more holistic view – including sustainability, re-using waste heat, privacy, and increased use of compute-intensive services like AI and encryption – the mainframe may look increasingly attractive; but that must be a fact-based choice. What you mustn’t do is make assumptions, usually emotionally or historically based, such as that the mainframe is always:

  • more secure (it can be, but only if it is properly managed as part of the organisation’s security policies as a whole);
  • able to process higher throughputs (possibly, but how well your applications parallelise may be a factor in throughput on distributed systems, and do you actually need the throughput the mainframe is capable of?);
  • very expensive (vendor pricing has become much more flexible and customer-friendly over the years);
  • technically obsolescent (it is still being maintained, and is offering state-of-the-art facilities today).

We also discussed the quality of mainframe data, in the context of training business-oriented AI. The quality of the training data is probably key to the trustworthiness of any AI and mainframe data is of very high quality (you are sure, for example, that a customer exists and has a credit rating and a reliable contact address) – until, that is, you submerge it in the data swamp, when the lowest data quality in the swamp will apply. However, if you restrict training data just to high quality data, you may not have enough of it, and you may introduce bias. I wonder if training data should be weighted by a Quality of Service (QoS) tag? This implies either that metadata QoS tags are stored with the data or that there is a pointer to QoS metadata stored in a data dictionary or data catalogue. Logically, it doesn’t matter which approach you use; but what does matter, I think, is that you can’t access the data without also having the metadata made available.
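The QoS-weighting idea above can be sketched in code. This is a minimal, hypothetical illustration – the tier names, weights, and record layout are all invented for the example, and in practice the QoS tag might be stored with the data or resolved via a pointer into a data catalogue:

```python
# Hypothetical sketch: weighting training records by a Quality of Service
# (QoS) tag, so high-quality (e.g. mainframe-sourced) data counts more
# during training without discarding lower-quality data entirely.
# Tier names and weight values are illustrative assumptions.
QOS_WEIGHTS = {"gold": 1.0, "silver": 0.6, "bronze": 0.2}

def weighted_training_set(records):
    """Attach a sample weight to each record based on its QoS metadata.

    Each record is a dict. The QoS tag may travel with the record itself,
    or be looked up from catalogue metadata; logically the two approaches
    are equivalent, provided the metadata is always available with the data.
    """
    weighted = []
    for rec in records:
        # Treat untagged data as lowest tier rather than excluding it,
        # which may help avoid the bias introduced by over-filtering.
        tag = rec.get("qos", "bronze")
        weighted.append((rec, QOS_WEIGHTS.get(tag, 0.2)))
    return weighted

records = [
    {"customer": "A", "qos": "gold"},    # verified, mainframe-quality record
    {"customer": "B", "qos": "bronze"},  # swamp-quality record
    {"customer": "C"},                   # no QoS metadata at all
]
for rec, weight in weighted_training_set(records):
    print(rec["customer"], weight)
```

Many training frameworks already accept per-sample weights (for example, a `sample_weight` argument), so tags like these could feed directly into training rather than into a hard filter.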

I had hoped that, given its document management roots, OpenText might have some innovative metadata-based approaches to mainframe modernisation and exploitation of mainframe data. Sadly, apparently not, although Mangus is fully aware of data quality and QoS issues, and who knows what the future will bring? Whatever it is, it will probably be badged Aviator, probably involve AI – and it might even be embedded in the rich set of OpenText announcements at the conference without my noticing it.

So, my estimation is that mainframe modernisation is alive and well in OpenText, but hasn’t really progressed far from its Micro Focus roots yet. And there is nothing wrong with those roots, of course. I do expect progress in the future, however, possibly around the use of machine learning and AI tools to assess what you already have; this is an essential precursor to modernisation, in my opinion.