CA Technologies and the Application Economy

Content Copyright © 2014 Bloor. All Rights Reserved.
Also posted on: The Norfolk Punt

I’m just writing up my thoughts on CA World 2014, which was different to any other CA World I’ve been to. There was a strong sense of a ‘New CA Technologies’, although, when one dug deeply, it seemed to me that this was the start of a journey for most conference attendees. Nevertheless, the high-level story about “Business re-written by software” for the “Application Economy” is a good basis for taking CA Technologies forward, I think.

CA Technologies was also talking about the API Economy and, since it is just possible that people will find this confusing, perhaps I’d better explain. The Application Economy is all about us being in a world where your customers are far more likely to experience your brand through a software app than through a live person, and the quality of the end-user app experience is critical to a business’s success. The API Economy underpins this, to my mind. It is all about every piece of software having a well-defined API (Application Programming Interface) and communicating only through this, so that applications can be built reliably by mashing up or orchestrating existing components using their published APIs; this makes providing a flexible and ever-evolving software user experience to customers feasible (amongst other things). For this to work, good API management is important, and CA Technologies gained this capability when it acquired Layer 7 Technologies in 2013.
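To make the ‘mashup’ idea concrete, here is a minimal sketch in Python, with entirely hypothetical endpoint names and an example.com gateway (not any CA Technologies API), of an app composing one customer-facing view from two independent services, purely through their published RESTful APIs:

```python
# Minimal sketch of the 'mashup' idea: an app composes its user experience
# from existing services purely through their published RESTful APIs.
# All endpoints and field names here are hypothetical.
import requests

API_BASE = "https://api.example.com"           # hypothetical API gateway
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder access token

def order_status_view(customer_id: str) -> dict:
    """Build one customer-facing view from two independent services."""
    # Each call goes through a published, versioned API; the app never
    # touches the underlying systems directly.
    customer = requests.get(
        f"{API_BASE}/v1/customers/{customer_id}",
        headers=HEADERS, timeout=5,
    ).json()
    orders = requests.get(
        f"{API_BASE}/v1/orders",
        params={"customer": customer_id},
        headers=HEADERS, timeout=5,
    ).json()
    return {"name": customer.get("name"), "openOrders": orders.get("items", [])}
```

The point is that the app never touches the underlying systems directly; everything it needs is reachable, and replaceable, behind well-defined APIs.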

CA Technologies is rationalising its product range and increasing its SaaS offerings, and now has some 40% of its customers on its latest releases. That is good but, as Mike Gregoire (CA Technologies CEO) points out, not good enough if his customers want to take advantage of the business opportunities available from opening up their information assets to partner organisations, third-party developers, mobile apps and cloud services, using the latest RESTful APIs and modern access-control systems provided by the latest CA Technologies software. One particular feature I liked about the new offerings is the idea of security as enabling access for business rather than just stopping the bad guys getting in; it still does that, but the change in emphasis is important.
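By way of illustration only (hypothetical gateway URLs and scopes; not CA Technologies’ actual product API), the ‘security as enablement’ idea looks roughly like this: a partner application is granted a scoped token by an API gateway and only then calls the business API it has been allowed to see:

```python
# Hedged sketch of "security as enabling access": a partner app obtains a
# scoped token from an API gateway (hypothetical URLs and scopes) and only
# then calls the exposed business API.
import requests

TOKEN_URL = "https://gateway.example.com/oauth2/token"  # hypothetical
API_URL = "https://gateway.example.com/v1/inventory"    # hypothetical

def fetch_inventory(client_id: str, client_secret: str) -> list:
    # The gateway grants access scoped to what this partner is allowed to
    # see, rather than simply blocking everything from outside.
    token_resp = requests.post(TOKEN_URL, data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "inventory.read",
    }, timeout=5)
    access_token = token_resp.json()["access_token"]

    resp = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json().get("items", [])
```

The emphasis is on granting the partner exactly the access the business wants to offer, with the blocking of everything else coming along as a by-product.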

Mike Gregoire did a great job of splitting “them” (the old guard) from “us” (the attendees at the conference), whilst still keeping “them” at the party as “we” build the new future! All the executive team were very bullish, in fact, and Amit Chatterjee (EVP, Enterprise Solutions and Technology, CA Technologies) filled in the technical detail well; although I do have to say that some of the audiences at the exhibition-floor presentations were a bit sparser and more subdued than I really expected. This is probably because more than 50% of the session content was delivered off the show floor (we had an innovative CA mobile app to help us find our way around everything; it’s easy to keep fit walking around a Las Vegas conference).

One customer we interviewed was very keen on how well the CA Technologies products he used worked—and that’s my view of the bulk of them too. He was also interested in the promised ‘Application Economy’ future being promoted at the conference, although he did seem to be waiting to see exactly what CA Technologies would offer him.

Talking to Michael Madden (Mainframe General Manager) suggested that the CA Technologies mainframe story is now pretty much complete—and is still very important to the company. At a high level, according to Madden, “the mainframe is one of three core platforms, mobile, cloud, and mainframe, where customers are increasingly asking for a common enterprise management and monitoring solution”.

I was particularly impressed by the strategy around workload automation, after talking to Desikan Madhavanur (SVP, Mainframe Product Management). This is something that I think is sometimes overlooked (see my blog, here), although it is already important to large mainframe (and even many large distributed-systems) users. However, its future could include machine learning built into the ecosystem (for good-practice knowledge discovery and knowledge transfer) and SaaS offerings to enable the exploitation of workload automation by smaller companies. Desikan talked about the rationalisation of the CA Technologies workload automation product line whilst maintaining a familiar ‘look and feel’ for established users, but on top of (eventually) a common core code-base. I think workload automation is about to move into the mainstream consciousness (compare my blog on the Cortex ‘software robot’ approach here).

I also got some interesting insights into CA Chorus. The way to see Chorus is as a management-tool integration platform (with knowledge capture and transfer features) onto which CA Technologies installs management ‘disciplines’ for specific parts of the team: DBAs, for example. However, I could see it becoming a front-end to CA Technologies generally (across mainframe and distributed products), using its knowledge acquisition and transfer capabilities (and, perhaps, machine learning) to allow all IT support users to access the right tools to address their particular needs, across the enterprise. This isn’t CA Technologies’ strategic policy, it seems, but I wish it were.

An interesting thought is that customers could write their own cross-environment ‘disciplines’ for the CA Chorus platform, integrating their own in-house management tools, and making it easier to support still-important legacy systems going forwards as part of a modernised organisation.

I think this piece is now getting a bit long for a blog, but I will mention that the CA Technologies DevOps story is now very good, although, it seems to me, it is very much an ‘operational IT’ story. Cameron Van Orman (VP of Strategy for Infrastructure Management and Service Assurance at CA Technologies) tells me that “we have a strong Dev part of the portfolio in addition to IT Ops. Solutions like CA Service Virtualization and CA Release Automation”, and that is true, and very much DevOps as the Agile practitioners who started the DevOps movement understood it. However, I’ve always seen coding, testing and release as an early part of the operational (with a small o) story. I’d like the DevOps pipeline extended to include requirements analysis and design, as well as user-experience monitoring.

IBM (for instance) has all of its Rational and WebSphere scope (and even continuous systems engineering) inside its DevOps story, which gives it a more complete view of DevOps, in my opinion, with feedback loops from user experience through to application design.

One thing I haven’t completely got my mind around yet is the CA Technologies development story around CA Gen (which used to be IEF, a CASE tool that I once knew fairly well). Michael Madden tells me that “CA Gen is an area of investment and we are using it, combined with CA API Gateway, to unlock CA Gen IP and extend the life of the product set through the creation of web services that can be utilized in the composite app world”. And CA ERwin, which can be used for building the data models that complement CA Gen, is now back in the CA Technologies portfolio.
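For readers who haven’t met this pattern, the gist of ‘unlocking legacy IP through web services’ is something like the sketch below. It is purely conceptual: the function and route names are made up, and CA’s actual approach generates and secures the services with CA Gen and CA API Gateway rather than hand-written wrappers like this.

```python
# Conceptual sketch only: exposing an existing legacy routine as a RESTful
# web service so it can be consumed in composite/mobile apps. Names are
# hypothetical; this is not CA Gen or CA API Gateway output.
from flask import Flask, jsonify

app = Flask(__name__)

def legacy_credit_check(account_id: str) -> dict:
    """Stand-in for an existing back-end routine (hypothetical)."""
    return {"account": account_id, "creditLimit": 5000, "status": "OK"}

@app.route("/v1/credit-check/<account_id>", methods=["GET"])
def credit_check(account_id: str):
    # The legacy logic is unchanged; it simply gains a published API that
    # an API gateway can secure, throttle, and expose to partners.
    return jsonify(legacy_credit_check(account_id))

if __name__ == "__main__":
    app.run(port=8080)
```

Once the legacy logic has a published API, it can be mashed up with new mobile and cloud components just like anything else in the Application Economy.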

That’s the trouble with CA World—CA Technologies still has so many products that taking them all in over only a few days is almost impossible, despite its welcome focus on rationalisation for the Application Economy. I think that my main takeaway is that CA Technologies API Management, in conjunction with its view of security as enabling access to resources for business use, is the starting point for a very interesting enterprise mobile app development story, ‘mashing up’ existing applications and new mobile apps via RESTful APIs. Which isn’t to denigrate its excellent management tools, of course—the Application Economy depends on having access to secure and well-managed technology platforms. In its essentials, API Management adds Freedom to automated business.