Cloud Commons and a Service Measurement Index

Content Copyright © 2010 Bloor. All Rights Reserved.
Also posted on: The Norfolk Punt

If CA Technologies is pinning its future to emerging hypergrowth markets, it’s also pinning it to Cloud Computing. It seems to be promoting a “supply chain” model in which internal and external resources are combined to deliver automated business services.

CA’s key contribution to this (sorry, I just can’t be doing with typing “Technologies” all the time, even if it is important to making sure Bill McCracken doesn’t get swamped in Californian hits every time he googles his company) is its CA Cloud-Connected Management Suite, with the first parts of this being delivered later this year. This is built on recently acquired technology from 3Tera, Oblicore and Cassatt and is a suite of four key products:

  • CA Cloud Insight;
  • CA Cloud Compose;
  • CA Cloud Optimize; and
  • CA Cloud Orchestrate.

The last three seem quite technology-focused. They are about creating a cloud from commodity hardware and deploying services on it (Compose); optimising your cost of cloud ownership (Optimize); and providing workflow control and policy-based process automation (Orchestrate).

However, since I believe that the Cloud is really little more than the abstraction of automated business services from the underlying technology, I am most interested in CA Cloud Insight, because it introduces the concept of cloud service metrics for things like quality, agility, risk, cost, capability and security; these metrics may be the key to implementing effective cloud computing in practice.

This is all supported by two new initiatives, starting with the Service Measurement Index (SMI), which attempts to provide a rational way of measuring cloud computing services along a range of dimensions (security, cost and so on). At present I think this is a capability metric (is something provided, how much does it cost); it might be interesting to add effectiveness metrics (if a service provides a rich set of security SLAs, how easy are they to administer and work against in practice) in the future. The low-level SMI metrics roll up into a single score, which may invite abuse, since it oversimplifies something complex. Nevertheless, because you can drill down into the detail behind the single number, and because you can change your profile (perhaps security is more important to you than cost, while someone else has the opposite priorities) and generate a custom metric which reflects your specific priorities for cloud computing, the single number shouldn’t be an issue in reality.
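CA hasn’t published the actual rollup formula, so purely as a sketch of the idea, here is what a profile-weighted SMI-style score might look like. The attribute names come from the suite’s stated dimensions, but the 0–100 scale, the weights and the weighted-average maths are my own illustrative assumptions.

```python
# Illustrative sketch only: the real SMI rollup isn't public, so the
# 0-100 scale, the weights and the weighted-average formula are assumptions.

SMI_ATTRIBUTES = ["quality", "agility", "risk", "cost", "capability", "security"]

def smi_score(sub_scores: dict, profile: dict) -> float:
    """Roll low-level attribute scores (0-100 each) up into one number,
    weighted by the consumer's own priorities (default weight 1.0)."""
    total_weight = sum(profile.get(a, 1.0) for a in SMI_ATTRIBUTES)
    weighted = sum(sub_scores[a] * profile.get(a, 1.0) for a in SMI_ATTRIBUTES)
    return weighted / total_weight

# The same (hypothetical) provider scores differently depending on whose
# priorities apply: a security-first buyer versus a cost-first buyer.
provider = {"quality": 80, "agility": 70, "risk": 60,
            "cost": 90, "capability": 75, "security": 55}

security_first = {"security": 3.0, "risk": 2.0}  # everything else stays at 1.0
cost_first = {"cost": 3.0}

print(round(smi_score(provider, security_first), 1))
print(round(smi_score(provider, cost_first), 1))
```

The point of the sketch is simply that the single headline number is a function of your profile, not a universal verdict on the provider.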

The second initiative is the Cloud Commons online community. This, it seems to me, is essential if the SMI is to take off. CA Technologies has seeded Cloud Commons with information on service levels and experts, but the community must take off in its own right. One interesting aspect of SMI is that, in addition to the formal metric, a community-based “star rating” is provided (rather like Amazon’s community book reviews/feedback), and you can slice and dice this, so that you might look for services rated five stars by low-level developers or by security officers, as appropriate. I do have some doubts about star ratings, except as a community-building exercise, because I’d think they could easily be “gamed”. If Apple, say, comes up with a cloud service it could probably mobilise a multitude of fans to give it five-star ratings (not that it would, of course); a start-up might find this harder. However, if the community becomes strong it will become self-policing and such gaming can rebound on its perpetrators.
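Cloud Commons itself handles this filtering, of course, but a rough picture of what slicing the star ratings by reviewer role might look like could be sketched as follows. The record fields, roles and service name are hypothetical; Cloud Commons’ actual data model isn’t published.

```python
from statistics import mean

# Hypothetical rating records: the fields, roles and service name below
# are illustrative only, not Cloud Commons' real schema.
ratings = [
    {"service": "ExampleCloud", "stars": 5, "role": "developer"},
    {"service": "ExampleCloud", "stars": 3, "role": "security officer"},
    {"service": "ExampleCloud", "stars": 4, "role": "security officer"},
    {"service": "ExampleCloud", "stars": 5, "role": "developer"},
]

def stars_by_role(ratings, service, role):
    """Average star rating for one service, as seen by one reviewer role."""
    relevant = [r["stars"] for r in ratings
                if r["service"] == service and r["role"] == role]
    return mean(relevant) if relevant else None

print(stars_by_role(ratings, "ExampleCloud", "developer"))         # developers' view
print(stars_by_role(ratings, "ExampleCloud", "security officer"))  # security officers' view
```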

Perhaps the key enabler for SMI, however, is that CA is stepping back and giving it all away to the SMI Consortium at Carnegie Mellon (the home of the Software Engineering Institute and the CMMI standard) to look after. Jeff Perdue, Senior Scientist at Carnegie Mellon Silicon Valley, talked about this at CA World and is obviously used to this sort of metric and to managing its effectiveness. Perdue says that the SMI is mostly made up of metrics that could be independently measured, so if an independent body wanted to collect the measures on service providers and certify their SMI scores, that would be possible. There are no current plans to do so, but the goal of SMI is definitely to have an objectively verifiable method for collecting measures and determining the resulting score.

We need something like this index because “professional” cloud computing will need some way of comparing cloud service levels without vendor bias. Today, if two business partners want to exchange information electronically they will, or should, do some form of “due diligence”: are my partner’s systems well-managed; if I come to rely on them, are they appropriately robust and resilient; does my partner have business continuity procedures in place; does it have appropriate security, so that linking to it won’t put my systems and information at risk? If my partners’ systems are entirely in-house, this gives me a (possibly illusory) confidence which can be supplemented by noting compliance with ITIL, the ISO 27000 security standards and so on. However, if my partners’ systems (or some of them) are in the cloud, or if I wish to obtain services from the cloud, it has been difficult to evaluate service levels in any very objective way, which can make it hard for me to gain confidence that cloud services aren’t adversely affecting the quality, reliability or security of my systems in general.

The SMI should let me compare different cloud offerings, according to my priorities and along several dimensions, and select whatever is appropriate to my needs. If a partner wishes to reassure themselves that I am a safe and appropriate partner for an electronic connection, I can include the SMI scores for any cloud services in the negotiation as a basis for discussion.

Do I think SMI will take off? Well, the chances are about 50:50, I guess. CA Technologies has done the right thing by handing it off to Carnegie Mellon and the Cloud Commons community should help the industry to take ownership of it. However, some vendors will doubtless try to “game” the SMI numbers just as some tried to “game” CMMI maturity levels (claiming a maturity level for the whole organisation following appraisal of a very small part of it, for example) and this will reduce confidence in the initiative unless it can be nipped in the bud. Success probably depends on the Cloud Commons community taking off and that’s a social networking and people thing that’s hard to predict.

Still, I think something like the SMI is essential if professional cloud computing is to become accepted as part of the business automation tool-kit for mission-critical services: “you can’t control what you can’t measure” as Tom DeMarco said. I’m cautiously optimistic—Carnegie Mellon has a lot of experience with this sort of thing and the industry does need something like the SMI.