Abstractions in MDD

Written By: David Norfolk
Published: 28th March, 2008
Content Copyright © 2008 Bloor. All Rights Reserved.

If there is a problem with automated development tools (with what used to be called CASE - Computer Aided Software Engineering - and is now called something like automated MDD - Model Driven Development), it is that they are too close to the physical code. Of course, in some ways this is a good thing, as it means that programmers can relate easily to their tools; but in other ways, it can discourage innovation and limit reuse of ideas across different domains.

People model the Current Physical system, using domain-specific descriptive language, and end up automating the status quo - which is only sometimes what the business needs.

Progress in IT is a history of increasing abstraction away from the physical hardware underlying any automated business system. Modern service delivery (from, for example, SOA - "Service Oriented Architectures") is all about abstracting the details of the technology underlying automated business services. The business sees a business service, such as a credit check or customer acquisition, with an associated SLA (Service Level Agreement). It doesn't care whether the service is provided by a Web 2.0 Mashup or a mainframe DB2 database system (well, it might care if the SLA isn't specified in sufficient detail, but that's an "implementation issue").

Automated development tools suffer from a similar lack of abstraction. Whether it's an integrated development environment (IDE) for coding or something covering more of the Application Lifecycle, such tools tend to imply a process. In fact, it is often quite hard to unravel the process from the tool, and you may find that (whatever a tool's claims to "process neutrality") the process a tools vendor talks about happens to correspond 1:1 with what its toolset does; and when it acquires (or develops) a new tool, the process magically expands to accommodate the new deliverables.

Of course, what you really need is an abstracted, normative, whole-lifecycle development "meta-process" (small enough to understand), instantiated as a practical development process that suits your needs; complete with physical deliverables; and controls that address the possible imperfections of human processes and technology. This real-world physical process will be much larger and more complicated than the abstracted "meta-process", especially if it has to cope with a wide range of physical environments. However, if you validate this "meta-process" and the translation process to its physical instantiation (both comparatively easy) you can have sufficient confidence in the eventual (physical) automated business service delivery process you use for practical systems development.

Then you can slot in tools which automate parts of this process, where they are cost effective, which helps institutionalise the process and reduces any overheads associated with using a formal process (although the major cost justification for this will be the savings associated with "getting it right first time" with no need for rework).

Unfortunately, many automated development tools seem to have been built the other way round, to automate current development "good practice"; with the process behind them built as a bit of an afterthought. So, for example, although the Rational toolset, say, is powerful and potentially process-neutral, it's usually thought of in conjunction with the RUP (Rational Unified Process) rather than the vendor-neutral Unified Process (although a Unified Process does exist). There's now also a vendor-independent Essential Unified Process (see Dave Thomas' 2006 article here) but this is going its own way (and could fall into the trap of aligning itself with Microsoft technology, although it seems reasonably vendor-neutral for now).

That is all a slight oversimplification - most tools were developed top down and bottom up at the same time - but it seems to correspond roughly to the status quo in practice. An honourable exception to this may be the OMG-supported MDA (Model Driven Architecture) standard. Although this is based on practical experience with UML and even the Unified Process, it has a proper open-standard meta-model behind it (the Meta Object Facility, MOF), and vendors of MDA-compliant toolsets (Compuware's OptimalJ, for example) have to adhere to external standards and can't disguise any gaps in their offering by "tuning" the process for themselves.

However, there's an alternative to building "universal" development tools that are expected to cope with any situation you might come across, in any business area. This alternative approach is exemplified by the relatively unknown OPEN (Object-oriented Process, Environment and Notation) process. This entails the adoption of an open "meta-model" for the development process and building tools specifically for each project, for the precise domain you are developing in. Of course, once you've built the first tool, many of its components can be reused for other projects, so (although there is a barrier to adoption, unless the tool-building technology comes with pre-built components and templates) this can be extremely cost effective.

The advantage of this approach (apart from its not being tied to particular technology or vendor) is that your domain-specific toolset only needs to be able to address the needs and issues in a specific domain, rather than carrying the overheads associated with all possible issues in all possible domains. In addition, the existence of the abstracted meta-model ensures that you aren't too blinkered by the needs of a particular domain.
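To make the domain-restriction argument concrete, here is a loose, hypothetical sketch (not any vendor's notation - the model shape and the rules are invented for illustration) of why a model confined to one narrow domain is easy to check mechanically: the model can only express a handful of domain concepts, so the domain rules can be enforced in a few lines.

```python
# A hypothetical domain-specific model for a phone-menu domain,
# expressed as plain data. Illustrative only - not MetaEdit+ or
# Microsoft DSL Tools notation.

phone_menu = {
    "name": "MainMenu",
    "options": [
        {"key": "1", "label": "Check balance"},
        {"key": "2", "label": "Transfer funds"},
    ],
}

def validate(model):
    """Reject models that break the (illustrative) domain rules."""
    if not model["options"]:
        raise ValueError("a menu needs at least one option")
    keys = [opt["key"] for opt in model["options"]]
    if len(keys) != len(set(keys)):
        raise ValueError("duplicate menu keys")
    return model

validate(phone_menu)  # passes: the model obeys the domain rules
```

A "universal" modelling tool would have to validate arbitrary constructs; a domain-specific one only has to know about menus, keys and labels.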

This is rather where Microsoft is coming from with its Domain Specific Language (DSL) initiative - find out more on its Visual Studio Extensibility page (be careful to distinguish the terms making up the DSL acronym from the usual English usage of these terms). It is probably fair to say that Microsoft's comparatively recent interest in DSLs has considerably raised their profile in the IT industry generally.

Nevertheless, a more mature example of this approach is MetaEdit+ from MetaCase, which released a new version of its tool at the beginning of April 2008. One distinguishing characteristic of the MetaEdit+ approach is that MetaCase expects all its customers to build their own "domain specific models" (DSMs); whereas Microsoft originally expected Systems Integrators and its large enterprise customers to build DSLs, partly because the Microsoft tools and process, being then very new, weren't as easy to use as it had hoped. According to Prashant Sridharan (a former group product manager in Microsoft's tools group), reported on March 22nd 2004 here: "The value is there only if you're going to see a massive amount of productivity out of it, because it is a massive amount of effort to do it."

However, Microsoft has put a lot of effort into the usability of its DSL tools since then; and, for example, some of the most popular "how to" videos for VS extensibility from its developer community (on the Microsoft site here) concern DSLs and associated code generation.

MetaCase sells development tools mostly into the Telecoms and embedded space. It had years of experience with what Microsoft calls DSLs before they became part of the Microsoft mainstream - Tim Anderson's interview with Microsoft's software architect Jack Greenfield concerning Software Factories and DSLs, from March 2007 in Reg Developer here, provides some background for the evolving interest in DSLs within Microsoft. MetaCase's approach makes building its DSMs easy, partly because it has the sort of customers it can't dictate to; which is probably why it also produces a rather more open platform than Microsoft's .NET Framework (for example, it uses Smalltalk internally, which automatically gives it multiplatform capabilities).

MetaCase prides itself on being fast and cheap - typical developments to implement domain-specific modelling languages and code generators take less than 7 man-days. This is because you work from a supplied framework/template - not from scratch - using a mature product. This sounds reasonable to us: according to Panasonic, the DSM approach is 5 times more productive than traditional modelling approaches. Similarly, Nokia has reported a ten-fold productivity increase when moving to DSM from earlier manual coding practices. MetaEdit+ does appear to provide a rich development environment. A library of existing metamodels is provided, reducing start-up overheads; and code generation is well supported - because the DSM is defined within a restricted domain (code generation from generalised models is harder, because it has to be completely general).
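The point about restricted domains making code generation tractable can be sketched in a few lines. This is emphatically not MetaEdit+'s generator language - the model shape, the device name and the output format are all invented for illustration - but it shows the general template-driven pattern: a generator over a narrow domain only has to handle the few concepts the model can express.

```python
# A minimal sketch of template-driven code generation from a domain
# model. All names here are illustrative assumptions, not any
# vendor's notation.
from string import Template

model = {
    "device": "Thermostat",
    "states": ["Idle", "Heating", "Cooling"],
}

# One template per domain concept is enough, because the domain
# model can only contain devices and states.
state_line = Template('    $name = "$name"')

generated = "\n".join(
    [f"class {model['device']}States:"]
    + [state_line.substitute(name=s) for s in model["states"]]
)
print(generated)
# class ThermostatStates:
#     Idle = "Idle"
#     Heating = "Heating"
#     Cooling = "Cooling"
```

A generator for arbitrary UML models, by contrast, would have to cope with every construct the general-purpose notation allows.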

One typical use is in infotainment systems in cars, where developers will work with an emulator and get immediate feedback on the evolving models. In such a real-life development, you may have several roles - artists, behaviourists, programmers - each with their own specific DSM but generating code (in different languages, if appropriate) for one product.

Juha-Pekka Tolvanen (CEO, MetaCase) claims that MetaEdit+ is a second generation tool, MetaCase having scrapped and rebuilt its first generation tool (although it retained the ideas); he says Microsoft's current DSL tools are equivalent to its first generation. Nevertheless, Microsoft is developing its DSL offering too, so we might expect to see updated tools from Microsoft in the reasonably near future, although we can only talk about what is available now. MetaCase's DSM generation is template-driven, which helps to reduce the overhead for building the models, although this should only affect the architects anyway. Building the DSM does help formalise thought processes and because code generation is built-in, users can easily try out new models by generating prototypes.

MetaEdit+ has close integration with Visual Studio (VS), but not as close as a "plug-in" - you can't run MetaEdit+ from inside the VS IDE. MetaCase is, however, looking at closer integration with other tools, including Eclipse. Nevertheless, there are real problems integrating with IDEs that often can't recognise modelling constructs - you can, e.g., end up with one screen that supports UNDO alongside a screen that doesn't (and should). SAP is apparently doing work in this area and finding it hard.

Integrating fully with Visual Studio has similar issues to integrating with Eclipse - plus, possibly, extra issues arising from Microsoft's support for very separate DSLs rather than an integrated model (Microsoft's own DSL tools now integrate very well with Visual Studio, of course).

MetaEdit+ has partial support for UML, although it supports features not available in standard UML. However, it can import many existing notations, which facilitates reuse of existing models and makes MetaEdit+ easier to use. Nevertheless, since UML 2.0 is an extensible language built on top of a formal meta-model, and MetaEdit+ espouses a "Model Driven Development" approach, we can't help feeling that a more complete integration of MetaEdit+'s DSMs with the OMG's MDA approach (and its underlying MOF metamodel) should be possible. Perhaps MetaCase's attitude to MDA is still somewhat influenced by its having developed the MetaEdit+ approach independently.

Despite its main adoption being in the embedded/telecoms space, MetaCase has had some success in banking and similar sectors - but largely with end-user models for domain experts, rather than with programmers. EADS has also adopted MetaEdit+ and its experience confirms our view of the benefits available from a DSM approach. Nevertheless, it also highlights the fact that this approach requires a manageable upfront investment in the interests of future benefit. Initially, EADS was unsure of the DSM approach and recognised that it had to adapt to a new way of working.

Using a DSM approach implies that you are prepared to invest in building the models for your project before you build the software itself. Typically, the first project comes in over budget; the second project can make use of the pre-built DSM and comes in on budget; and the third and subsequent projects reuse artefacts produced earlier and come in well under budget. However, this won't work if you manage projects one at a time (because someone will do the first project cheaper without the DSM); you need to adopt "program management" of several projects which share resources. In turn, this implies that the DSM user is a mature organisation, which already has a re-use culture.

Another issue is that many developers don't naturally think in abstractions; and you also need to design top down and bottom up at the same time. This implies an "architect role" - itself, perhaps, an indicator of development process maturity - for a few people who can think both logically and physically at the same time and thus help to mentor development. A further issue is that, to manage delivery of overall benefit from a program involving upfront investment in the DSL approach, you need to be a metrics-focussed company. It's all very well to claim that any initial investment has been paid back many times, but it is even better if you can prove it.

And, now, a final question: did Microsoft copy MetaCase's approach? Almost certainly not; partly because today's DSLs have many roots, but partly because Microsoft's lawyers are extremely loath to let its developers look at anything developed outside Redmond, just in case someone later sues Microsoft for IP theft. Possibly, this means that Microsoft provides independent intellectual innovation, but we (along with Newton) have always thought that progress comes from "dwarves standing on the shoulders of giants". However, Microsoft is keen to point out to us that it is now more open to outside developments (such as those from the Open Source community) than it used to be, and cites the recently-updated Microsoft Interoperability Pledge in support of this. Apparently, it has just posted an additional 10,000 pages of protocol documentation on the Microsoft Developer Network (MSDN), "bringing the cumulative total to 40,000 pages of protocol documentation committed in its interoperability principles". This is impressive: "weight of paper" is an interesting interoperability metric, one that we might not have thought of ourselves.

Microsoft is seeking feedback on its interoperability efforts; we'd observe that, at a reading speed of 5 mins/page, reading 40,000 pages will take around 139 days - not including any breaks for sleeping, weekends or holidays (getting on for 2 years of normal working days) - so the devil, as far as achieving interoperability goes, may well reside in the detail.
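The reading-time estimate can be reproduced with a couple of lines of arithmetic, assuming 24-hour days for the elapsed figure and 8-hour days for the working-day figure:

```python
# Checking the back-of-envelope arithmetic on the 40,000 pages.
pages = 40_000
mins_per_page = 5

total_hours = pages * mins_per_page / 60   # ~3,333 hours of reading
continuous_days = total_hours / 24         # round-the-clock days
working_days = total_hours / 8             # eight-hour working days

print(round(continuous_days, 1))   # 138.9 - the "around 139 days"
print(round(working_days))         # 417 working days
```

At roughly 220 working days a year, 417 working days is indeed getting on for two years.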

Anyway, back to our main topic, if you are looking at Microsoft's DSLs as a customer, remember that (despite all the DSL activity in its Visual Studio Extensions site) Microsoft doesn't claim to have "invented" DSLs; we think you'd be well advised to check out MetaCase's alternative approach at the same time.
