Fresh from the debacle at Terminal 5, which is a good example of why big bang approaches to migrations (and other implementations) can be dangerous, I have a number of disparate observations to make.
The first is that I recently spoke at a conference at which Informatica was also presenting. Listening to its presentation, I noted a slide showing the results of a survey into migration projects and the very high rates of project failure and overrun that characterise this market. The results were (to within a percentage point) identical to those published in the Bloor Research paper on this topic last year, but they were credited to the Standish Group. When I queried this, assuming that the attribution was in error, the company responded that it was in fact correct: the Standish Group's survey had been carried out in 1999. In other words, we have become no better as an industry at managing data migration in seven years, which is a pretty damning statement and one that only underlines my call for more migration specialists.
And on that point, you might like to check out www.datamigrationpro.com, which is a web site representing an informal special interest group for exactly this purpose.
On to products and vendors, a couple of which have attracted my attention recently.
The first is IBM Optim, the product that IBM gained through its recent acquisition of Princeton Softech. There are two interesting points here. The first is that Optim is being targeted at big bang migrations. Now, Optim is primarily an archiving product, so what does it have to do with migration? Well, the contention is that in many (though not all) migrations, much of the data you are migrating is not actually required, or not required immediately, so you can archive it off the source system prior to the migration. This cuts down the amount of data that needs to be migrated during cutover, thereby reducing the downtime window required. Subsequently, you can migrate the archived data separately and add it back into the new system, if it is needed, or simply leave it as an archive, if that is more appropriate. In either case, this seems a sensible idea that will be useful in some instances.
Secondly, Optim allows you to define business entities, so that you archive a customer (say) with all of his orders rather than just working on a tabular basis, which is a good way to approach data migration as well. This is exactly what Celona does, for example. Optim is not yet integrated with DataStage, but this is certainly a capability that IBM could develop to enhance its offering in this area.
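To make the entity idea concrete, here is a minimal sketch of what "archiving a customer with all of his orders" means in practice, as opposed to archiving each table independently. The table names, keys and data are invented for illustration; this is not Optim's or Celona's actual mechanism.

```python
# Entity-based archiving sketch: gather a customer together with every
# row that references it, so the archived unit is a business entity
# rather than a slice of a single table. All data here is invented.

customers = [
    {"customer_id": 1, "name": "Acme Ltd"},
    {"customer_id": 2, "name": "Globex"},
]
orders = [
    {"order_id": 10, "customer_id": 1, "total": 250.0},
    {"order_id": 11, "customer_id": 1, "total": 75.0},
    {"order_id": 12, "customer_id": 2, "total": 310.0},
]

def archive_customer_entity(customer_id):
    """Return the customer row plus every order row that belongs to it."""
    customer = next(c for c in customers if c["customer_id"] == customer_id)
    related = [o for o in orders if o["customer_id"] == customer_id]
    return {"customer": customer, "orders": related}

# Archiving customer 1 removes the customer *and* both of his orders
# from the migration workload in one consistent unit.
entity = archive_customer_entity(1)
```

The point is that the archive (and any later restore into the target system) stays referentially consistent: you never end up with orders whose customer was archived separately.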
The other vendor that I want to mention is Valiance Partners. This company has a product called TruCompare, which provides a test environment for comparing source and target (or source and expected target) data that is based on 100% testing of records rather than sampling. It is not dissimilar to GoldenGate's Veridata, although at this time I am not in a position to compare the two since I have not yet had a detailed briefing on TruCompare (when I have, I will report back). In any case, the value of either of these tools in helping to ensure accurate migrations should not be underestimated.
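The distinction between 100% testing and sampling is worth spelling out: checking every record, keyed on a primary key, will catch the odd missing or subtly corrupted row that a sample can easily miss. The following is an illustrative sketch of that idea only, with invented data; it does not describe how TruCompare or Veridata actually work.

```python
# Sketch of 100% record comparison: every source record is checked
# against the target, rather than a sample. Data is invented.

source = {
    101: {"name": "Alice", "balance": 40.0},
    102: {"name": "Bob", "balance": 15.5},
    103: {"name": "Carol", "balance": 99.9},
}
target = {
    101: {"name": "Alice", "balance": 40.0},
    102: {"name": "Bob", "balance": 15.0},  # value drifted during migration
    # record 103 is missing from the target entirely
}

def compare_all(source, target):
    """Check 100% of records; return keys that are missing or that differ."""
    missing = sorted(k for k in source if k not in target)
    mismatched = sorted(
        k for k in source if k in target and source[k] != target[k]
    )
    return missing, mismatched

missing, mismatched = compare_all(source, target)
```

A sample that happened to draw only record 101 would declare this migration clean; the exhaustive pass flags both the drifted value and the lost record, which is exactly why full comparison matters for migration sign-off.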
Thus we have a mixed bag, but it is encouraging that more vendors are paying attention to data migration as a market in its own right (and I have not even mentioned Business Objects' focus on SAP R/3 migrations).