Data migration is the process of moving data from one environment to a new one. This distinguishes it from data integration, ETL and replication, which are primarily concerned with moving data between existing environments, although those technologies may be used to support the migration process. Data migration is often undertaken as part of a broader application migration (for example, migrating from SAP to Oracle, consolidating SAP environments or migrating from one version of SAP to another) but may also support a move from one database to another or between major upgrades of a database. The implementation of master data management may also require data migration.
Note that data migration is not, in itself, a technology but rather a specialised task that needs to be supported by a variety of tools and techniques.
Data migration is a discipline or process that employs a range of technologies to ensure a smooth migration from an existing environment to a new one. Best practice suggests that data migration will require data profiling and discovery tools and data quality capabilities, and may also involve ETL, data archival and data masking technology. Test data management may also be a requirement in some instances, as may replication in order to support what are known as zero-downtime migrations. Note that ETL is not strictly required, as some tools in the data quality space can be used to transform data and prepare suitable application or database load files.
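To illustrate the last point, the sketch below shows how data quality rules can be applied to source records and the results written straight to a delimited load file for the target application's own bulk loader, with no ETL engine in the middle. All record layouts, field names and cleansing rules here are hypothetical, chosen purely for illustration:

```python
import csv
import io

# Hypothetical source extract: the column names and values are
# illustrative only, not any particular application's schema.
source_rows = [
    {"cust_id": "001", "name": " alice smith ", "country": "uk"},
    {"cust_id": "002", "name": "Bob Jones", "country": "UK"},
]

def cleanse(row):
    """Apply simple data quality rules ahead of the target load."""
    return {
        "CUSTOMER_ID": row["cust_id"].zfill(6),    # pad to target key format
        "FULL_NAME": row["name"].strip().title(),  # trim and standardise case
        "COUNTRY_CODE": row["country"].upper(),    # normalise country codes
    }

# Write a delimited load file that the target's own bulk import
# utility can consume -- no ETL engine involved.
buffer = io.StringIO()
writer = csv.DictWriter(
    buffer, fieldnames=["CUSTOMER_ID", "FULL_NAME", "COUNTRY_CODE"]
)
writer.writeheader()
for row in source_rows:
    writer.writerow(cleanse(row))

load_file = buffer.getvalue()
print(load_file)
```

In practice the cleansing rules would come from a profiling exercise against the real source data, but the overall shape (profile, cleanse, emit a load file) is the approach the text describes.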
When migrating to SaaS (software as a service) environments it is important for data migration to be automated as much as possible, especially where these applications have been acquired directly by the business rather than via IT. Recent research suggests that over 50% of SaaS deployments have significant problems around this issue while almost 20% of such projects are cancelled.
Data migration is an enabling discipline. Business users will decide that it makes business sense to migrate to the new version of SAP or to move from a SQL Server database to Oracle and it will then be up to the CIO or relevant IT personnel to determine what tools to use for this purpose.
That said, research suggests that only two-thirds of data migration projects are brought in on time and on budget. This means that the CFO should certainly care about major projects and so should the CEO if there is any potential damage to the company's image from any failed or late project. Apart from the use of technology, major contributory factors to success are:
- The business must be involved at all stages throughout the migration process
- A tried and tested methodology should be used
- The organisation should acquire at least some in-house knowledge and expertise with respect to migrations; this should not be left solely to third parties, since you cannot exercise oversight without understanding the processes involved
Business stakeholders in particular need to care about relevant migration projects.
So-called zero-downtime migration was a major trend a couple of years ago but we are hearing less about this in the market. This is primarily because GoldenGate, one of the major exponents of this approach, was acquired by Oracle (but not, primarily, for this functionality). The other vendor actively pursuing this subset of the market was Celona but this company has gone out of business. Informatica and IBM (for example) offer some sort of zero-downtime migration capabilities, using replication, but do not offer the ability to capture failback logic for applications.
What has changed more recently is the realisation that it makes sense to archive some of your old data at the same time as you are migrating the remainder: after all, this is just another kind of migration. Secondly, users are increasingly aware of the governance and compliance issue involved in migrations and, in particular, the requirement for data masking during the migration process.
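On the masking point, one common requirement is that masked values remain consistent across tables so that relationships survive the migration. The sketch below shows one way this is often done, via deterministic, salted hashing; the salt, token length and record layouts are illustrative assumptions, not any vendor's actual scheme:

```python
import hashlib

def mask_value(value, salt="migration-salt"):
    """Deterministically mask a sensitive value.

    The same input always yields the same token, so key relationships
    between migrated tables are preserved, while the original value is
    not readable in the target environment. The salt and the 12-character
    token length are arbitrary illustrative choices.
    """
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    return digest[:12]

# Hypothetical related records sharing an email address as the link.
customers = [{"id": 1, "email": "alice@example.com"}]
orders = [{"order_id": 10, "customer_email": "alice@example.com"}]

for c in customers:
    c["email"] = mask_value(c["email"])
for o in orders:
    o["customer_email"] = mask_value(o["customer_email"])

# Masked values still match across tables, so joins survive the migration.
assert customers[0]["email"] == orders[0]["customer_email"]
```

A production masking tool would add format preservation, reversibility controls and audit trails, but the consistency property shown here is the core governance requirement.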
As noted, GoldenGate (Oracle) is not as active in this market as it once was, and Celona has disappeared. In other respects there are not many companies that specifically target data migrations—it is generally considered to be the domain of data integration vendors—but both SAP and Informatica have, at least at times, formally targeted it. Nevertheless, mainstream vendors have been broadening their portfolios to include all the necessary and optional products required to support data migration though some, notably Informatica, are probably ahead of the market in this respect.
Another company that specifically targets data migrations, and has multiple case studies to prove it, is X88. This is interesting because its Pandora product does not have any ETL capabilities but instead prepares relevant load files. While perhaps not suitable for the most complex migrations, this approach will certainly work for the vast majority. Other companies adopting a similar approach to X88 include Experian Data Quality, Datactics, Trillium and Innovative Software.
Where there has been some change is with companies like Actian (previously Pervasive) and Dell (Boomi) targeting ISVs that provide SaaS applications, the idea being that migrations between popular environments (Siebel to SugarCRM for instance) can be largely automated by embedding relevant connectors and pre-built transformation functions.
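The "pre-built transformation" idea amounts to shipping a ready-made mapping between the two applications' schemas so that most fields move across without manual design work. The fragment below is a deliberately simplified sketch of that idea; the field names are invented for illustration, and a real connector from a vendor in this space would also handle API connectivity, paging and error handling:

```python
# Hypothetical pre-built field mapping from a source CRM contact record
# to a target CRM's contact fields. These names are illustrative only.
SOURCE_TO_TARGET = {
    "FST_NAME": "first_name",
    "LAST_NAME": "last_name",
    "EMAIL_ADDR": "email1",
}

def transform_contact(source_record):
    """Rename mapped fields; in a real project, unmapped fields would be
    flagged for manual attention rather than silently dropped."""
    return {
        target: source_record[source]
        for source, target in SOURCE_TO_TARGET.items()
        if source in source_record
    }

record = {"FST_NAME": "Alice", "LAST_NAME": "Smith", "EMAIL_ADDR": "a@example.com"}
migrated = transform_contact(record)
print(migrated)
```

Embedding mappings like this in the SaaS application itself is what allows such migrations to be largely automated.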
One more emerging trend is reuse. While reuse in the context of data governance is well known, Entota has developed a data migration portal, aimed principally at SAP migrations, that focuses on reuse at the target level, on the basis that the same target may serve subsequent migrations and/or may itself become the source of future migrations. This makes a lot of sense. Entota has recently been acquired by BackOffice Associates (BOA) and the latter has also, just recently, announced a partnership with SAP whereby SAP will be co-marketing and selling BOA's migration accelerator in conjunction with its own offerings: a particularly powerful combination for SAP environments.