Last but not least in this series on building a migration utility is ‘Post Migration Support’. During the migration execution phase, audit trails and logs will have been created to verify that all data has been migrated correctly and that the required synchronization has been achieved. Once the expected results have been achieved, the decision can be taken to transition users to the new system. Training on the new (target) product will be required before the transition.

There will also be ongoing data quality enhancement: data quality issues may continue to surface after go-live and will need rectifying, so support has to be provided to maintain data quality in the new system.

~ Ramya

In continuation of my previous post, this post is about building the migration wizard. The wizard would consist of two parts: the Data Mapping Tool and the Data Migration Wizard. The focus is on building a user-friendly tool that supports both phases of the data migration activity.

The Data Mapping Tool would cover the major data conversion requirements for migrating master data and transactional data from the source to the target system. The mapping should be designed with the help of functional experts. Once the business logic is defined and mapped, the tool completes the mapping activity and generates mapping files for the different product versions.
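
To make this concrete, here is a minimal sketch of what a generated mapping file and its application might look like. The JSON layout, column names and transforms below are purely illustrative assumptions, not the actual tool's format.

```python
import json

# Example content of one generated mapping file for a single subject area
# (source column -> target column plus a named transform). The layout and
# names here are hypothetical.
MAPPING_JSON = """
{
  "subject_area": "Customer Master",
  "target_version": "2.0",
  "columns": [
    {"source": "CUST_NO",   "target": "CustomerId",   "transform": "strip"},
    {"source": "CUST_NAME", "target": "CustomerName", "transform": "upper"},
    {"source": "CR_LIMIT",  "target": "CreditLimit",  "transform": "to_float"}
  ]
}
"""

# Business-logic transforms agreed with the functional experts.
TRANSFORMS = {
    "strip": lambda v: (v or "").strip(),
    "upper": lambda v: (v or "").strip().upper(),
    "to_float": lambda v: float(v or 0),
}

def map_record(source_row, mapping):
    """Apply the column mapping to one source record."""
    target_row = {}
    for col in mapping["columns"]:
        value = source_row.get(col["source"])
        target_row[col["target"]] = TRANSFORMS[col["transform"]](value)
    return target_row

mapping = json.loads(MAPPING_JSON)
print(map_record({"CUST_NO": " C001 ", "CUST_NAME": "Acme Corp", "CR_LIMIT": "5000"}, mapping))
```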

The Data Migration Wizard would be a user-friendly front end for the final phase of the data migration activity. The mapping files generated by the Data Mapping Tool are the inputs to the wizard. The wizard should have exception handling and reporting capabilities, and it is handed over to the customer to perform the data migration activity.
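
A rough sketch of how the wizard's exception handling and reporting could work is shown below; the record source, the migrate_record() placeholder and the report format are assumptions made purely for illustration.

```python
import csv
import logging

logging.basicConfig(filename="migration.log", level=logging.INFO)

def migrate_record(record):
    """Placeholder for the real push of one mapped record into the target system."""
    if not record.get("CustomerId"):
        raise ValueError("missing CustomerId")

def run_migration(records, report_path="exceptions.csv"):
    """Migrate all records, logging failures and writing an exception report."""
    migrated = failed = 0
    with open(report_path, "w", newline="") as report:
        writer = csv.writer(report)
        writer.writerow(["record", "error"])
        for record in records:
            try:
                migrate_record(record)
                migrated += 1
            except Exception as exc:      # report and continue; do not abort the whole run
                failed += 1
                logging.error("Failed to migrate %s: %s", record, exc)
                writer.writerow([record, exc])
    print(f"Migrated {migrated} records, {failed} exceptions (see {report_path})")

run_migration([{"CustomerId": "C001"}, {"CustomerName": "record without an id"}])
```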

Migration is not complete once the wizard is handed over to the customer. Post-migration support is crucial, since ‘one size does not fit all’. Check back for post-migration support details in the next post.

~ Ramya

Continuing my previous post detailing the phases of building a migration utility, this post is about the second phase, i.e. Data Assessment and Mapping. As part of the planning process the team takes an initial look at the source and target systems. The objective is to know enough about both systems to estimate the effort required to complete mapping and migration.

The biggest success factor in a data migration is the quality of the source data, which is also the biggest risk. The most important phase in a data migration project is the work needed to understand the data content. Our experience in building data migration utilities has made us realize that projects fail or are delayed when the data is discovered to be unfit for use. Getting a sample database from the client in the initial stages gives an early idea of the kinds of data issues that might arise. Data profiling technology can be used to systematically scan the tables of interest and quantify data quality and integrity issues such as missing values, out-of-range values and outliers, referential integrity problems, and the parsing required to map source columns to target columns.
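
As an illustration, here is a small profiling sketch. It assumes pandas is available and that the source tables of interest can be pulled into DataFrames; the table and column names are invented.

```python
import pandas as pd

def profile_table(df, key_columns=None):
    """Quantify basic data quality issues for one source table."""
    numeric = df.select_dtypes(include="number")
    return {
        "rows": len(df),
        "missing_values": df.isnull().sum().to_dict(),            # nulls per column
        "numeric_ranges": numeric.agg(["min", "max"]).to_dict(),  # spot bad ranges / outliers
        "duplicate_keys": int(df.duplicated(subset=key_columns).sum()) if key_columns else None,
    }

# Hypothetical sample pulled from the client's database.
customers = pd.DataFrame({
    "CUST_NO": ["C001", "C002", "C002", None],     # duplicate and missing keys
    "CR_LIMIT": [5000.0, None, 250.0, -10.0],      # a null and a suspicious negative value
})
print(profile_table(customers, key_columns=["CUST_NO"]))
```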

Some of the things to consider for data mapping would be:

  • Some of the target requirements will require significant transformations or the creation of new data content.
  • Whether each subject area is a straight table-to-table copy or needs SDK/API-based mapping.

For any mapping that is done, each subject area mapped must have a reconciliation check validating that all records extracted were processed. If the work of scoping, mapping and data quality assessment has been completed, building the migration utility becomes easier. More on building the utility in the next post.
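
A minimal sketch of such a reconciliation check is shown below, assuming per-subject-area extract and load counts are available from the migration logs; the figures are invented for illustration.

```python
def reconcile(subject_area, extracted, loaded, rejected):
    """Every record extracted from the source must be accounted for as loaded or rejected."""
    ok = loaded + rejected == extracted
    status = "OK" if ok else "MISMATCH - investigate before sign-off"
    print(f"{subject_area}: extracted={extracted}, loaded={loaded}, rejected={rejected} -> {status}")
    return ok

reconcile("Customer Master", extracted=10_000, loaded=9_985, rejected=15)   # balances
reconcile("Open Invoices", extracted=52_310, loaded=52_290, rejected=5)     # 15 records unaccounted for
```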

(Additional inputs taken from SAP Thought Leadership - Data Migration)

Continuing my previous post on building a migration utility, this post covers the first phase of the migration tool approach, i.e. System Assessment. During System Assessment the following questions need to be answered:

  • What are the key customer requirements in terms of functionality when migrating between source and target?
  • What are the module differences between the source and target?
  • What are the functionality changes within the common modules?
  • What type of data do the source and target support?
  • What are the basic input/output types and interface points in the different systems?
  • What are the external dependencies?
  • Do the products handle legacy file formats and/or high-value transactions?
  • Have the customers customized their source system? If yes, to what extent? Can the target system handle these customizations without further customization?
  • What is the historical data requirement? This will be required to run reports.
  • Will the migration extend to data for Add-on modules?
  • What is the hardware support difference between the systems?

This list is just a macro-level view of the hundreds of questions that will arise during System Assessment. The end result of the System Assessment is a defined ‘scope document’. The scope document becomes crucial in migration tool building because there is a strong possibility of scope creep during the actual implementation. Once the scope is defined, the stage is set for Data Assessment & Mapping. More in the next post.

As I wrote in my previous post, I am going to focus on building a migration utility for data migration. The actual tool-building process can be divided into four distinct phases:

Migration Tool Approach - 4 Phases

From a macro-level view there are four distinct phases. Just like any IT project, building the tool requires analysis, design, implementation and post-go-live support. In a typical IT project you would be assessing or building applications; in our case of product engineering, analysis requires assessing two different products and mapping the data between the source and target products. The tool being built also needs to support the different product versions of both source and target, so compared to an application migration this is definitely of a bigger scale.

For my next post I will be writing about the first phase: System Assessment. Till then, Happy Reading. 🙂

Business growth brings new challenges. Changing business scenarios result in new product versions being released with better features and new functionality. The challenge lies in upgrading from a lower version of a product to a higher version without losing track of critical information, while keeping the process user-friendly.

All of us have upgraded to a better version of some product. Firefox and IE introduced new browser versions quite a while ago, and we have all upgraded to a newer version with better features and new functionality. The data being migrated in that case is essentially the bookmarks or favourites, RSS feeds, etc.

Just imagine an ERP system with millions of records holding business critical information. How do users migrate to a higher product version without losing or modifying the existing data? How can the migration be done without much complication?

The solution lies in designing an easy-to-use migration tool. A detailed analysis of both the lower and higher versions of the product needs to be done, comparing their features and functionality at the database level and at the user interface level. Based on the results of that analysis, the tool needs to be designed and developed. The underlying factors to consider are the need for minimal user intervention and the ability to restart from any point without an entire rollback of data. This applies not just to an ERP but to any product upgrade. I have given only a high-level view of the solution, but as you dwell on it you will come across various challenges, or maybe a better solution. 🙂
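
As a rough sketch of the "restart from any point without a full rollback" idea, the migration can persist a checkpoint after each committed batch and resume from it on the next run. The file name and batch logic below are assumptions for illustration only.

```python
import json
import os

CHECKPOINT_FILE = "migration_checkpoint.json"

def load_checkpoint():
    """Return the index of the last successfully migrated batch, or -1 if none."""
    if os.path.exists(CHECKPOINT_FILE):
        with open(CHECKPOINT_FILE) as f:
            return json.load(f)["last_migrated_batch"]
    return -1

def save_checkpoint(batch_no):
    with open(CHECKPOINT_FILE, "w") as f:
        json.dump({"last_migrated_batch": batch_no}, f)

def migrate_batch(batch):
    """Placeholder for moving one committed batch of records to the target version."""
    print(f"migrating {len(batch)} records")

def migrate(batches):
    start = load_checkpoint() + 1            # a rerun resumes here instead of rolling everything back
    for batch_no in range(start, len(batches)):
        migrate_batch(batches[batch_no])
        save_checkpoint(batch_no)            # persist progress after each committed batch

migrate([[1, 2, 3], [4, 5], [6]])
```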