
Legacy Data: Migration and Integration

Lift what is needed

Derive operational analytics from legacy data and explore inbound collaborative data integration options. Typical data migration use cases include policy data migration, claims and commissions data integration, new-age expansion of legacy data warehouses, and transactional data replication.

Perform Seamless Data Migration with:

1. Lift and Shift Migration Model:
Lift-and-shift data migration typically follows a script-based approach and does not require any structural changes to the target data marts.
Approach to move:

Standardized data migration follows a three-step approach (a minimal sketch of the flow appears after the list):

  • Export scripts: define the data to be picked up from the source database or source systems
  • Intermediary staging and cleansing: cleanse the intermediary data files, standardize them to the metadata format, and run redundancy checks
  • Import scripts: load and import the finalized data sets into the target database or data mart
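
As a rough illustration, the three steps could look like the following minimal Python sketch. It assumes hypothetical source and target SQLite databases with a "policies" table; every table, column, and file name here is illustrative rather than tied to any specific system.

    # Script-based lift and shift: export -> stage and cleanse -> import.
    import csv
    import sqlite3

    STAGE_FILE = "policies_stage.csv"  # intermediary staging file

    def export_to_stage(source_db: str) -> None:
        """Export script: pull raw rows from the source system into a flat file."""
        with sqlite3.connect(source_db) as conn, open(STAGE_FILE, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["policy_id", "holder_name", "premium"])
            for row in conn.execute("SELECT policy_id, holder_name, premium FROM policies"):
                writer.writerow(row)

    def cleanse_stage() -> list[tuple]:
        """Intermediary stage: standardize to the metadata format, drop duplicates."""
        seen, cleansed = set(), []
        with open(STAGE_FILE, newline="") as f:
            for rec in csv.DictReader(f):
                key = rec["policy_id"].strip()
                if not key or key in seen:  # redundancy check
                    continue
                seen.add(key)
                cleansed.append((key, rec["holder_name"].strip().title(),
                                 round(float(rec["premium"]), 2)))
        return cleansed

    def import_to_target(target_db: str, rows: list[tuple]) -> None:
        """Import script: load the finalized data set into the target data mart."""
        with sqlite3.connect(target_db) as conn:
            conn.execute("CREATE TABLE IF NOT EXISTS policies "
                         "(policy_id TEXT PRIMARY KEY, holder_name TEXT, premium REAL)")
            conn.executemany("INSERT OR REPLACE INTO policies VALUES (?, ?, ?)", rows)

    if __name__ == "__main__":
        export_to_stage("source.db")
        import_to_target("target.db", cleanse_stage())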
Testing to validate the migration (a minimal validation sketch follows the list):
  • Completeness testing
  • Data correctness
  • Data integrity
  • Business rules testing
  • Data reconciliation checks
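
These checks can be automated against control totals. Below is a minimal sketch, reusing the hypothetical SQLite databases from the previous example; note that when cleansing drops records, the expected row count for the completeness check is the post-cleansing figure.

    # Post-migration validation: completeness, integrity, and reconciliation.
    import sqlite3

    def validate(source_db: str, target_db: str) -> dict:
        results = {}
        with sqlite3.connect(source_db) as src, sqlite3.connect(target_db) as tgt:
            # Completeness: every expected source row is accounted for in the target.
            src_count = src.execute("SELECT COUNT(*) FROM policies").fetchone()[0]
            tgt_count = tgt.execute("SELECT COUNT(*) FROM policies").fetchone()[0]
            results["completeness"] = src_count == tgt_count

            # Reconciliation: aggregate control totals match across both systems.
            src_sum = src.execute("SELECT ROUND(SUM(premium), 2) FROM policies").fetchone()[0]
            tgt_sum = tgt.execute("SELECT ROUND(SUM(premium), 2) FROM policies").fetchone()[0]
            results["reconciliation"] = src_sum == tgt_sum

            # Integrity: no null or empty business keys in the target.
            nulls = tgt.execute("SELECT COUNT(*) FROM policies "
                                "WHERE policy_id IS NULL OR policy_id = ''").fetchone()[0]
            results["integrity"] = nulls == 0
        return results

    print(validate("source.db", "target.db"))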

2. Object Data Migration Model:

The object data migration model is central to moving logical data. The following use cases are the best fit for our object data migration model:

  • Policy data or claims data for various insurance products
  • Sales POS data from end systems to an integrated data store
  • Broker and sales org data from legacy systems into Duck Creek and One Shield systems
  • Consolidation of reporting or legacy data about products / packages
The object data migration model comprises five major phases:
Pre-assessment and Foundation:

The first level of activities is driven by the business flow model, along with analysis of interdependencies, data consistency, object-relation mapping, and standard logical grouping.

Further to this, the foundation phase is used to carve out the data use cases and automation strategy, including a high-level estimation and timeline for implementation.
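
One way to support the logical grouping activity is to treat the discovered interdependencies as a graph and read candidate migration units off its connected components. The sketch below assumes the dependency pairs have already been extracted during pre-assessment; the table relationships shown are hypothetical.

    # Object-relation mapping: group interdependent tables into logical units.
    from collections import defaultdict

    # Hypothetical foreign-key relationships discovered during pre-assessment.
    relations = [("claims", "policies"), ("commissions", "policies"),
                 ("brokers", "sales_org"), ("policies", "products")]

    graph = defaultdict(set)
    for child, parent in relations:
        graph[child].add(parent)
        graph[parent].add(child)

    def logical_groups(graph: dict) -> list:
        """Group tables into connected components - candidate migration units."""
        seen, groups = set(), []
        for node in graph:
            if node in seen:
                continue
            stack, component = [node], set()
            while stack:
                current = stack.pop()
                if current in component:
                    continue
                component.add(current)
                stack.extend(graph[current] - component)
            seen |= component
            groups.append(component)
        return groups

    print(logical_groups(graph))
    # e.g. [{'claims', 'policies', 'commissions', 'products'}, {'brokers', 'sales_org'}]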

Object Data Modelling:

Object data modelling involves the creation of a logical MVP data model, which serves as the schema for the logical unit being extracted. The essential test of an MVP is that it can stand by itself as an enterprise component.

The other steps in creating the data model are data clean-up and profiling. Once those are complete, the data harmonization process is performed, and the proof-of-concept strategy, implementation planning, and dependency diligence are carried out.
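
As an illustration, a logical MVP data model can be expressed as plain data classes, so that the unit carries its own identity, relationships, and validation rules independently of the source schema. The policy object and its fields below are hypothetical.

    # A logical MVP data model: a self-contained policy unit with its own rules.
    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class Coverage:
        code: str
        limit: float

    @dataclass
    class Policy:
        policy_id: str
        holder_name: str
        effective_date: date
        premium: float
        coverages: list[Coverage] = field(default_factory=list)

        def validate(self) -> bool:
            """Clean-up and profiling rules expressed against the model itself."""
            return bool(self.policy_id) and self.premium >= 0

    policy = Policy("POL-001", "Jane Doe", date(2024, 1, 1), 1200.0,
                    [Coverage("FIRE", 500_000.0)])
    assert policy.validate()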

Data extraction and Migration:

During this phase, the scripts mapped to the object data model are leveraged to perform the data extraction and migrate the data to the To-Be state.
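
A minimal sketch of such a mapped extraction script follows. The column-to-attribute mapping is hypothetical; in practice it would come out of the object data modelling phase.

    # Extraction mapped to the object data model: rename and coerce source columns.
    from datetime import date

    # Hypothetical column-to-attribute mapping carved out during modelling.
    FIELD_MAP = {"POL_NO": "policy_id", "INSURED_NM": "holder_name",
                 "EFF_DT": "effective_date", "PREM_AMT": "premium"}

    def to_object(row: dict) -> dict:
        """Apply the mapped script to one source row."""
        obj = {FIELD_MAP[col]: val for col, val in row.items() if col in FIELD_MAP}
        obj["effective_date"] = date.fromisoformat(obj["effective_date"])
        obj["premium"] = float(obj["premium"])
        return obj

    source_rows = [{"POL_NO": "POL-001", "INSURED_NM": "Jane Doe",
                    "EFF_DT": "2024-01-01", "PREM_AMT": "1200.00"}]
    migrated = [to_object(r) for r in source_rows]  # ready to load into the To-Be store
    print(migrated)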

Validation and Verification:

As the data is migrated and loaded into the target schemas, data correctness checks, data completeness tests, and straight-through processing of the MVP use cases are performed to ascertain the validity of the data move.
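
A record-level As-Is versus To-Be comparison can drive these checks, and it also produces the raw material for the reconciliation report used in model acceptance. The sketch below assumes both sides can be keyed by a hypothetical policy_id.

    # As-Is vs To-Be verification: classify every discrepancy between record sets.
    def reconcile(as_is: dict, to_be: dict) -> dict:
        report = {"missing_in_target": [], "mismatched": [], "matched": 0}
        for key, src_rec in as_is.items():
            tgt_rec = to_be.get(key)
            if tgt_rec is None:
                report["missing_in_target"].append(key)
            elif tgt_rec != src_rec:
                report["mismatched"].append(key)
            else:
                report["matched"] += 1
        return report

    as_is = {"POL-001": ("Jane Doe", 1200.0), "POL-002": ("John Roe", 800.0)}
    to_be = {"POL-001": ("Jane Doe", 1200.0)}
    print(reconcile(as_is, to_be))
    # {'missing_in_target': ['POL-002'], 'mismatched': [], 'matched': 1}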

Model Acceptance:

Once the To-Be and As-Is reconciliation report is analysed and consolidated, the object data model is further refined and normalized based on the learnings and findings from the process. This is followed by the release of control reports and sign-off on the process.

Ready for an in-depth analysis of your data? Write to us.