Have you ever heard of comparison testing? It is much needed for projects that migrate an agent compensation system from a legacy platform to a modernized one. The approach can be extended to any sector, but the need is the same: the existing system is making payouts, and the new system should pay out exactly the same amounts, or the helpdesk team should have the details of the known differences between the two systems.
Let’s consider the situation: as an agent, you have been getting your paycheck from the existing legacy system. If you get $1 less from the new modernized system than what the existing system paid you, would you accept that? You would call the helpdesk, eager to know why your paycheck is less than expected. Conversely, if the modernized system pays out $1 more than the existing system did, the company’s balance sheet is dragged down and its profit and loss account is heavily affected. Neither scenario is acceptable. This blog walks through the steps of comparison testing and the factors that influence it.
Comparison testing compares the data output between the legacy system and the modernized (IBM Varicent ICM) system. The comparison can be done at the transaction level, agent level, and office level; it is suggested to start at the transaction level. One office has many managers, one manager has many agents, one agent has many policies, and one policy has many transactions (issue, re-issue, void, commission advance, commission chargeback, adjustments, termination, reinstatement, etc.).
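As a minimal sketch of transaction-level comparison rolled up to the agent level, consider the snippet below. The record keys, amounts, and tolerance are illustrative assumptions, not data from either system; in practice both sides would come from extracts of the legacy and IBM Varicent ICM outputs.

```python
from collections import defaultdict

# Hypothetical payout records keyed by (agent, policy, event);
# in a real run these come from extracts of both systems.
legacy = {
    ("AGT001", "POL123", "ISSUE"): 150.00,
    ("AGT001", "POL123", "CHARGEBACK"): -40.00,
    ("AGT002", "POL456", "ISSUE"): 200.00,
}
modern = {
    ("AGT001", "POL123", "ISSUE"): 150.00,
    ("AGT001", "POL123", "CHARGEBACK"): -35.00,  # a variance to catch
    ("AGT002", "POL456", "ISSUE"): 200.00,
}

def compare_transactions(legacy, modern, tolerance=0.005):
    """Return transaction keys whose payouts differ beyond the tolerance."""
    variances = {}
    for key in legacy.keys() | modern.keys():
        diff = round(modern.get(key, 0.0) - legacy.get(key, 0.0), 2)
        if abs(diff) > tolerance:
            variances[key] = diff
    return variances

def rollup_by_agent(variances):
    """Aggregate transaction-level variances to the agent level."""
    totals = defaultdict(float)
    for (agent, _policy, _event), diff in variances.items():
        totals[agent] += diff
    return dict(totals)

tx_variances = compare_transactions(legacy, modern)
agent_variances = rollup_by_agent(tx_variances)
```

Starting at the transaction level, as suggested above, means an agent-level variance can always be traced back to the individual transactions that caused it.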
Importance of comparison testing:
- Upstream systems (policy admin systems) were sending data based on their own requirements. The agency incentive management system has a business event matrix to receive data from the policy admin systems, and in most cases the business event matrix did not match the policy admin systems’ requirements, for many reasons. The upstream systems and the agent compensation system were effectively managed as two separate entities.
- Missed requirements, because the business drafted the requirements based on the existing system rather than on the needs of the new modernized system.
- System integration testing could cover the scenarios, but not to the extent of production scenarios with long sequencing such as payment mode change -> void and reissue -> termination -> chargeback -> reinstatement. All these transactions affect commission and bonus for agents, and system integration testing could not cover such long transaction sequences with mocked-up data.
- In certain instances the policy admin systems send event records differently for the same functionality: applicant name change and billing mode change trigger an event record in the IBM Varicent ICM system but not in the legacy system, while the net month commission period is sent to the legacy system but not to the IBM Varicent ICM system. Due to these differences between the two systems, IBM Varicent ICM had multiple issues that could only be unearthed in comparison testing.
- The legacy system did not keep track of the production problems that had occurred since it was implemented 30 years ago, so the requirements phase of the IBM Varicent ICM project was also flawed and could not capture all requirements.
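To illustrate why long event sequences matter, here is a minimal sketch. The flat 5% rate and the simplified event handlers are assumptions purely for illustration; real commission rules in the product layer are far richer.

```python
# A minimal sketch showing how a long production-like event sequence
# nets out to a commission figure that mocked-up SIT data rarely exercises.
def apply_events(events, rate=0.05):
    """Apply an ordered event sequence and return the net commission."""
    commission = 0.0
    for event, premium in events:
        if event in ("ISSUE", "REISSUE", "REINSTATEMENT"):
            commission += premium * rate
        elif event in ("VOID", "CHARGEBACK", "TERMINATION"):
            commission -= premium * rate
        # PAYMENT_MODE_CHANGE and similar events are no-ops in this sketch
    return round(commission, 2)

# The kind of long sequence described above, on a $1,000-premium policy:
sequence = [
    ("ISSUE", 1000.0),
    ("PAYMENT_MODE_CHANGE", 0.0),
    ("VOID", 1000.0), ("REISSUE", 1000.0),
    ("TERMINATION", 1000.0), ("CHARGEBACK", 1000.0),
    ("REINSTATEMENT", 1000.0),
]
net = apply_events(sequence)
```

Even in this toy version, the net result depends on every event in the chain, which is why mocked-up data covering only one or two events at a time cannot reproduce production behavior.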
Comparison Testing Process
Each step (from step 2 onward) passes its outstanding records to the next step until step 6 is reached. The complexity column indicates how easily a problem can be unearthed at that point in comparison testing.
Complexity in Comparison Testing
| Step | Activity | Complexity | Description | Stakeholders |
|---|---|---|---|---|
| 1 | Comparison | NA | Automate the extraction of comparison output between the IBM Varicent ICM system and the legacy system. Import the legacy system’s mainframe data into SQL-table-formatted data to compare with the IBM Varicent ICM system. | Legacy system business SME, legacy system developers, IBM Varicent ICM product developers and testers |
| 2 | Identify | Low / Medium | Identify problems in the policy admin systems layer, inbound layer, product layer (premium calculation, rate identification, advancing not happening, overpayment, underpayment, no payment, etc.), and legacy system history layer. | Legacy system business SME, legacy system developers, ETL developers, IBM Varicent ICM product developers and testers |
| 3 | Segregate | Medium / High | Identify data patterns for analysis based on dollar amount, high-volume transactions, new/renewals, etc., and set priorities to unearth problems. | Legacy system business SME, legacy system developers, ETL developers, product developers and testers |
| 4 | Technical war room analysis | High / Very High | Discuss and troubleshoot the top 50 records, then move iteratively to the next 50 records to unearth problems. Both existing issues and new issues may be identified. | Policy admin SME, policy admin business, developers and testers, legacy system business SME, legacy system developers, ETL developers, product developers and testers |
| 5 | Business war room analysis | Very High / Critical | This step kicks in when the remaining records are highly complex and IT alone can no longer identify problems. Involve business stakeholders to understand how the legacy system handled each policy/record for commission payout and identify problems with respect to the IBM Varicent ICM payout. | Life, annuity, and health business SMEs, finance SME, policy admin SME, policy admin business, developers and testers, legacy system business SME, legacy system developers, ETL developers, product developers and testers |
| 6 | Residuals | Showstopper | The business and IT teams are unable to progress on the residual records in comparison testing. The IBM Varicent ICM team should provide evidence that the IBM Varicent ICM system is working correctly for the records in the residual data set. | IBM Varicent ICM testers |
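Step 1 of the process, loading the legacy mainframe extract into SQL tables and comparing it against the IBM Varicent ICM output, can be sketched as below. This is a minimal, self-contained example using SQLite; the table names, columns, and sample rows are illustrative assumptions, not the project’s actual schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE legacy   (txn_id TEXT PRIMARY KEY, agent TEXT, payout REAL);
    CREATE TABLE varicent (txn_id TEXT PRIMARY KEY, agent TEXT, payout REAL);
""")
conn.executemany("INSERT INTO legacy VALUES (?, ?, ?)",
                 [("T1", "AGT001", 150.0), ("T2", "AGT001", -40.0)])
conn.executemany("INSERT INTO varicent VALUES (?, ?, ?)",
                 [("T1", "AGT001", 150.0), ("T2", "AGT001", -35.0),
                  ("T3", "AGT002", 75.0)])

# Full outer comparison emulated as a UNION of two LEFT JOINs
# (older SQLite versions lack FULL OUTER JOIN): catches records that
# differ in payout, exist only in legacy, or exist only in Varicent.
mismatches = conn.execute("""
    SELECT l.txn_id, l.payout AS legacy_payout, v.payout AS varicent_payout
      FROM legacy l LEFT JOIN varicent v ON l.txn_id = v.txn_id
     WHERE v.txn_id IS NULL OR l.payout <> v.payout
    UNION
    SELECT v.txn_id, l.payout, v.payout
      FROM varicent v LEFT JOIN legacy l ON v.txn_id = l.txn_id
     WHERE l.txn_id IS NULL
     ORDER BY 1
""").fetchall()
```

The output of this step (payout variances plus missing records on either side) becomes the record set handed to step 2.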
I will cover how to act on the residual records in the next blog.
Problem identification in Record Set
We have to scan each record to identify the problem/defect. When we identify a problem in a single record, a SQL query is executed to retrieve all records with the same problem across the complete data set. This continues at each step of the comparison testing process, but problem identification becomes a huge challenge in steps 5 and 6.
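A minimal sketch of that generalization step is shown below. The `comparison_results` table, the billing-mode-change pattern, and the sample rows are all hypothetical; the point is only that one diagnosed record yields a predicate that pulls every matching record from the full data set.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE comparison_results
                (txn_id TEXT, event TEXT, legacy_payout REAL,
                 varicent_payout REAL, variance REAL)""")
conn.executemany("INSERT INTO comparison_results VALUES (?, ?, ?, ?, ?)", [
    ("T1", "ISSUE", 150.0, 150.0, 0.0),
    ("T2", "BILLING_MODE_CHANGE", 0.0, 12.5, 12.5),   # the record we diagnosed
    ("T3", "BILLING_MODE_CHANGE", 0.0, 8.0, 8.0),     # same pattern, found by query
    ("T4", "CHARGEBACK", -40.0, -40.0, 0.0),
])

# Generalize the single finding into a data-set-wide pattern query:
# every record with the same event signature and a nonzero variance.
same_pattern = conn.execute("""
    SELECT txn_id, variance FROM comparison_results
     WHERE event = 'BILLING_MODE_CHANGE' AND variance <> 0
     ORDER BY txn_id
""").fetchall()
```

Each such query shrinks the outstanding record set, which is what moves records from one step of the process to the next.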
In the next blog I will go into the details of comparison testing: the variance calculation formula, dependencies on data selection, diagnostics runs, QA validation, regeneration of source system files, etc.