A life insurance company that was to be sold, including a portfolio of more than 300,000 policies, faced a potentially deal-breaking problem. Certain legally relevant clauses had come to light, for which it was unclear (a) how many policies were impacted, (b) exactly what wording the policies contained, and (c) to what extent such clauses could open the door to legal claims.
The insurance company commissioned Deloitte to analyze the policies and quantify the associated risk to the portfolio. In addition, Deloitte was tasked with investigating the reasons why the various clauses had not been discovered earlier.
It quickly became clear that conventional tools – such as statistical sampling and traditional analysis software – would be unable to provide satisfactory answers to these questions. The team faced a variety of obstacles: the exact wording of the clauses in the policy population was unknown and could not be derived directly from policy number or policy structure.
The insurer needed precise answers, fast – forcing an uncomfortable compromise between speed (statistical sampling) and accuracy (exhaustive analysis), not to mention the cost of performing such a review. Neither approach was appealing.
To solve this dilemma, Deloitte proposed applying Natural Language Processing (NLP), a form of artificial intelligence, building on tools already developed at Deloitte’s AI technology incubator, the aiStudio. Using NLP, the Deloitte team could analyze every policy in record time – a largely automated process that required only some data preparation and customization of code. Using the aiStudio asset as a springboard, Deloitte quickly developed a task-specific tool that read all the contracts, extracted key features, and structured them into a database for subsequent analysis and quantification.
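The aiStudio tool itself is proprietary and its internals are not described here. As a purely illustrative sketch of the general idea of reading contracts, detecting clause wordings, and structuring the hits into records, consider the following Python fragment. The directory layout, clause patterns, and field names are assumptions for illustration; simple regular expressions stand in for the actual NLP clause-detection component.

```python
# Illustrative sketch only (not the aiStudio tool): scan plain-text policy files
# for candidate clause wordings and collect the hits as structured records.
import re
from dataclasses import dataclass, asdict
from pathlib import Path

# Hypothetical patterns standing in for the NLP clause-detection component.
CLAUSE_PATTERNS = {
    "termination": re.compile(r"right\s+to\s+terminate.{0,200}", re.IGNORECASE | re.DOTALL),
    "surrender_value": re.compile(r"surrender\s+value.{0,200}", re.IGNORECASE | re.DOTALL),
}

@dataclass
class ClauseHit:
    policy_file: str
    clause_type: str
    wording: str

def extract_clauses(policy_dir: str) -> list[ClauseHit]:
    """Read every policy file and return one record per detected clause."""
    hits: list[ClauseHit] = []
    for path in Path(policy_dir).glob("*.txt"):
        text = path.read_text(encoding="utf-8", errors="ignore")
        for clause_type, pattern in CLAUSE_PATTERNS.items():
            for match in pattern.finditer(text):
                hits.append(ClauseHit(path.name, clause_type, match.group(0).strip()))
    return hits

if __name__ == "__main__":
    for hit in extract_clauses("policies"):  # assumed directory of plain-text policies
        print(asdict(hit))
```

In practice, the extracted records would be written to a database rather than printed, as outlined in the following paragraph.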
The analysis was performed over massive volumes – double-digit terabytes – of data. A forensic-grade data transfer platform was used to ensure secure, high-performance data transfer. After data integrity checks and some pre-processing, the algorithm produced structured data in the form of an SQL database.
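The specific transfer platform and database product are not named in the case description. A minimal sketch of the two post-transfer steps mentioned above, checksum-based integrity verification and loading of extracted records into a SQL table, might look as follows; SQLite stands in for the production database, and the table and column names are assumptions.

```python
# Hedged sketch: verify transferred files against a checksum manifest, then
# persist extracted clause records into a SQL table for downstream analysis.
import hashlib
import sqlite3
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute a checksum so a transferred file can be compared against the sender's manifest."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def find_corrupted(data_dir: str, manifest: dict[str, str]) -> list[str]:
    """Return the files whose checksum does not match the manifest entry."""
    return [name for name, expected in manifest.items()
            if sha256_of(Path(data_dir) / name) != expected]

def write_results(db_path: str, records: list[tuple[str, str, str]]) -> None:
    """Store (policy_file, clause_type, wording) rows in an SQL database."""
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS clause_hits (
                       policy_file TEXT, clause_type TEXT, wording TEXT)""")
    con.executemany("INSERT INTO clause_hits VALUES (?, ?, ?)", records)
    con.commit()
    con.close()
```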
Rigorous processes for quality assurance and validation of the results followed. The chosen NLP method proved highly robust to variations in the contracts, achieving near 100% accuracy for all extracted quantities.
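The validation procedure actually used is not detailed in the case description. One common approach, shown here only as an assumption-laden sketch, is to compare the automated extraction against a manually reviewed sample and report accuracy per clause type.

```python
# Illustrative validation step: compare automated clause labels against a manually
# reviewed "gold" sample and compute per-clause-type accuracy. The record format
# and the existence of such a manual sample are assumptions.
from collections import defaultdict

def accuracy_by_clause_type(gold: dict[str, str], predicted: dict[str, str]) -> dict[str, float]:
    """gold and predicted both map policy_file -> clause_type for the reviewed sample."""
    correct: dict[str, int] = defaultdict(int)
    total: dict[str, int] = defaultdict(int)
    for policy_file, true_type in gold.items():
        total[true_type] += 1
        if predicted.get(policy_file) == true_type:
            correct[true_type] += 1
    return {t: correct[t] / total[t] for t in total}

# Toy usage example:
gold = {"p1.txt": "termination", "p2.txt": "surrender_value", "p3.txt": "termination"}
pred = {"p1.txt": "termination", "p2.txt": "surrender_value", "p3.txt": "surrender_value"}
print(accuracy_by_clause_type(gold, pred))  # {'termination': 0.5, 'surrender_value': 1.0}
```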
The database contained all the fields required for the subsequent analysis – including joins with other databases – to quantify the monetary risk associated with each type of policy clause. The final report included, among other things, a highly aggregated table containing all the information needed for further decision-making.
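The actual risk quantification model is not public. As a rough illustration of the kind of aggregation described above, the extracted clause table could be joined with a policy master table and summarized per clause type; the table and column names below (policy_master, sum_insured) are hypothetical.

```python
# Illustrative aggregation only: join extracted clause hits with a hypothetical
# policy master table and summarize exposure per clause type.
import sqlite3

QUERY = """
SELECT c.clause_type,
       COUNT(DISTINCT c.policy_file) AS affected_policies,
       SUM(p.sum_insured)            AS total_exposure
FROM clause_hits c
JOIN policy_master p ON p.policy_file = c.policy_file
GROUP BY c.clause_type
ORDER BY total_exposure DESC;
"""

def summarize(db_path: str) -> list[tuple]:
    """Return one row per clause type: (clause_type, affected_policies, total_exposure)."""
    con = sqlite3.connect(db_path)
    rows = con.execute(QUERY).fetchall()
    con.close()
    return rows
```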
The table contained the following information per category of relevant contractual clauses:
In parallel with the quantitative policy analysis described above, a process review was conducted to determine why the policy risks had not been identified earlier. Deloitte reviewed a considerable volume of documents and conducted targeted interviews to shed light on process breakdowns and suggest improvements for the future.
The two project streams complemented each other, represented schematically as follows:
The adaptation of an insurance company’s business or operating model – for example, by selling the company together with a portfolio of policies – requires a high degree of transparency and legal compliance.
Often, as in the case described above, these targets are difficult to achieve due to limitations posed by the quality and completeness of historical data. Ingested data may not have been subject to current quality standards, changes may not have been tracked, databases may not have been sufficiently documented, and the expertise needed in this regard has often long since left the company. Such problems are not unique to any one company.
The upshot: intelligently handling problematic historical data is a key competency for the long-term success of an insurance company – or, for that matter, for any company that increasingly relies on data to remain competitive and compliant.
The Deloitte approach described above was instrumental in providing a clear view of the historical data.
Use cases extend far beyond the urgency of a due diligence review. Tools and solution approaches developed by the Deloitte aiStudio can solve many data analytics problems that were previously very difficult or costly to resolve. Some examples:
Natural Language Processing, as implemented by the Deloitte aiStudio, is a versatile, flexibly adaptable technology that has passed the test with flying colors in a highly successful project deployment.