What keeps insurance COOs awake at night?

London Market insurers face significant fines or increased capital loadings for non-compliant data, not to mention the reputational impacts of getting it wrong. Nick Mair, co-founder and CEO of Atticus DQPro, says insurance businesses must rise to the challenge of managing their legacy systems in order to thrive.


It has been 20 years since the Lloyd’s insurance market faced an asbestos-related solvency crisis, and the capital restructuring since then has helped drive increased resilience. The market demonstrated that resilience last year, when it sustained losses of £2 billion (US$2.7 billion) whilst maintaining considerable underlying strength.

The market may no longer have a solvency issue, but it does face increasingly complex regulatory requirements. Regulations such as Solvency II and Lloyd’s own Minimum Standards add layers of compliance and reporting that require insurance companies to maintain reserves of compliant data as well as reserves of capital.

With data ownership and accountability now critical issues for all modern businesses, we can only expect even greater scrutiny of the way in which London Market data is moved, shared and managed. Insurance businesses already face the threat of significant fines or increased capital loadings for non-compliant data, not to mention the reputational impacts on individual businesses and the sector as a whole.

So what keeps the COOs of well-capitalised London Market insurance businesses awake at night these days? Almost certainly, catastrophic data-related business failure is high on the list: a data breach, a major data quality issue, or inaccurate reporting. Such scenarios arguably present a greater ongoing risk to an insurance company’s business and brand reputation than undercapitalisation.

Managing legacy systems and bringing them into regulatory compliance first involves recognising the challenge for what it is. Whilst we all aim for the frictionless, straight-through processing of the future, specialty carriers are spending far too much time and money manually re-keying data, dealing with data errors, and mitigating their downstream impact.

Data wrangling

The harsh reality is that most legacy systems simply aren’t fit to support the modern needs of regulatory reporting. Instead, a huge amount of effort is expended in “data wrangling” – gathering data from various legacy systems, jamming it together in Excel or a datamart (if you’re more advanced), manually adjusting for quality errors and submitting in time for the deadline to avoid penalties.
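To make that concrete, the manual adjustments described above can in principle be replaced with simple, repeatable rules run over the same extracts. The sketch below is illustrative only, not a description of any particular product: the file names, column names and rules are all hypothetical.

```python
import pandas as pd

# Hypothetical extracts from two legacy systems (file and column names are illustrative only).
policies = pd.read_csv("policy_admin_extract.csv")   # policy_ref, inception_date, currency, gross_premium
claims = pd.read_csv("claims_system_extract.csv")    # policy_ref, claim_ref, paid_amount

# Join the extracts instead of re-keying them into a spreadsheet.
merged = policies.merge(claims, on="policy_ref", how="left")

# Rule 1: mandatory fields must be populated.
missing_premium = merged[merged["gross_premium"].isna()]

# Rule 2: currency codes must be recognised values.
valid_ccy = {"GBP", "USD", "EUR", "CAD", "SGD", "BRL"}
bad_currency = merged[~merged["currency"].isin(valid_ccy)]

# Rule 3: claims should never reference a policy the admin system doesn't know about.
orphan_claims = claims[~claims["policy_ref"].isin(policies["policy_ref"])]

# Report the exceptions rather than silently "adjusting" them before a reporting deadline.
for name, frame in [("missing premium", missing_premium),
                    ("invalid currency", bad_currency),
                    ("orphan claims", orphan_claims)]:
    print(f"{name}: {len(frame)} record(s) to investigate")
```

The point is not the tooling but the shift: exceptions are surfaced and owned, rather than quietly patched in a spreadsheet.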

Re-keying data – a “Londonism” that won’t go away quickly

Whilst initiatives such as Lloyd’s SDC help to reduce the friction of data entry, here are three reasons why much of the market will re-key risk data into their legacy systems for a good while yet:

  1. The nature of the legacy systems: typically evolved over 10-15 years, legacy systems are proving slow to provide the necessary staging and validation required to support the latest electronic messages. And removing the soft evaluation performed by a human operator makes it even more essential to have automated, second-line validation.
  2. Size of data set: the London Market is complex compared to personal lines – the datasets required for specialty risks are often three to five times the size and variability of retail lines such as motor and home. This complexity increases the potential for human error at source and requires more sophisticated validation.
  3. A distributed market: international business is channelled to London from legacy source systems at brokers and coverholders around the world, in all its glory and in varying formats, good and bad. Better data standards help, but carrier-specific validation is still required if insurers are to meet their own needs (a minimal sketch of such a check follows this list).
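As promised above, here is a minimal sketch of what carrier-specific, second-line validation of an incoming risk record might look like once the record has been loaded from a broker or coverholder feed. The field names, thresholds and rules are invented for illustration; a real rule set would reflect each carrier’s own underwriting and reporting needs.

```python
from datetime import date

# A hypothetical risk record as it might arrive from a coverholder or message feed.
risk = {
    "risk_code": "XY",
    "inception_date": date(2018, 7, 1),
    "expiry_date": date(2019, 6, 30),
    "sum_insured": 2_500_000,
    "settlement_currency": "USD",
    "country": "BR",
}

# Carrier-specific rules: each returns an error message, or None if the record passes.
RULES = [
    lambda r: None if r["expiry_date"] > r["inception_date"]
              else "expiry date must fall after inception date",
    lambda r: None if r["sum_insured"] > 0
              else "sum insured must be positive",
    lambda r: None if r["settlement_currency"] in {"GBP", "USD", "EUR"}
              else f"unexpected settlement currency {r['settlement_currency']}",
    lambda r: None if r["risk_code"].isalpha() and len(r["risk_code"]) == 2
              else "risk code must be a two-letter code",
]

errors = [msg for rule in RULES if (msg := rule(risk)) is not None]
print("passed second-line validation" if not errors else errors)
```

Because these rules run automatically after load, they replace the “soft” sense-checking a human operator would otherwise perform while re-keying.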

A scalable solution for legacy systems

Our approach with DQPro has been to address all of these issues with one easy-to-use solution for specialty carriers on a global scale. We’ve developed new technology that works over old market systems to apply the checks and controls carriers need on their data. And we do this daily, at scale.

Our largest UK customer is a global carrier that uses DQPro for multiple use cases across multiple business teams: to actively manage the data entry performed by an outsourced provider in India, to check the quality and standard of SDC messages, to check FX rate consistency across multiple systems in Finance, and many more.
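An FX rate consistency check of the kind mentioned above could, in principle, be as simple as the sketch below: compare the rate each system holds for the same currency pair and date, and flag any divergence beyond a tolerance. The system names, rates and tolerance are purely illustrative and do not describe DQPro’s internals.

```python
# Hypothetical month-end GBP/USD rates held in different systems for the same date.
rates = {
    "policy_admin": 1.3115,
    "claims": 1.3115,
    "finance_ledger": 1.2980,
}

TOLERANCE = 0.005  # maximum acceptable divergence from the reference rate

reference_system = "finance_ledger"
reference_rate = rates[reference_system]

for system, rate in rates.items():
    if system == reference_system:
        continue
    drift = abs(rate - reference_rate)
    if drift > TOLERANCE:
        print(f"{system}: rate {rate} differs from {reference_system} "
              f"({reference_rate}) by {drift:.4f} - investigate before reporting")
```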

Meanwhile, our largest US carrier uses DQPro to actively ensure that the standards for data captured and sent from branches in countries such as Brazil, Singapore and Canada are met. This way, all their checks are deployed and managed centrally by software that actively engages business users, increasing visibility, accountability and, ultimately, overall quality.

The age of data scrutiny

Earlier this year, Lloyd’s Performance Management Director, Jon Hancock, asserted that more would be done to reduce red tape for carriers, with reporting requirements cut back and a focus on the use of datasets and benchmarks to identify anomalies in classes of business.

This fresh thinking is welcome – certainly, a more risk-based approach could help reduce the burden. But overall it’s fair to say the age of data scrutiny has arrived: data quality matters, and goes hand in hand with the ability to evidence compliance with regulations.

So what’s the answer? London Market businesses require basic, underlying confidence in, and visibility of, their core data in order to thrive. It’s fundamental to everything. We cannot replace all legacy systems overnight, but new technology offers the potential to solve the problem efficiently, outside legacy constraints and on a global scale.