DQPro CEO and Co-Founder Nick Mair recently took part in an exciting live debate for the InsTech London webinar “Is your Data Deceiving you?”
Listen to the webinar here for the full debate, in which Nick Mair, Charles Joseph of Datazed and Amrit Santhirasenan of Hyperexponential bring their real-world experience and advice on overcoming the challenges, and reaping the benefits, of getting data fit.
Here’s a round-up of what Nick had to say:
Stop winging it
Getting “data fit” is the million-dollar question for a growing list of insurance companies grappling with digital transformation. Essentially, this is about having data you can trust. Instead, some specialty insurance companies are still winging it rather than showing genuine data confidence.
Good data is the lifeblood of efficient operations whether you’re a carrier, broker or MGA, from capital modelling and claims, to exposure management and pricing. Good data gives carriers a competitive edge. Anyone who tells you otherwise hasn’t grasped the reality of the way modern insurance business works.
Bad data drag
Bad data slows everything down, creating drag and increasing operating costs and compliance risk. A data-fit company has good data coming in, good data going out, and good data everywhere in between, which leads to better decision-making and ultimately a better bottom line.
Companies that are still winging it can lack a firm grasp of the data governance and monitoring hygiene needed to achieve their “data-led” digital transformation aims, to avoid pitfalls along the way, and to match the level of discipline they would customarily expect from their underwriters or any other key part of the business.
To be clear, working with old technology doesn’t scare us, although I have the grey hairs to testify to a history working in data warehousing. Thank goodness the conversation has shifted from ripping out legacy technology and building anew each time. New tech plugs into old, most notably through APIs, which is the direction in which Lloyd’s Blueprint One seems to be going, for example.
Data in the boardroom
Most boards now realise that good data is a valuable asset. That means a two-pronged approach to investing in digital transformation works best: focusing on core data infrastructure, including the controls to refine data quality, as much as on the new cloud-based insurtech software and platforms that rely on that data.
To build a case to convince a more sceptical board, it’s important not to “boil the ocean” by spreading efforts too thinly. Instead, start small, take incremental steps, and build a business case. Identify the “small data” problems that matter to the business and that erode confidence most.
Like any big problem, it needs to be broken down. Begin by focusing on the data that counts most, put in place the controls that count most, and then build those into a daily workflow so they become habitual.
What we do
DQPro provides those monitoring controls for specialist market carriers, MGAs and brokers, embedded so they feel like a natural part of the process. We’re making it easier to get on top of data and establish a base level of data hygiene, which delivers big improvements in downstream data efficiency and compliance.
We do this in three steps:
- Firstly, we plug into all the old tech and then run an extensive set of market- and company-specific rules to proactively identify issues and surface them to the right business teams at the right time – that’s about detection and visibility.
- Step two is about helping business users take ownership and accountability. Historically this has been hard to do. Detection is meaningless if there is no subsequent action. More than that, action should become the norm. We ensure this by integrating ourselves into the user’s daily workflow, so that it becomes an easy, natural process.
- Thirdly, we track all of this activity and provide the audit information for compliance and the regulator – whoever that might be.
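To make the three steps concrete, here is a minimal sketch in Python of how a rules-based data quality check might work in principle: named rules routed to an owning team (detection and ownership), with a timestamp recorded for each finding (audit trail). The record fields, rule names and team names are all illustrative assumptions, not DQPro’s actual implementation.

```python
from datetime import datetime, timezone

# Hypothetical policy records; field names are illustrative only
records = [
    {"policy_id": "P001", "premium": 1200.0, "inception": "2021-01-01"},
    {"policy_id": "P002", "premium": -50.0, "inception": "2021-02-15"},
    {"policy_id": "P003", "premium": 900.0, "inception": ""},
]

# Step 1: detection – each rule is a named check plus the team that owns it
RULES = [
    ("premium_non_negative", "underwriting", lambda r: r["premium"] >= 0),
    ("inception_present", "operations", lambda r: bool(r["inception"])),
]

def run_rules(records, rules):
    """Run every rule over every record and return the issues found,
    routed to the owning team with an audit timestamp."""
    issues = []
    for record in records:
        for name, owner, check in rules:
            if not check(record):
                issues.append({
                    "rule": name,
                    "team": owner,  # step 2: ownership and accountability
                    "policy_id": record["policy_id"],
                    # step 3: audit trail for compliance and the regulator
                    "detected_at": datetime.now(timezone.utc).isoformat(),
                })
    return issues

for issue in run_rules(records, RULES):
    print(issue["rule"], issue["policy_id"], issue["team"])
```

In practice the value comes less from the checks themselves than from surfacing each finding to the right team inside their daily workflow, which is what turns detection into action.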
Huge thanks to InsTech London for the opportunity to take part in this highly relevant discussion!