Atticus DQPro CEO Nick Mair exposes the hidden costs of dealing with data issues and asks why so many traditional data quality initiatives fail to make a difference.
From Facebook founder Mark Zuckerberg’s historic US Senate grilling to the implications of the General Data Protection Regulation coming into force this month, data ownership and accountability are part of the zeitgeist as critical issues for all modern businesses.
In our industry, getting data right is not only a reputational issue for insurance; it also has far-reaching implications for carrier cost and efficiency. In fact, lack of internal data ownership is already costing London Market carriers millions and significantly hampering efforts to streamline market processes.
Manual data entry and duplication
On a daily basis, multiple underwriting operations teams are keying and rekeying data into legacy systems and using a ‘people and spreadsheets’ approach to fixing quality issues, leaving significant scope for errors to creep in. It is expensive, and no one would call it efficient. Data quality issues are like pollution in the headwaters of a river system, affecting multiple areas downstream. A small exchange rate error can feed into multiple calculations, forecasts and reports before it is discovered, by which point the damage is already done.
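To make the headwaters analogy concrete, a simple automated check at the point of entry can catch a rekeying slip before it pollutes anything downstream. The sketch below is illustrative only: the field names and the plausible-rate bounds are assumptions for the example, not market data or any particular product’s rules.

```python
# Minimal sketch: a range check that flags a suspect exchange rate
# before it feeds downstream calculations. Field names and bounds
# are illustrative assumptions.

def validate_fx_rate(record, bounds):
    """Return a list of issues found in a single booking record."""
    issues = []
    pair = record.get("currency_pair")
    rate = record.get("fx_rate")
    if pair not in bounds:
        issues.append(f"unknown currency pair: {pair!r}")
    elif rate is None or not (bounds[pair][0] <= rate <= bounds[pair][1]):
        issues.append(f"fx_rate {rate} outside plausible range for {pair}")
    return issues

# Hypothetical plausible ranges per currency pair.
PLAUSIBLE = {"GBP/USD": (1.0, 2.0), "EUR/USD": (0.8, 1.6)}

# A rekeying slip: 1.35 entered as 13.5.
record = {"currency_pair": "GBP/USD", "fx_rate": 13.5}
print(validate_fx_rate(record, PLAUSIBLE))
# → ['fx_rate 13.5 outside plausible range for GBP/USD']
```

Caught at entry, the fix costs one correction; caught after it has flowed into forecasts and reports, it costs many.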
Whilst most of us are aware of the myriad of data issues, the problem has often been viewed as too complex to solve and so it has become normalised as part of doing business in the market. The net result? A huge hidden, avoidable daily cost in dealing with data issues.
Data errors – a US$3 trillion issue
According to the Harvard Business Review, just a 1% error rate can double the cost of a transaction. Staff in data-related positions typically spend 50% of their time correcting data issues, while data scientists typically spend 60% of their time cleaning and preparing data. The estimated cost to the US economy in 2016 of all this data inefficiency? A mere US$3 trillion!
But despite the obvious waste and costs, many traditional data quality initiatives aimed at reducing the impact of these issues ultimately fail. Why is this?
Making it easy for business-side users
The primary reason is that IT-led data quality initiatives nibble around the edges, often addressing the easy options but failing to drive real ownership and accountability with front office business users. So how best to engage them and drive change? The answer is twofold:
- Make it simple: embed quick and easy-to-use tech into their workflow.
- Make it visible: track data issue resolutions against company-set Service Level Agreements.
When business users feel empowered to own and improve the data for which they are accountable, and are given the tools to manage it visibly and efficiently, they will do so. That means less work and less cost downstream in the back office. New tech allows this to happen in the simplest of ways, with happy users on both sides of the office.
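The "make it visible" half of the answer can be sketched in a few lines: each data issue carries a raised and resolved timestamp, and is checked against an SLA window. The 48-hour target and the issue fields below are illustrative assumptions, not a product specification.

```python
# Minimal sketch: tracking a data issue's resolution time against a
# company-set SLA. The 48-hour target and the DataIssue fields are
# illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

SLA = timedelta(hours=48)  # assumed company-set resolution target

@dataclass
class DataIssue:
    description: str
    raised_at: datetime
    resolved_at: Optional[datetime] = None

    def breaches_sla(self, now: datetime) -> bool:
        """True if the issue is (or was) open longer than the SLA window."""
        end = self.resolved_at or now
        return end - self.raised_at > SLA

issue = DataIssue("FX rate keyed as 13.5 instead of 1.35",
                  raised_at=datetime(2018, 5, 1, 9, 0))
print(issue.breaches_sla(now=datetime(2018, 5, 4, 9, 0)))  # open 72h → True
```

Surfacing a simple breach flag like this per owner is what turns data quality from an invisible back-office chore into a visible, accountable measure for the business side.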
Issues surrounding data ownership and accountability are significantly impacting businesses and making headlines around the world. Carriers need to make 2018 the year they put effective data governance at the heart of their insurance business.