Australian financial services organisations could be losing tens of millions of dollars a year by creating and using poor-quality data for decision-making.

According to David Howard-Jones of management consultancy Oliver Wyman, poor data quality could cost companies up to a quarter of their operating profits.
Addressing an Institute of Actuaries of Australia conference this week, Howard-Jones said poor data could lead to costly, overly conservative decisions and inefficiencies within an enterprise.
“Data quality is a massive cost,” he said. “Additional conservatism is natural and necessary when you don't have quality data to calculate parameters.
“Overall estimates of the costs of poor data quality are 15 to 25 percent of operating profits for insurers and potentially even more for large banking groups.”
Oliver Wyman’s estimate was based on its recent, bottom-up assessment of business decisions, regulatory and reporting capital, operating costs, and operational risk capital.
In a 2008 survey of property reinsurers by Ernst & Young, 70 percent said they applied a 20 to 25 percent surcharge on premiums to account for poor data quality.
Thirty-six percent of Ernst & Young's respondents said they were willing to offer premium discounts of 10 percent for high-quality data.
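To put those figures in concrete terms, the following sketch (in Python, assuming a purely hypothetical $1,000 base premium; the survey did not publish worked examples) shows the spread between a poor-data surcharge and a quality-data discount:

```python
# Hypothetical worked example of the surveyed surcharge/discount figures.
base_premium = 1_000.00  # assumed base premium; not from the survey

surcharge_low = base_premium * 1.20   # 20% surcharge for poor data
surcharge_high = base_premium * 1.25  # 25% surcharge for poor data
discounted = base_premium * 0.90      # 10% discount for high-quality data

print(f"poor data:      ${surcharge_low:,.2f} to ${surcharge_high:,.2f}")
print(f"quality data:   ${discounted:,.2f}")
print(f"maximum spread: ${surcharge_high - discounted:,.2f} per policy")
```

On that assumed $1,000 premium, the gap between the two data-quality extremes reaches $350 per policy.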
The US Insurance Data Management Association separately estimated that bad data cost corporations between 15 and 20 percent of their operating revenue.
Mike Thornton, director of group risk management at AMP, attributed poor data quality to a combination of legacy IT systems and human data collectors.
“I think any large company that’s been around for any length of time is going to have legacy systems [in which] you don’t quite have the data you want,” he said.
Data warehouses that pooled information from various sources helped mitigate the risk of poor data, he said.
But “merging systems together is expensive as well”, he said.
Howard-Jones said there were software and service offerings for “fixing” poor data but urged conference attendees to address poor data at its root.
Decision-makers and risk professionals required data that offered accuracy, completeness, consistency, timeliness, longevity, validity, accessibility and integrity, he noted.
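Several of those dimensions can be screened automatically at the point of capture. The sketch below (Python, with entirely hypothetical field names; no specific insurer's schema is implied) checks completeness, validity and timeliness on a single record:

```python
from datetime import datetime, timezone

def check_record(record: dict) -> list[str]:
    """Return a list of data quality issues found in a captured record."""
    issues = []

    # Completeness: required fields must be present and non-empty.
    for field in ("customer_id", "annual_income", "postcode", "captured_at"):
        if record.get(field) in (None, ""):
            issues.append(f"missing field: {field}")

    # Validity: values must fall in plausible ranges, not merely be filled in.
    income = record.get("annual_income")
    if income is not None and not 0 < income < 10_000_000:
        issues.append(f"implausible annual_income: {income}")

    # Timeliness: stale records are flagged for re-verification.
    captured = record.get("captured_at")
    if captured and (datetime.now(timezone.utc) - captured).days > 365:
        issues.append("record captured more than a year ago")

    return issues

# An income of zero passes a naive presence check but fails the validity check.
print(check_record({
    "customer_id": "C-1042",
    "annual_income": 0,
    "postcode": "2000",
    "captured_at": datetime.now(timezone.utc),
}))
```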
Meanwhile, data creators – typically branch staff, mortgage brokers, or telephone sales staff – were incentivised on how quickly they completed their assigned tasks, he said.
“Process matters; you need to identify your major areas where data is substandard and go to the front line and deeply understand the context,” he said.
“You can’t have a culture where you’re requiring people to do a certain amount of underwriting, where you’re pushing the revenue story, unless you take account of how much extra time they need to capture the data you need to make the best use of your strategy.”
He highlighted the example of one company whose software forced staff to record certain data about customers before moving on to the next step in a loan application process.
“[Staff] quickly discovered that they could just type in a bunch of zeros, close the box and move on,” he said, adding that data creators may not see the business benefit in good data, as they were “so far down the chain of command”.
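Validating the content of a field, rather than merely its presence, makes that workaround harder. A minimal sketch of the idea (again in Python, with hypothetical field names; the company's actual system was not described in that detail):

```python
PLACEHOLDER_VALUES = {"n/a", "na", "unknown", "-", "tbc"}

def is_placeholder(value: str) -> bool:
    """Flag entries typed to get past the form rather than to record facts."""
    cleaned = value.strip().lower()
    return cleaned in PLACEHOLDER_VALUES or set(cleaned) == {"0"}

def validate_phone(raw: str) -> str:
    """Accept a phone number only if it looks like more than a placeholder."""
    digits = "".join(ch for ch in raw if ch.isdigit())
    if len(digits) < 8 or is_placeholder(digits):
        raise ValueError(f"phone number looks like a placeholder: {raw!r}")
    return digits

print(validate_phone("0412 345 678"))  # accepted: '0412345678'
try:
    validate_phone("0000 000 000")     # a "bunch of zeros" is now rejected
except ValueError as exc:
    print(exc)
```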
Organisations also struggled to allocate responsibility for data quality issues, as the benefits of better data were shared across the business and difficult to quantify.
“Quantifying the impact of that [bad data] is really pretty hard,” said AMP’s Thornton, “and the alternative model [of diligently collecting good data] is quite expensive as well.”
Additionally, Thornton noted that data users such as risk analysts were unlikely to be able to specify exactly which data fields they would require.
“Most actuaries are reasonably practical, so having an absence of data is something that can be dealt with,” he noted.
Thornton said businesses had to weigh up the costs and benefits of collecting data and prioritise their spending accordingly.
For those hoping to improve data quality, Howard-Jones said chief executive, chief financial and chief risk officers should share responsibility for establishing the right culture, creating transparency around the cost of bad data, and setting incentives for improvement.
“I don’t think there are many opportunities out there right now for financial services companies in this environment to find 10 to 20 percent [improvements] on their bottom line,” he said.
“I don’t think that it’s easy, but I think that if you invest in really finding the gap between people who create data and people who rely on data to make decisions … there is an enormous value opportunity.”