ANZ Banking Group is midway through a major reset of how it creates and manages its extensive data holdings in a bid to better understand the value it might derive from them.
The program of work was outlined in detail across several sessions at IBM’s THINK 2019 conference in San Francisco by Thomas Lucey, a self-described “information strategist” consulting to ANZ.
Lucey said the work was partly born of a desire by ANZ to better understand what master data management (MDM) was and could do.
“[ANZ] had already invested quite a lot of money into it and didn’t really fully understand the full value of it,” Lucey said.
“We had a lot of simplification to do with our landscape.
“We had about four MDMs which are actually competing against each other and producing different outcomes.
“We wanted to transform this into a single MDM suite where we brought diverse [data] sources together and produced a single outcome.”
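Collapsing several competing MDMs into one that produces “a single outcome” is, at its core, a record-matching and survivorship problem. As a loose illustration (the field names, match key and survivorship rules below are assumptions for the sketch, not ANZ’s actual logic), a minimal version looks like this: group records from diverse sources by a normalised match key, then build one “golden record” per customer, with the most recently updated non-empty value winning each field.

```python
from collections import defaultdict

def normalise_key(record):
    # Illustrative match key: lower-cased name plus date of birth.
    return (record["name"].strip().lower(), record["dob"])

def merge_records(sources):
    """Group records from several source systems by match key and build
    one 'golden' record per customer. Survivorship rule (illustrative):
    newest record wins field-by-field, falling back to older values."""
    groups = defaultdict(list)
    for record in sources:
        groups[normalise_key(record)].append(record)

    golden = []
    for records in groups.values():
        records.sort(key=lambda r: r["updated"], reverse=True)
        merged = {}
        for record in records:
            for field, value in record.items():
                # Only fill a field if a newer record hasn't already set it.
                if field not in merged and value not in ("", None):
                    merged[field] = value
        golden.append(merged)
    return golden
```

With this scheme, a customer captured as “Jane Doe” in one system and “jane doe ” in another resolves to a single record, and a phone number missing from the newer source survives from the older one.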
Getting a handle on data was seen as a key foundational element for success in a range of domains, from digital transformation through to customer experience and analytics.
“The number one priority is getting data quality to a level where it will drive business value,” Lucey said.
“If you look at how traditional banking is done, banking has been [about going] to somebody to get a financial transaction carried out.
“Digital transformation means the customer is going to do that themselves. They’re going to be in charge of their own destiny, they’re going to have control over it, and they will be empowered to make a lot more decisions. The bank then will have to follow what the customer is looking for.
“I think for ANZ [digital transformation] has gotten real in the last year or so where ANZ went outside looking for people who are not traditionally bankers and brought that talent in at a C-level.
“They’ve brought the need for data to be front and centre of the digital strategy to the board, to the point where the board are very sensitive to it.
“We’re now looking at data being at the centre point of all of this, and if we can’t get the data journey right, we will not be able to get the digital journey right.”
Where things stand
ANZ is currently one and a half years into the MDM overhaul and Lucey said the institution has “probably got another year and a half to go.”
While some time has already been spent on “configuring and standing up the [single MDM] asset” - the bank engaged IBM Lab Services to aid on that front - Lucey said a large chunk of the work, about 6-8 months, went on “getting the business engagement” for the program to proceed.
“We have to satisfy a lot of players - leadership, business owners, data stewards, brokers, bankers and customers, all requiring different pieces of data to be high quality and on time. They all have different demands for data and different agendas,” Lucey said.
“The problem we were trying to resolve at ANZ was get better at creating, [ingesting], connecting, maintaining and also governing customer data.
“That meant we needed to get better at how we capture the data, have some controls around doing it correctly, and remove the margin of error at the point of capture.
“At the same time, [we needed to] look at once we have captured [the data] and at what has already been captured in legacy systems, and how do we clean that up and make that available for the frontline challenge to be able to reuse it.”
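Lucey did not detail what those capture controls look like, but the idea of “removing the margin of error at the point of capture” is commonly implemented as validation that rejects or corrects bad records before they reach any downstream system. A minimal sketch, assuming illustrative field names and rules (not ANZ’s actual controls):

```python
import re

# Illustrative capture-time rules; fields and patterns are assumptions
# for this sketch, not the bank's actual data standards.
RULES = {
    "name":  lambda v: bool(v and v.strip()),
    "email": lambda v: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "") is not None,
    "dob":   lambda v: re.fullmatch(r"\d{4}-\d{2}-\d{2}", v or "") is not None,
}

def validate_at_capture(record):
    """Return the fields that fail validation, so a bad record can be
    fixed at the point of entry rather than cleansed downstream."""
    return [field for field, check in RULES.items()
            if not check(record.get(field))]
```

A record that passes returns an empty list and can flow on; anything else is bounced back to the capturing channel, which is far cheaper than cleansing the same error later in a legacy system.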
This involves significant cultural change: getting teams and squads within the bank to think about the data they create from the outset, and to make sure it is clean and formatted in a way that others can reuse.
Aided by AI
Lucey said he hoped artificial intelligence and machine learning technologies might assist with the data cleansing.
In ANZ’s target information architecture, (hopefully clean) data is ingested into a data lake before it flows out into what Lucey calls a “data river” for reuse.
“We look at data from a lake and a river perspective where the lake is the legacy data that’s already in our systems, and then the river is the ability to reuse that data to be able to serve the customer,” he said.
“We’re finding that the data lake needs to be cleansed and there’s a lot of data there, so we’re using tools like master data management, data governance and lots of people to go and cleanse it. But it’s a lot of work.
“We’re looking at AI and machine learning to be able to optimise that process.”
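Lucey did not say which models ANZ is evaluating. As a loose illustration of the principle - letting an automated screen triage which records need human attention - even a simple statistical check can flag anomalous values as cleansing candidates. The sketch below (a z-score screen; the threshold and data are purely illustrative stand-ins for a real ML model) shows the shape of the idea:

```python
import statistics

def flag_outliers(values, threshold=2.0):
    """Return indices of values far from the mean (z-score screen).
    A stand-in for the kind of anomaly detection an ML model would do
    at scale over a data lake; purely illustrative."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # No variation, nothing to flag.
    return [i for i, v in enumerate(values)
            if abs(v - mean) / stdev > threshold]
```

Records the screen flags go to data stewards for review; everything else flows through untouched, which is where the “lots of people” cost of cleansing can be cut down.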
Lucey said he also saw potential for AI/ML to be used closer to where data is being created.
“What we’re trying to do is encourage the projects and the squads to adopt AI to look into the source systems that they want to embark on cleansing - to understand what the risks are and to put some mitigations in place prior to committing to take on that data and [potentially] putting stress and strain on other systems downstream that are designed to actually master the data,” Lucey said.
He likened the influence of AI/ML in this case to smoothing out data as it is loaded into the lake.
“It’s so you’ve got a clear method of taking the data in and no rocks and lumps on the conveyor,” Lucey said.
Though the payoffs of having good, clean data were relatively clear, Lucey repeatedly raised the complexity of the program of work designed to get to that point.
“It’s so complicated,” he said.
“It’s like going to a beach and saying you’ll reorganise all the sand - where do you start? What do you prioritise? What should it look like when you’re finished with it? And would people realise that you actually did something with it?”
Using a slightly different sand analogy, Lucey drove home the purpose of the program of work.
“The easiest way to explain it is data is like sand for the mortar for the house you want to build.
“If the sand is contaminated with other elements like soil, you’ll make bad mortar and the building will fall down.
“We’re really looking at the basics. We have to get it right.”
Ry Crozier attended IBM THINK 2019 in San Francisco as a guest of IBM.