How CBA unlocked 90 percent of its customer and transaction data


Payoffs from SAP modernisation laid out.

Commonwealth Bank’s SAP-based core modernisation last year has unlocked about 90 percent of all its customer, account and transactional data, which it now hopes to harness for deep personalisation and behavioural banking.

The bank completed its move in October last year from an on-premises SAP R/3 core, running on IBM DB2 databases and mainframe infrastructure, to an SAP S/4 core that uses SAP’s in-memory HANA database and is hosted on AWS.

AWS, Red Hat, SAP, SAP Fioneer, and Accenture were all involved in the large-scale project.

At the time, the bank said the move would “support our ambitions in AI, data and digital services”.

At AWS re:Invent in Las Vegas at the end of last year, the bank elaborated on what this will mean for its operations and for customer experience.

General manager of core banking Simon Davies told the summit that “almost one in two times money changes hands in the economy, it goes through [CBA’s] SAP system.”

The core banking system is “not just [the bank’s] biggest system of record, it's about 90 percent of all of our customer, account and transactional data,” he said.

“This was certainly our most critical system of record, but increasingly, we wanted to turn it into a system of intelligence.

“We wanted to do more things with deep personalisation and behavioural banking – and that requires grunt.”

The move from mainframe to cloud services hosted on commodity x86 servers drove a 30 percent reduction in infrastructure costs.

SAP on AWS also resulted in a 30 percent performance improvement compared to the previous core.

“Web services obviously improved their response times, and [that] manifests most readily in things like balance updates happening a lot faster than they used to,” Davies said.

“Previously, someone would make a payment and then they'd have to refresh the screen to see their balance update. Now that happens straight away. 

“It also enables sort of more inline processing, things like fraud checks etc can happen in real time. But also batch workloads happen much more quickly and [that] allows us to do things like customer-based pricing, which requires millions and millions of recalculations every day to determine what is the best price for a given customer segment.”

Customer or relationship-based pricing uses data analytics to tailor fees and interest rates to specific customer cohorts.
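As an illustration of the kind of per-segment recalculation described above: the article does not disclose CBA’s pricing model, so the segments, rates and rules below are entirely hypothetical, sketched only to show why repricing every customer daily is a large batch workload.

```python
from dataclasses import dataclass

# Hypothetical relationship-based pricing sketch. The segment names,
# base rate and margins below are invented for illustration; the article
# only says millions of recalculations run daily to price customer cohorts.

@dataclass
class Customer:
    segment: str        # e.g. "new", "loyal", "premium" (assumed labels)
    balance: float      # total relationship balance with the bank

BASE_RATE = 0.0550      # assumed headline deposit rate (5.50%)

# Assumed per-segment adjustments, in decimal rate terms
SEGMENT_MARGIN = {"new": 0.0000, "loyal": 0.0010, "premium": 0.0025}

def price_for(customer: Customer) -> float:
    """Recalculate the rate offered to one customer.

    A real batch run would apply a calculation like this across millions
    of accounts; here it is a single pure function for clarity."""
    rate = BASE_RATE + SEGMENT_MARGIN.get(customer.segment, 0.0)
    # Assumed rule: large balances earn a small additional loyalty margin.
    if customer.balance >= 100_000:
        rate += 0.0005
    return round(rate, 4)

customers = [
    Customer("new", 5_000),
    Customer("loyal", 150_000),
    Customer("premium", 80_000),
]
rates = [price_for(c) for c in customers]
print(rates)  # prints [0.055, 0.0565, 0.0575]
```

The point of the faster batch window is that a pure function like this can be rerun over the whole book every day, rather than pricing being fixed per product.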

More than that, the transformation of the core into a “system of intelligence” is being realised, with data now able to be funnelled into data pipelines, analytics and AI use cases much more quickly.

Davies said that the bank used the transformation to change operational structures around the core banking system, in addition to the system itself.

“It was an opportunity for us to be really honest about the state of our technology stack, which we'd lived with for about a decade-and-a-half,” he said.

“Our operations were quite siloed. We had eight different teams that ran the stack, top to bottom. 

“They all spoke different and obscure languages, [and] it was really difficult to coordinate between them or get anything done.”

This also impacted access to data produced by and stored in the core.

“Historically, there was high friction with getting access to that data,” he said.

“That was a combination of a locked-down ecosystem, but also some performance constraints on being able to push data out en masse.”

Within the current landscape, and with the way AI is impacting banking operations, CBA needed to find a way to de-bottleneck data access – and this has emerged as a key benefit of the core transformation.

“You really need to think about it as the richest source of customer behavioural insights that we have in the bank,” Davies said.

“We often hear it said that data is the fuel for AI. There is no higher octane fuel than what we have here, so really being able to exploit that is important. 

“With the move that we've made here, we can actually ‘event’ everything that comes out of this system in real-time, [meaning we have] real live signals coming out of this system. We’ve tested up to about 5000 transactions per second. 

“This means that we can get that customer data to our data scientists [and] to other engineering teams who are building new agentic [AI] solutions a lot faster than we were ever able to before.”
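The “eventing” pattern Davies describes can be sketched in miniature. The article does not name the streaming technology CBA uses, so the in-memory queue below merely stands in for a real event bus: a producer emits transaction events as they occur, and a downstream consumer (a data science or agentic AI team) reads them off the stream.

```python
import json
import queue
import threading

# Illustrative sketch only: CBA "events" core-banking records in real time
# (tested to ~5000 transactions/second per the article), but the actual
# event bus is not named. A thread-safe queue stands in for it here.

events: queue.Queue = queue.Queue()
consumed = []

def producer(n: int) -> None:
    # Emit n synthetic transaction events as JSON, as a core system might.
    for i in range(n):
        events.put(json.dumps({"txn_id": i, "amount": 10.0, "type": "payment"}))
    events.put(None)  # sentinel: stream complete

def consumer() -> None:
    # A downstream consumer deserialises each event as it arrives,
    # rather than waiting for an overnight batch extract.
    while True:
        msg = events.get()
        if msg is None:
            break
        consumed.append(json.loads(msg))

t = threading.Thread(target=consumer)
t.start()
producer(5000)   # mirror the tested throughput figure in event count
t.join()
print(len(consumed))  # prints 5000
```

The contrast with the old model is the key design point: consumers subscribe to live signals instead of requesting extracts from a locked-down system of record.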

This is delivering data not only to Amazon SageMaker, used for data science and machine learning, but increasingly to Amazon Bedrock, used for generative AI, and to Bedrock AgentCore, for building agentic AI systems.

“We’re imagining what the next decade of agentic banking might mean, and experimenting with Bedrock and AgentCore around what capabilities that will give us to deliver more hyper-personalised behavioural banking [and] intelligent experiences to our customers,” Davies said.

Resilience improvements

Davies also highlighted improved resilience as a key benefit of the core upgrade, reflected in the rehearsed time needed for a site swap and failover in the event of a significant disruption.

“Previously, if we were to do a site swap, the RTO [recovery time objective] was about an hour-and-a-half,” he said.

“We used to rehearse this every year, and it'd be a bit of a manufactured exercise. Everyone would get at their keyboards, get warmed up, get their coffee, their run sheet, and in that manner, we could kind of choreograph the entire thing and execute it in about 30 minutes. It's pretty good, but it's a bit artificial. 

“When we moved to S/4, out-of-the-box we were seeing about a 16-minute failover - still pretty good, but banking is a very downtime sensitive environment, and we wanted to optimise that a little bit more.”

Davies said that the bank engaged in “deep engineering” with SAP, Red Hat and AWS to reduce the recovery time objective even further.

This resulted in a “bunch of improvements at the Red Hat kernel level to Pacemaker”, a high-availability cluster resource manager, and an “enhancement to [AWS] EC2”, the latter of which went on to be made available to all AWS customers.

“There was a new feature introduced into EC2 as a result, which I'm really proud of, that all SAP customers can leverage from here,” Davies said.

“It's awesome to see some of the work that we've done to try to push the boundaries make its way back into the main code line for these software vendors and for AWS, and we can see that all customers in future will be able to benefit from some of this innovation.”
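To make the Pacemaker reference concrete: a cluster resource manager continuously health-checks the node holding a resource and promotes a standby when checks fail. The toy model below is not Pacemaker’s actual algorithm (nor CBA’s configuration, and the site names are invented); it only illustrates the failover decision that such tooling automates within the quoted recovery times.

```python
# Toy model of what a cluster resource manager such as Pacemaker automates:
# monitor the primary node, and promote a standby when its health check fails.
# Node names are hypothetical; this is not Pacemaker's real algorithm.

class Node:
    def __init__(self, name: str, healthy: bool = True):
        self.name = name
        self.healthy = healthy
        self.role = "standby"

def failover(primary: Node, standby: Node) -> Node:
    """Return the node that should hold the primary role."""
    if primary.healthy:
        primary.role = "primary"
        return primary
    # Primary failed its health check: demote it and promote the standby.
    primary.role = "failed"
    standby.role = "primary"
    return standby

a, b = Node("site-a"), Node("site-b")
a.role = "primary"
a.healthy = False            # simulate a site outage
active = failover(a, b)
print(active.name, active.role)  # prints: site-b primary
```

In production the hard part is not this decision but detecting failure quickly and safely, which is where the kernel-level and EC2 work described above would matter.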

Copyright © iTnews.com.au. All rights reserved.