Great Southern Bank will soon start rolling out its first AI agents after spending years conditioning its data environment for the technology.
The bank’s head of customer technology, data and AI, Matt Cammack, said that in 2021 the bank pursued a “scorched earth” platform modernisation program to address structural and hygiene problems with its data, which was spread across legacy systems that had evolved separately over the prior 75 years.
The former credit union’s need to clean up its data was partly driven by new reporting obligations it would face when its assets hit $20 billion in value in 2024.
However, Cammack said the need was driven above all by the bank’s strategic ambition to use AI and automation to compete against its larger rivals as the new technologies came into view.
“We are now accelerating into new AI use cases,” he said.
“We’re looking at forecasting, stress testing, scenario planning, and we’re about to start deploying our first agents.
“We started bringing structured and unstructured data from our small business customers into the platform and we’re using it to automate and streamline our business assurance activities that we were doing previously.”
The bank has tied its fortunes tightly to Databricks. It has eliminated three main data warehouses in favour of a single view of its data using the Databricks Lakehouse, with Unity Catalog for governance. Sitting on top are Databricks Genie natural language business intelligence modules.
The shift has allowed the bank to cut the time involved in some reporting obligations from days down to hours.
It’s also been able to eliminate “fact-checking factories” that the bank needed to deal with inconsistencies in reporting information that arose when it was spread across legacy systems.
By 2025, Cammack said, improvements in the bank’s data quality allowed it to analyse financial data with a greater degree of confidence and improve its capital allocation.
It has also been able to bring some of its AI modelling back in-house from third-party vendors, strengthening its intellectual property position.
“That was very costly, it was opaque, but it was also very difficult to adapt as the needs of our business changed. So one of the analysts in our team rebuilt all of those models in Databricks in three months.
“It was almost a pet project for him. It was an astonishing achievement: it matched and, in most cases, improved on the performance of the models we had from our third-party vendors. That was not just a cost saving, it also meant the IP was retained by us,” Cammack said.
However, he conceded that the modernisation program wasn’t one that another bank of its size and budget would pursue lightly.
“We decided we couldn’t just incrementally evolve the many platforms and systems that we had. We had to reset.
“We had to build from the ground up a new capability that was going to serve and sustain us into the future.
“Some people in the business called it ‘scorched earth’. Now, for mid-tier organisations like us, that’s really not a trivial decision, but we also knew that, without it, we couldn’t sustainably scale.
“We couldn’t achieve the level of governance and security that we wanted for our customers. And most importantly, we couldn’t unlock the value of data and AI in pursuit of our strategic ambitions,” Cammack said.
