Australian organisations are seeking to put their stores of machine-generated data to greater use in an ongoing bid to improve security, IT service delivery and customer loyalty.
Members of an iTnews panel of public and private sector chief information officers this month highlighted the value that might be derived from data sources like log files and stock tickers.
One CIO told the panel he planned to analyse data from his organisation's customer relationship management system to flag dissatisfied customers before they changed providers.
John Talbot and Tony Yortis, CIOs of engineering consultancies Coffey International and Sinclair Knight Merz respectively, highlighted the deluge of geospatial data at their organisations.
Yortis told the panel that Sinclair Knight Merz collected up to 10GB of data a second from its geospatial surveys, of which 70 to 80 percent was unstructured.
“It’s a big challenge for us, in terms of just managing the knowledge that we create and selling that into the market,” he said.
Chris Moran, enterprise architect of NSW financing authority TCorp, hoped to extract market information from some 100GB of financial, news and social media data the authority harvested daily.
TCorp deployed an analytics engine from panel sponsor Splunk in 2010 to monitor its IT infrastructure and eliminate delays between the execution and settlement of trades.
Splunk described itself as “Google for machine data”, allowing non-experts to apply familiar search terms to unstructured, machine-generated data.
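That pitch boils down to schema-on-read: fields are pulled out of raw events at query time rather than defined up front. Below is a minimal Python sketch of the idea; the log format, field names and search terms are invented for illustration, and this is not Splunk's engine or query language.

```python
# Toy "schema-on-read" search over machine data: fields are extracted
# at query time, not when the logs are written. Hypothetical log lines.
import re

RAW_LOGS = [
    '2012-08-14T09:12:01 host=web01 level=ERROR msg="login failed" user=jsmith',
    '2012-08-14T09:12:04 host=web02 level=INFO msg="login ok" user=akhan',
    '2012-08-14T09:12:09 host=web01 level=ERROR msg="login failed" user=jsmith',
]

# key=value pairs, with optional double-quoted values
KV_PATTERN = re.compile(r'(\w+)=("[^"]*"|\S+)')

def search(logs, *terms):
    """Return events containing every term, with fields parsed
    out of the raw text at search time."""
    hits = []
    for line in logs:
        if all(term in line for term in terms):
            event = {k: v.strip('"') for k, v in KV_PATTERN.findall(line)}
            event["_raw"] = line
            hits.append(event)
    return hits

for event in search(RAW_LOGS, "ERROR", "login failed"):
    print(event["host"], event["user"])  # web01 jsmith (twice)
```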
“We said hang on, we’ve accidentally ingested all these market feeds, because that information was in the logs of one of our trading systems,” Moran said.
“Economists started looking at that and saying we can plot what a given currency is doing … we’ve now got years’ worth of data for watching trends.”
Moran said TCorp was looking to apply the Splunk engine to market feeds but had yet to use the technology for investment decisions.
“The investments we deal with have a lot of zeros in it,” he said. “We’re still experimenting. At the moment, it’s just giving a good analyst or a trader another tool.”
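Moran's currency example amounts to a time-series roll-up over values buried in log text. Here is a hedged sketch of that kind of query, assuming a hypothetical log format with pair= and rate= fields; the feed, field names and figures are invented.

```python
# Hypothetical sketch of trending currency quotes found in
# trading-system logs. All log lines and values are invented.
from collections import defaultdict
from statistics import mean

LOG_FEED = [
    "2012-08-13 10:01:02 FX quote pair=AUDUSD rate=1.0512",
    "2012-08-13 14:22:41 FX quote pair=AUDUSD rate=1.0498",
    "2012-08-14 09:05:13 FX quote pair=AUDUSD rate=1.0531",
    "2012-08-14 16:40:55 FX quote pair=AUDUSD rate=1.0547",
]

def daily_averages(lines, pair):
    """Average the quoted rate per day for one currency pair."""
    by_day = defaultdict(list)
    for line in lines:
        if f"pair={pair}" not in line:
            continue
        day = line.split()[0]                        # date prefix
        rate = float(line.rsplit("rate=", 1)[1])     # trailing rate field
        by_day[day].append(rate)
    return {day: mean(rates) for day, rates in sorted(by_day.items())}

for day, avg in daily_averages(LOG_FEED, "AUDUSD").items():
    print(day, round(avg, 4))  # one averaged point per day
```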
Splunk chairman and chief executive officer Godfrey Sullivan said its search engine tended to attract financial services customers for forensic uses, rather than trading.
The Commonwealth Bank and National Australia Bank use Splunk for fraud detection, hoarding data first and identifying and analysing patterns later.
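The hoard-first approach means events are stored verbatim and pattern queries are written retrospectively. A minimal sketch, assuming an invented rule (the same card used in two cities within an hour) over hypothetical stored events:

```python
# "Hoard first, analyse later": events are kept as-is, and a pattern
# query runs over them after the fact. Rule and data are invented.
from datetime import datetime, timedelta

EVENTS = [  # (card, city, timestamp), stored verbatim at ingest time
    ("4111", "Sydney",    "2012-08-14 09:00"),
    ("4111", "Melbourne", "2012-08-14 09:40"),
    ("4222", "Brisbane",  "2012-08-14 10:00"),
    ("4222", "Brisbane",  "2012-08-14 18:00"),
]

def suspicious(events, window=timedelta(hours=1)):
    """Flag cards used in two different cities within the window."""
    parsed = sorted(
        ((card, city, datetime.strptime(ts, "%Y-%m-%d %H:%M"))
         for card, city, ts in events),
        key=lambda e: (e[0], e[2]),  # order by card, then time
    )
    flags = []
    for (c1, city1, t1), (c2, city2, t2) in zip(parsed, parsed[1:]):
        if c1 == c2 and city1 != city2 and t2 - t1 <= window:
            flags.append((c1, city1, city2))
    return flags

print(suspicious(EVENTS))  # [('4111', 'Sydney', 'Melbourne')]
```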
CommBank is also understood to have invested in a Hadoop big data mining platform, engaging data scientists from research organisation NICTA.
Sullivan said Splunk and Hadoop performed complementary functions, with the latter better for storing massive amounts of unstructured data, and the former better for queries.
Splunk was licensed on a per-gigabyte model, with TCorp ingesting some 20GB a day and Splunk's largest customer ingesting some 50TB a day.
“Where most of our customers are using both, they’re using Splunk as the data ingestion and indexing engine, and where the data is older than 30 days, they pipe it off to batch storage,” he said.
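In other words, an age-based routing policy: recent events sit in the fast search index, older ones move to cheap batch storage. The sketch below uses the 30-day threshold from Sullivan's description; everything else (the stores, the event shapes) is assumed for illustration and is not Splunk's actual archiving mechanism.

```python
# Illustrative age-based tiering: recent events stay hot and queryable,
# older events are shipped to batch storage (the Hadoop role above).
from datetime import datetime, timedelta

RETENTION = timedelta(days=30)  # threshold from Sullivan's description

def route(events, now, hot_index, batch_store):
    """Keep recent events in the hot search index; ship anything
    older than the retention window to batch storage."""
    for timestamp, payload in events:
        if now - timestamp <= RETENTION:
            hot_index.append((timestamp, payload))    # indexed, fast to query
        else:
            batch_store.append((timestamp, payload))  # cheap bulk storage

hot, batch = [], []
route(
    [(datetime(2012, 8, 1), "recent log line"),
     (datetime(2012, 6, 1), "old log line")],
    datetime(2012, 8, 14), hot, batch,
)
print(len(hot), len(batch))  # 1 1
```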