Telstra is set to add the Apache Flink data processing engine to its four-year-old event-based network observation environment, with work to begin “over coming months”.
Technology product owner Javed Bolim told the Data Streaming World Tour in Melbourne that the telco would look to pair Flink with its Kafka-based event stream processing capability, with both technologies consumed via managed services from Confluent.
In its product documentation, Confluent states that the most common pairing of the technologies sees “Kafka usually provid[ing] the event streaming while Flink is used to process data from that stream.”
Flink is also designed to work “at in-memory speeds”, the documentation states.
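The division of labour Confluent describes — Kafka supplying the event stream, Flink processing it in memory — can be sketched in outline. The following is a minimal, illustrative Python simulation only, not Telstra's or Confluent's code: the event fields (`device`, `latency_ms`) and the fault threshold are invented, an in-memory queue stands in for a Kafka topic, and a plain function stands in for a Flink job; a real deployment would use the Kafka and Flink client libraries.

```python
from collections import deque

# Stand-in for a Kafka topic: an append-only log of network events.
# Field names ("device", "latency_ms") are invented for illustration.
topic = deque([
    {"device": "router-1", "latency_ms": 12},
    {"device": "router-1", "latency_ms": 480},
    {"device": "router-2", "latency_ms": 9},
])

def process(events, fault_threshold_ms=200):
    """Stand-in for a Flink job: consume the stream and flag any
    'meaningful change' -- here, events breaching a latency threshold."""
    faults = []
    for event in events:
        if event["latency_ms"] > fault_threshold_ms:
            faults.append(event["device"])
    return faults

print(process(topic))  # -> ['router-1']
```

The point of the split is that the log (Kafka) and the computation over it (Flink) scale and fail independently — the processor can be restarted and replay the stream without losing events.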
Telstra did not say specifically what step-change it expects Flink to bring to its event streaming and analysis capabilities.
The telco set up an Apache Kafka-based event streaming platform, chiefly for observability-related use cases, about three to four years ago, according to Bolim.
This takes data from network infrastructure and from devices interacting with Telstra’s various networks and streams it to a central database, where it is used to detect and address faults, inform service development, and ensure that customers get the service levels they pay for.
Event-based observability, as Bolim called it, tracks “any meaningful change” to Telstra’s networks.
It means detecting problems earlier and reducing the mean time to recover full services.
“Everyone is interested in something that happened [in the past] minute [that] we need to fix now,” Bolim said.
Telstra also uses streamed event data to ensure that customers get what they pay for from performance-based service “add-ons”.
“A lot of the data that we do get from the network enables us to develop products on top. So the different offerings you do see on the network, on your services, partly the reason [they exist is because] we’re able to see a lot of data, do a lot of trend analysis of it, and find out what our customer needs so we can develop those products for them,” Bolim said.
“[Then it’s about] how do we ensure customers are getting the value that we’ve promised them with those add-on products.
“For the products we do sell on the network we provide proof-of-value on top, so if you’re paying [for] an add-on, we want to ensure our customers - enterprise, retail and consumers - get the right level of service on top of that with service visibility as well.”
Bolim added that streamed event data is also valuable as the telco makes greater use of AI.
“The foundation of AI and analytics is our data, so we wanted to ensure the data we do get from the network is highly available, high-quality and we can make decisions based on it,” he said.
