Australian organisations racing to deploy artificial intelligence are entering a new phase of maturity, one where success will depend less on prompts and models and more on the quality of the data feeding them.
That was the message from Elastic chief product officer Ken Exner, who used his keynote at Elastic{ON} Sydney to explain to a tech audience of 500 that enterprise AI is moving rapidly beyond experimentation toward what he calls context engineering.
Exner said organisations have already moved through several phases of generative AI adoption over the past three years.
“We’ve watched companies go through a cycle,” Exner told attendees.
“First there was excitement about the technology. Then urgency, as boards pushed teams to implement something quickly. After that came a period of disillusionment when those early pilots didn’t deliver the ROI people expected.”
Now, he said, the industry is entering a new phase of acceleration as businesses begin to understand what is required to deploy AI successfully at scale.
“AI in production is fundamentally a data problem,” Exner said.
“If the model doesn’t have the right context, it doesn’t matter how powerful the model is. It won’t produce the right answer.”
Prompt engineering to context engineering
Early generative AI experiments focused heavily on crafting prompts and deploying chatbot-style interfaces.
But as organisations attempted to scale those pilots into production systems, their limitations quickly became clear.
“Those pilots were useful,” Exner said. “But they weren’t transformative.”
Instead of simply prompting models, enterprises now face a more complex challenge: ensuring AI systems can access and interpret the right enterprise data at the right time.
That requires retrieving information from both structured and unstructured sources (such as PDFs, emails and Slack messages) across applications, cloud platforms and infrastructure systems.
Exner said the discipline emerging to solve that challenge is context engineering: the process of delivering relevant enterprise data to AI systems in real time.
Rather than simply deploying large language models (LLMs), enterprises must design systems that connect AI to operational data across the business.
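In practice, context engineering often takes the shape of a retrieval step that selects the most relevant enterprise documents before the model is prompted. The sketch below is illustrative only: the keyword-overlap scoring and function names are assumptions for the example, not Elastic's implementation, which would use a search engine and vector embeddings.

```python
import re

# Illustrative context-engineering sketch: pick the most relevant
# documents for a question, then assemble the model's prompt from them.
# Keyword overlap stands in for real relevance ranking (assumption).

def score(question: str, document: str) -> int:
    """Count how many terms from the question appear in the document."""
    terms = set(re.findall(r"\w+", question.lower()))
    words = set(re.findall(r"\w+", document.lower()))
    return len(terms & words)

def build_context(question: str, documents: list[str], top_k: int = 2) -> str:
    """Return a prompt containing the top_k most relevant documents."""
    ranked = sorted(documents, key=lambda d: score(question, d), reverse=True)
    context = "\n".join(ranked[:top_k])
    return f"Context:\n{context}\n\nQuestion: {question}"

docs = [
    "Invoice workflow runs nightly on the billing cluster.",
    "The cafeteria menu changes every Monday.",
    "Billing cluster alerts are routed to the payments team.",
]
prompt = build_context("Who handles billing cluster alerts?", docs)
```

The point is the division of labour: the retrieval layer, not the model, decides which slice of enterprise data the AI sees, which is why data quality and relevance ranking dominate the outcome.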
Indeed, Exner said the shift is already reshaping how engineers build AI systems.
“More and more, developers are spending their time on context engineering,” he said. “They are figuring out what prompts to use, what data to bring in, and how multiple agents should collaborate.”
He said the implication for enterprises is significant. “Successful AI deployments will increasingly depend on systems that can retrieve and organise enterprise data with precision.”
Relevance becomes critical for AI systems
For Exner, the challenge of providing AI with accurate data builds on a principle that has long underpinned search technology: relevance.
“The key to search has always been relevance,” Exner said. “It’s the difference between a good result and a bad result.”
In traditional search systems, users might review multiple results and apply their own judgement. But generative AI systems typically produce a single answer.
That, Exner said, makes the quality of the underlying data even more important.
“If an AI system only sees part of your data, it’s only telling part of the story,” Exner said. “And if it produces the wrong answer, that erodes trust very quickly.”
Elastic’s platform is designed to address that challenge by allowing organisations to search across structured and unstructured data sources, from application logs and infrastructure metrics to security data and operational systems.
According to Exner, giving AI systems access to that broader data context is essential for building reliable AI applications.
Why early AI pilots disappointed
Exner said many organisations initially underestimated the complexity of deploying AI in real-world environments.
A common mistake was treating AI as a chatbot problem.
Companies deployed ChatGPT-style assistants for internal functions such as HR, sales or support. While useful, those tools rarely delivered the operational transformation executives expected.
That experience contributed to a period of scepticism across the industry, with many organisations questioning whether generative AI could deliver meaningful return on investment.
However, Exner believes the industry has now reached another inflection point.
Recent advances in AI models, particularly those designed for software development and reasoning tasks, have dramatically improved the ability of AI systems to perform complex work.
“Developers suddenly realised these tools weren’t just incrementally better,” he said. “They were dramatically better.”
As a result, the conversation inside many organisations has shifted from whether AI will deliver value to how quickly it will reshape core workflows.

Building the architecture behind agentic AI
The next challenge is enabling agentic AI: systems that do more than generate text and instead execute tasks autonomously.
According to Exner, those systems require a new application architecture.
Organisations must retrieve information from multiple data sources, generate embeddings, apply ranking and retrieval techniques, and orchestrate AI agents alongside deterministic workflows.
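The orchestration piece of that architecture can be pictured as a pipeline in which deterministic steps and agent steps share one accumulating state. The sketch below is a hypothetical minimal version: the step functions, state keys and the stand-in "agent" are assumptions for illustration, with a plain function where a real system would call an LLM.

```python
from typing import Callable

# Illustrative agent-orchestration sketch: deterministic steps and
# "agent" steps run through one pipeline, each seeing the state
# produced by the steps before it.
Step = Callable[[dict], dict]

def retrieve(state: dict) -> dict:
    # Deterministic step: fetch matching records from a data source.
    state["docs"] = [d for d in state["source"] if state["query"] in d]
    return state

def summarise(state: dict) -> dict:
    # Agent step: a real system would call an LLM here with the
    # retrieved context; joining the documents stands in for that.
    state["answer"] = " / ".join(state["docs"]) or "no results"
    return state

def run_pipeline(steps: list[Step], state: dict) -> dict:
    for step in steps:
        state = step(state)
    return state

result = run_pipeline(
    [retrieve, summarise],
    {"query": "disk",
     "source": ["disk full on node-3", "login spike", "disk latency rising"]},
)
```

Keeping retrieval deterministic while isolating the model call in its own step is what lets the same architecture swap in embeddings, re-ranking or additional agents without rewriting the workflow.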
Elastic is positioning its platform as the infrastructure layer enabling that architecture.
The company has spent several years expanding beyond search into observability, security and AI infrastructure, allowing organisations to analyse multiple types of operational data within a single platform.
That unified data layer, Exner said, is becoming increasingly important as organisations attempt to operationalise AI across large, distributed environments.
AI meets observability and security
Beyond search and AI applications, Elastic is also embedding AI capabilities across observability and security workflows.
In observability, the company is using machine learning to extract insights from log data, historically one of the most information-rich but difficult datasets for engineers to analyse.
“Logs actually contain everything you need to troubleshoot a system,” Exner said. “The challenge historically has been that there’s simply too much information.”
New capabilities aim to automatically organise log data into logical streams and identify significant events that require investigation.
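The idea of organising logs into streams and surfacing significant events can be sketched simply. The grouping key (service name) and the severity-based flagging below are assumptions for the example, not Elastic's actual stream-partitioning logic.

```python
from collections import defaultdict

# Illustrative sketch: partition raw log lines into per-service
# streams, then surface the entries that likely need investigation.
LOGS = [
    "billing ERROR payment gateway timeout",
    "auth INFO user login ok",
    "billing INFO invoice sent",
    "auth ERROR token validation failed",
]

def to_streams(lines: list[str]) -> dict[str, list[str]]:
    """Group log lines into streams keyed by service name (assumption)."""
    streams: dict[str, list[str]] = defaultdict(list)
    for line in lines:
        service, _, rest = line.partition(" ")
        streams[service].append(rest)
    return streams

def significant(streams: dict[str, list[str]]) -> list[str]:
    """Flag entries worth investigating, here anything at ERROR level."""
    return [f"{svc}: {msg}" for svc, msgs in streams.items()
            for msg in msgs if msg.startswith("ERROR")]

alerts = significant(to_streams(LOGS))
```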
In security operations, Elastic is applying AI to reduce alert fatigue for analysts.
Tools such as Elastic Attack Discovery automatically correlate large volumes of security alerts into attack narratives, helping analysts identify the most critical threats more quickly.
The company is also introducing workflow automation capabilities designed to accelerate incident response.
For enterprises, the next hurdle will be ensuring those systems operate reliably at scale. That means building platforms capable of retrieving and analysing data across infrastructure, applications and operational systems.
For Exner, organisations that solve that problem will be the ones that unlock the real value of AI.
Elastic's deep expertise in search technology and artificial intelligence forms the foundation of its platform's search, observability and security solutions. Elastic{ON} Sydney was part of a 12-city world tour including stops in Tokyo, London, New York and San Francisco. For more information on Elastic, visit elastic.co.
