Telstra confronts "really abysmal" reality of customer gripes

Looks inwards at its own faults.

Telstra is using an internal research project to understand how “really abysmal” customer service failures from past years continue to fester as long-term dissatisfaction in its net promoter scores.

Dr. Violet Lazarevic.

Customer Insights general manager Dr. Violet Lazarevic said almost one in five customers surveyed after an interaction with Telstra did not rate their experience based on the current interaction, but on a negative experience preceding it.

However, this was not apparent when looking at the telco’s overall net promoter score (NPS) - a common methodology used to track customer satisfaction and brand perception.

Telstra uses a variation of NPS that it calls “episodic NPS” or eNPS; Lazarevic described it as being like an “interaction NPS”.

“Several years ago, somebody mapped out customer journey across our services, and identified the key interactions that our customers have with us - and therefore after every interaction, we survey them,” Lazarevic said.

The interactions are represented as five “episodes” - sales and activation, billing, assurance (support), modify (such as re-contracting or a plan change) and move (if the user moves house).
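NPS itself follows a standard formula: customers rate, on a 0-10 scale, how likely they are to recommend the company, and the score is the percentage of "promoters" (9-10) minus the percentage of "detractors" (0-6). An episodic NPS applies that same formula to responses collected after a single episode. A minimal sketch:

```python
def nps(scores):
    """Compute a Net Promoter Score from 0-10 survey responses.

    Promoters score 9-10, detractors 0-6; passives (7-8) count in the
    denominator but not in either group. The result ranges -100 to +100.
    """
    if not scores:
        raise ValueError("no survey responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# 4 promoters, 3 passives, 3 detractors out of 10 responses
print(nps([10, 9, 9, 10, 8, 7, 7, 3, 5, 6]))  # → 10.0
```

Telstra's eNPS is simply this calculation restricted to surveys sent after one of the five episodes above.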

Measuring misery

The resulting eNPS is one of Telstra’s “key business metrics,” Lazarevic said.

“It goes on all our bonuses, so we care about it a lot, and we spend a lot of time looking at it.”

Telstra is currently undergoing an organisation-wide transformation known as T22. This touches all parts of the telco, including customer service.

Lazarevic joined Telstra in February this year as part of the transformation, and her initial research is targeting root causes of unhappiness in the Telstra base.

Lazarevic indicated she was somewhat suspicious of NPS and set out to understand how it worked.

“Those of you who share my skepticism about this score know that it doesn't behave like every other metric statistically, so it can be a really challenging score to look at,” she said.

“This system actually assumes people are logical. It assumes that when we talk to you about your sales and activation, you're telling us about your sales and activation and not your modify experience you had two months ago.

“We really wanted to test this. We wanted to say ‘is this what's really happening’, or have we been totally assuming something false in the numbers that affects every Telstra employee in the country?”

Walking the metrics minefield

Lazarevic’s team started by analysing the text in “verbatims” - essentially, a field in the NPS survey sent to customers where the customer could explain why they had given their particular scores.

“We looked through verbatims - literally hundreds of thousands of items from our customers over the years,” she said.

“We had a group of analysts read every line, and then we coded all of these to really understand what was happening.

“Over the last three years, we could see 17 percent of all verbatims were not related to the episode at all.

“So already [there’s] some really compelling evidence that when we ask you and instruct you to tell us about how you feel about us, and how likely you are to recommend us based on that episode, you're not following instructions.”
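The coding exercise the analysts performed can be thought of as attaching an episode label to each verbatim and comparing it with the episode the survey actually asked about. A hypothetical sketch (the data and labels here are illustrative, not Telstra's):

```python
from collections import Counter

# Each coded verbatim: (episode the survey asked about,
#                       episode the customer actually wrote about,
#                       as judged by an analyst)
coded = [
    ("billing", "billing"),
    ("sales_activation", "sales_activation"),
    ("assurance", "modify"),     # off-topic: talks about a plan change
    ("billing", "assurance"),    # off-topic: talks about an old fault
    ("move", "move"),
    ("modify", "modify"),
]

off_topic = [c for c in coded if c[0] != c[1]]
share = 100 * len(off_topic) / len(coded)
print(f"{share:.0f}% of verbatims were unrelated to the surveyed episode")

# Which episodes do customers drift onto most often?
print(Counter(actual for asked, actual in off_topic))
```

Run over three years of real verbatims, this is the calculation that surfaced Telstra's 17 percent figure.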

Long-term frustrations

To dig deeper into the problem, Lazarevic’s team decided it needed to “triangulate” the data in eNPS with other types of data.

The team decided to introduce a qualitative element.

“We really needed a ‘qual’ component here to understand what was happening, to unpack with customers what were they thinking [and] what were they going through when they were getting these surveys from us,” Lazarevic said.

This took the form of focus groups and produced a large amount of additional insight into why Telstra struggled to move the dial on customer satisfaction.

“One thing we found out, which was really interesting is that people did want to talk about the episodes that they were experiencing, but there were other things happening that overshadowed those experiences and what they wanted to actually talk about or score when we gave them a survey,” Lazarevic said.

The research also found - unsurprisingly - that many customers had been with Telstra for an extended period of time. Telstra may have been their only telco, though that wasn’t always by choice.

Familiarity breeds

“One of the really interesting interaction effects here was actually happening from tenure of the relationship with Telstra,” Lazarevic said.  

“There were two ways this was happening: it was almost an opposing effect.

“We had people who'd been with Telstra a long time, who'd had some really positive experiences over that time, and almost had this ‘halo effect’ happening where we could do no wrong.

“They loved the brand, they felt a bit of ownership over it, they connected with the Australian heritage - and so they were actually probably giving us a score better than we deserved, based on the episode.

“But then ... there were some people who'd been with Telstra a really long time, and they'd had a really abysmal failure at some point.

“There was one guy who stood out really clearly to me, who'd had no internet for three months, and it had taken us that long to fix the issue. That was multiple interactions, multiple calls, visits to the store, etc.

“This poor guy was so frustrated with Telstra. He still stayed with us - [but] he used the word 'tainted' - everything he thought about Telstra from then on was tainted, and his frustration was palpable when you listened to him talk.”

Once bitten

The idea that customers’ survey responses could be ‘tainted’ by past events wasn’t apparent “just from looking at the eNPS” - and soon it became clear there were more reasons why a customer might reach this level of frustration with Telstra.

“We then found out that there are people who live in regional or rural areas who only can be with Telstra, because none of the other providers have gone out that far,” Lazarevic said.

“Some were super grateful to Telstra that they'd gone to the area when no one else had, and then there were some who were really frustrated that they could only be with Telstra, and so it was almost a negative halo effect happening on the brand.”

Another cohort of customers were negative about Telstra due to a poor experience suffered by a family member who was ill or had some form of disability.

“They took that on personally, and we decided to call that theme in our analysis 'Telstra families' because the impact of what has happened to their family had actually impacted how they then felt about Telstra,” Lazarevic said.

“They took all of that [into account] when they were surveyed by Telstra again.”

According to Lazarevic, the focus groups exposed “naivety” in Telstra’s over-reliance on a single measure of customer satisfaction.

“Our naive approach of thinking 'we instruct you to talk about an episode so you do' is that: it's naive,” she said.

“There are all these other emotional, really complex things going on for customers that we needed to take into account.”

Addressing the root causes

Telstra took its learnings from the focus groups and eNPS analysis and used them to build some “really massive regression models so that we could clearly identify what each variable was doing, and how it was affecting what our final eNPS score was.”

The telco uses these models to “tease out some of these nuances” in customer experience, as well as to compare the experience of subgroups such as metro versus regional users.

“Had we not done the ‘qual’, would we have known to look for those things in these models? Possibly the data would have led us there, possibly it wouldn't have,” Lazarevic said.

“This approach allowed us to get to a place where we had a predictive model for our eNPS that was much, much better than the one we started with.”

Lazarevic said that Telstra’s experience highlighted the importance of not relying on a single dataset or score but of triangulating it with other data to fill any gaps in understanding.

“You can't fly your business off of one metric, and this is exactly what we were trying to do,” she said.

“One information source like eNPS or NPS or voice of customer or CX or whatever you want to call it, only gives you pieces of the puzzle.

“You really need to fit in these other lenses of understanding to uncover what is happening with your customers, to truly understand the insight behind the interactions with your customers, and truly also understand where you then need to focus your efforts as an organisation in terms of what you need to fix.”

Learning from past mistakes

Telstra’s work to date had put it in a better position to predict future eNPS scores.

“Your journey with your customer is never over,” Lazarevic said.

“It's not about at one point in time, what's the eNPS? It's about trying to understand 'What is the eNPS going to be like next time? Can we overcome some of these negative or positive halo effects? Can we influence these positive or negative perception biases? And what did we do to deserve them in the first place?

“Maybe that's where we need to start with what we need to fix.”

Copyright © iTnews.com.au. All rights reserved.