Nine Entertainment has switched out parts of its live streaming infrastructure using AWS cloud-based services to improve the stability of the platform.
Digital development director Kunaal Ramchandani told AWS re:Invent 2018 last week that the company had seen sizable growth in its live streams since launching channels at the start of 2016.
“At the start [January 2016] we were only doing a few thousand streams per month, and then we had a peak of 5 million a few months ago [June 2018] and the last few months we’ve been averaging about 4 million,” Ramchandani said.
Stream numbers were initially low because Nine lacked the content rights for the programming people actually wanted to watch.
“That left us with a very ‘interesting’ experience for users because they would come and log in and get ready for their show and then we’d be forced to show them a blank slate,” he said.
“Naturally, this led to a lot of user complaints.”
That feedback prompted Nine to shift its strategy and acquire more streaming rights. Its June 2018 peak was fed by two pieces of content: the NRL’s annual State of Origin series, and an Australian version of the British reality TV program ‘Love Island’.
“The company went with a strategy of trying to attract a younger audience so 16-24 age group, which we did, but what we didn’t anticipate was the number of people that actually watched that show on our live streaming platform versus linear TV,” Ramchandani said.
“Fifteen episodes of Love Island had a higher viewership on our platform versus on linear TV which is pretty phenomenal. It remains Nine’s most viewed non-sports event to date.”
However, Nine’s live streaming platform was made up of many components that, taken together, weren’t handling the growth smoothly.
“We took a feed from the broadcast room, sent that to a Teradek encoder, and that would then get sent to our data centre where it would have to be decoded, sent to a TriCaster where someone could connect and append or edit metadata, then get sent to a Spinnaker encoder and re-encoded, to finally get an HLS stream that gets sent to a CDN before finally getting to the player,” Ramchandani said.
“Lots of hardware, lots of potential points of failure, and as you can imagine as the stress on these boxes increased and the uptake increased, so did some of the issues.”
Ramchandani said that the setup itself wasn’t the issue since it “actually did what it needed to do”.
“The problem is when you have a lean team, you don’t really have the luxury to deep-dive into a lot of the problems that you have, especially when in true IT style 90 percent of these issues are fixed with a simple reboot,” he explained.
“That left me with unhappy consumers because their streams were dropping during events that they really wanted to see but it also left me with unhappy devs because I’m getting calls at 2am and 3am because the stream’s down and someone needs to get up and reboot the boxes.”
Knowing that was unsustainable, Ramchandani called in AWS. Nine is nearly five years into a digital transformation in which it has been “throwing everything we could at the cloud successfully”.
He was interested to see if AWS Media Services, launched at last year’s re:Invent, could help smooth out some of the stability and performance issues.
The two parties nutted out a simpler architecture in the space of a single meeting.
“We now have a Wirecast box, we get the stream from our broadcast, the producers can add or append metadata straight away. We then send that through to the cloud or to [AWS Elemental] MediaLive, MediaLive converts it to HLS straight away. That then gets sent to [AWS Elemental] MediaStore where it’s ready for CloudFront and out to the player,” Ramchandani said.
“I should also mention this entire proof of concept was set up within 40 minutes. By the time they [AWS] walked out of the room, we were ready to stream live.”
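The chain Ramchandani describes - contribution feed into MediaLive, HLS out to MediaStore, delivery via CloudFront - can be sketched as the configuration payloads each stage needs. This is a rough illustration only, with hypothetical names and far fewer settings than a real MediaLive channel requires; the endpoint and stream names are invented for the example.

```python
def build_live_pipeline(event_name: str, mediastore_endpoint: str) -> dict:
    """Assemble illustrative payloads for a Wirecast -> MediaLive ->
    MediaStore -> CloudFront chain, mirroring the flow described above."""
    rtmp_input = {
        # MediaLive ingests the contribution feed pushed from the Wirecast box.
        "Name": f"{event_name}-input",
        "Type": "RTMP_PUSH",
        "Destinations": [{"StreamName": f"{event_name}/primary"}],
    }
    channel = {
        # MediaLive transcodes the feed and packages it as HLS straight away.
        "Name": f"{event_name}-channel",
        "OutputGroups": [{
            "OutputGroupSettings": {
                "HlsGroupSettings": {
                    # HLS segments land in MediaStore, which CloudFront fronts.
                    "Destination": f"mediastoressl://{mediastore_endpoint}/{event_name}"
                }
            }
        }],
    }
    distribution = {
        # CloudFront serves the manifest and segments out to the player.
        "Origin": mediastore_endpoint,
        "PlaybackPath": f"/{event_name}/index.m3u8",
    }
    return {"input": rtmp_input, "channel": channel, "cdn": distribution}
```

In practice these payloads would be handed to the MediaLive and MediaStore APIs, which is what makes a 40-minute proof of concept plausible: there is no hardware to rack, only configuration to submit.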
Making the new setup production-ready
Still, Ramchandani wasn’t in a super rush to get it into production.
“Knowing if you take a proof of concept live you’re not going to get a chance to change it, we thought ‘let’s just take a step back and productionise it just a little bit’,” he said.
The changes added a deployment pipeline that allows Nine’s DevOps team to code and push out configuration changes to the setup.
“Devs push their code through to Stash, we use TeamCity for our CI/CD, and then, using the APIs that Amazon has provided for MediaLive and MediaStore, we push those changes through; and we use CloudFormation and CF templates to push changes through for CloudFront,” he said.
“It’s a fairly simple setup.”
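For the CloudFront leg of that pipeline, the CF template Ramchandani mentions might look something like the minimal sketch below. The resource names and origin domain are hypothetical, and a production template would carry many more distribution settings; the MediaLive and MediaStore legs would go through their respective APIs rather than CloudFormation, as he notes.

```python
import json

def cloudfront_template(mediastore_domain: str) -> str:
    """Render a minimal CloudFormation template for a CloudFront
    distribution fronting a MediaStore origin (illustrative only)."""
    template = {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            # Hypothetical logical ID for the live streaming CDN.
            "LiveStreamCdn": {
                "Type": "AWS::CloudFront::Distribution",
                "Properties": {
                    "DistributionConfig": {
                        "Enabled": True,
                        "Origins": [{
                            "Id": "mediastore-origin",
                            "DomainName": mediastore_domain,
                            "CustomOriginConfig": {
                                "OriginProtocolPolicy": "https-only"
                            },
                        }],
                        "DefaultCacheBehavior": {
                            "TargetOriginId": "mediastore-origin",
                            "ViewerProtocolPolicy": "redirect-to-https",
                        },
                    }
                },
            }
        },
    }
    return json.dumps(template, indent=2)
```

A TeamCity build step could then hand the rendered template to a CloudFormation stack update, so CDN changes flow through the same review-and-deploy path as application code.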
Nine initially took the new streamlined live streaming setup live just for news streaming.
“We ran this as a split stream so we had the old and new infrastructure running,” Ramchandani said.
“We used it for news initially just to see what the numbers looked like.
“It was a lot more stable as expected, and I suddenly had much happier devs.”
With stability and extra time, Ramchandani’s team started to measure changes in quality-of-service metrics.
“We saw that playback success, which is the number of playbacks without errors, improved,” he said.
“Our startup time, which is time to first frame, improved; smoothness, which is the lack of rebuffering in the player, improved; and video quality, which is the amount of upscaling and downscaling required to fill a video player, also improved.
“And for the bean counters - the all-important metric for them - cost actually went down, which shouldn’t be a surprise because we got rid of a whole bunch of boxes and swapped them for AWS.”
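The four quality-of-service measures Ramchandani lists can be computed from per-session playback telemetry. The sketch below shows one plausible way to do so, with invented field names and sample data; real players report these events in their own schemas.

```python
from statistics import median

def qos_summary(sessions: list[dict]) -> dict:
    """Summarise the four metrics discussed: playback success, startup
    time, smoothness (rebuffering), and up/downscaling (illustrative)."""
    started = [s for s in sessions if not s["error"]]
    total_watch_s = sum(s["watch_s"] for s in started)
    total_frames = sum(s["frames"] for s in started)
    return {
        # Share of playbacks that completed without errors.
        "playback_success": len(started) / len(sessions),
        # Median time to first frame across successful sessions.
        "startup_time_s": median(s["startup_s"] for s in started),
        # Seconds stalled per second watched - lower means smoother.
        "rebuffer_ratio": sum(s["stall_s"] for s in started) / total_watch_s,
        # Share of frames that needed up/downscaling to fill the player.
        "scaled_frame_ratio": sum(s["scaled_frames"] for s in started) / total_frames,
    }

# Hypothetical sample telemetry: two clean sessions and one failed start.
sessions = [
    {"error": False, "startup_s": 1.2, "watch_s": 600, "stall_s": 3,
     "frames": 15000, "scaled_frames": 300},
    {"error": False, "startup_s": 0.8, "watch_s": 300, "stall_s": 0,
     "frames": 7500, "scaled_frames": 0},
    {"error": True, "startup_s": 0.0, "watch_s": 0, "stall_s": 0,
     "frames": 0, "scaled_frames": 0},
]
```

Tracking these as ratios rather than raw counts is what lets a team compare the old and new infrastructure fairly while audience numbers keep growing.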
Australian Open streams
Ramchandani said that Nine is now looking at how it can use the new streaming setup to handle a wider number of needs, including around “ad hoc and pop-up events”.
That means events such as Nine’s future coverage of the Australian Open tennis are likely to stream using the platform.
“Nine’s going to be the home of tennis for the next 5-6 years in Australia, and contractually we have to show 16 courts where the Australian Open takes place,” he said.
“The tournament only takes place a few weeks a year so it just does not make sense to go and invest in a lot of infrastructure for 16 courts that I’m only going to use for a couple of weeks in the year.
“So something like this is really useful because it gives us speed-to-market without outlaying huge amounts of cash.”