The long-awaited decision late last week to split the Square Kilometre Array project between rival bidders South Africa and Australia has not dampened bandwidth requirements for the local project.
Although "the majority of SKA dishes" under phase one will be built in South Africa from 2016, Australian project director Brian Boyle said Australian IT requirements were largely unchanged from those outlined in its single-site proposal.
"The phase one deployment in Australia is pretty much close to the full phase one deployment that we had anticipated in terms of overall IT requirements," Boyle told iTnews.
"The dual-site model changes nothing in that regard. We're not correlating data in real-time with South Africa.
"It's very much getting the reduced data from Australia out to the rest of the world and we have planned for more than sufficient bandwidth in order to do that."
An analysis of the SKA dual-site solution given to organisers this month suggested 1.4 terabits per second of bandwidth would be needed for the Australian portion of the project at its peak.
Australia's ability to meet those needs was questioned during the bidding process, with organisers raising concerns over bandwidth costs and connectivity.
A decision in the years-long bidding war between Australia and South Africa was delayed in March when organisers of the $2 billion project could not reach a majority decision on a single winner.
At the time, the SKA Site Advisory Committee said Australia had sufficient local fibre connectivity but the cost of additional active components - estimated at hundreds of millions of dollars - left the country with a "medium to high level weakness" against South Africa.
Australian phase one array to generate petabits per second
The ultimate decision, handed down in the Netherlands on Friday, will see Australia expand the $100 million Australian SKA Pathfinder under construction in remote Western Australia to add low frequency and survey telescopes by 2020.
The low frequency range is optimised for large-scale surveys, and particularly for researching topics such as dark matter and dark energy, according to academics.
South Africa will build out its MeerKAT array to explore portions of space in the medium to higher frequencies, optimised for detecting weaker signals in space.
According to Boyle, New Zealand - a joint bidder with Australia - would not play a significant part in the project until phase two, scheduled for early next decade.
While Australia's responsibilities under the SKA are somewhat reduced compared to its bid, Boyle estimated the project would still gather some six petabits per second of data from local telescopes during phase one.
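The scale of that figure is easier to grasp as a back-of-envelope calculation. The sketch below is illustrative only: the six petabits per second comes from Boyle's estimate above, and everything else is simple unit conversion.

```python
# Back-of-envelope conversion of the quoted phase one data rate.
# The 6 Pbit/s figure is from the article; the rest is arithmetic.

RAW_RATE_BITS_PER_SEC = 6e15        # six petabits per second
SECONDS_PER_DAY = 86_400

bytes_per_sec = RAW_RATE_BITS_PER_SEC / 8          # 750 terabytes every second
bytes_per_day = bytes_per_sec * SECONDS_PER_DAY    # roughly 65 exabytes per day

print(f"{bytes_per_sec / 1e12:.0f} TB per second")
print(f"{bytes_per_day / 1e18:.1f} EB per day")
```

At that rate the telescopes would produce around 65 exabytes of raw data a day, far beyond what any facility could store in full.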
It would use $46 million worth of local and international fibre connectivity earmarked for use by the ASKAP project to operate the array in Western Australia, including a 40 Gbps connection between the telescopes and the Pawsey Centre.
It leverages government-funded fibre links built under the $250 million Regional Broadband Blackspot Program to Geraldton in Western Australia, while drawing on international capacity leased by the Australian research network AARNet.
Much of the data generated by the telescopes will be processed on-site, as well as at the $80 million Pawsey Centre in Perth. Australian data would be stored locally but replicated globally.
Datasets gathered separately by the South African and Australian telescopes would only be collated on demand by scientists who required both sets for specific experiments.
"You'd imagine there would be repositories around the world, including in South Africa and Australia themselves, where all the datasets would be brought together and you'd be allowed the chance to look at objects across a wide frequency range with data taken from both sites," Boyle said.
Up to 99 percent of the raw data generated under phase one of the SKA will ultimately be dumped as scientists seek to rein in the sheer amount of storage required for the project.
Boyle said the organisation would store "as much as we can afford to", with an effort to retain as much raw and processed data as possible.
"It's certainly petascale and it will be pushing the exascale in the size of the data," he said.
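Those two figures - a raw rate of roughly six petabits per second and a 99 percent discard rate - can be checked against each other with simple arithmetic. The sketch below uses only the numbers quoted in this article and is illustrative rather than an official SKA projection.

```python
# Rough estimate of retained data volume, using the article's figures:
# ~6 Pbit/s of raw data, of which up to 99 percent is discarded.

RAW_RATE_BITS_PER_SEC = 6e15   # six petabits per second (quoted above)
RETAINED_FRACTION = 0.01       # keep ~1% after the 99% cut
SECONDS_PER_DAY = 86_400

retained_bytes_per_day = (RAW_RATE_BITS_PER_SEC * RETAINED_FRACTION / 8
                          * SECONDS_PER_DAY)
print(f"~{retained_bytes_per_day / 1e15:.0f} PB retained per day")
```

Even after the 99 percent cut, the retained volume sits in the hundreds of petabytes per day, which squares with Boyle's description of a petascale archive "pushing the exascale".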