Santos to consolidate geoscience data stores


To deliver computing grunt, data via private cloud.

Oil and gas exploration company Santos is nearing the end of an application virtualisation trial that will see it consolidate petabytes of seismic data from facilities in Adelaide, Brisbane and Perth.

Santos hopes to establish a method of virtualising its Windows-based Petrel reservoir software by June, after consolidating half its data and virtualising its Linux-based Paradigm software some 18 months ago.

The move is intended to allow Santos’ 150 geoscientists to access and analyse seismic data in a central data centre instead of relying on local data, applications and high-performance computers.

“We’ve got pockets of data in Perth and Brisbane and Adelaide on individual workstations, on file shares and all over the place,” Santos’ IS subsurface manager Andy Moore told iTnews.

“Everybody’s got their own copy of something; that might be different to the copy that somebody else is working on... there’s a whole raft of issues on the data management side with that traditional approach.”

Moore said he ideally hoped to consolidate all data and applications into Santos’ Adelaide data centre, where geoscientists across the country would access them through a custom web interface.

Santos’ geoscientists have used 3D rendering tool VirtualGL and thin-client software TurboVNC to access Red Hat Enterprise Linux desktop environments from their laptops since late 2010.
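The setup described above — OpenGL applications rendered on the server's GPU and delivered to laptops over a thin-client link — is typically driven from the command line. A minimal sketch of how such a session might be started; the hostname, display number and application name are illustrative assumptions, not Santos' actual configuration:

```shell
# On the Red Hat Enterprise Linux server: start a TurboVNC session.
# vncserver prints the display it allocated, e.g. ":1".
/opt/TurboVNC/bin/vncserver -geometry 1920x1080

# Inside that VNC session, launch the 3D application through VirtualGL,
# so OpenGL rendering happens on the server's GPU rather than the client:
vglrun ./seismic_app    # "seismic_app" is a placeholder binary name

# On the geoscientist's laptop: connect with the TurboVNC viewer
# (server hostname here is hypothetical).
/opt/TurboVNC/bin/vncviewer adelaide-dc.example.com:1
```

The design point is that only compressed 2D frames cross the wide-area network; the seismic data and the GPU stay in the data centre, which is why WAN bandwidth, as Moore notes later, becomes the deciding performance factor.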

Big Data

The environments are hosted on 12 servers in Adelaide. Moore said virtualisation allowed staff to work more collaboratively and pool server resources to analyse larger amounts of data.

“There’s the possibility of the big data idea, where you could start to run analytics on a dataset to reveal attributes of the data that you would not see if you were dealing with it in individual chunks,” he said.

Technology firms like IBM have used big data analytics to draw new information from symptoms, treatments and outcomes in the healthcare industry, and risk and customer habits in banking.

Moore said Santos was "talking to well-known companies like IBM" for assistance with big data analysis.

Using Hadoop data mining technology to locate oil and gas was “theoretically” possible, he said, but it was a “nirvana that nobody has even thought about going in that direction”.

“Certainly, the software is nowhere [near] sophisticated enough to do that sort of thing,” he said.

“That is an area where I’m sure the software will develop in future, but we’re a long way away from that.

“There’s hunch involved; there’s a lot of luck involved as well. I guess the answer would be to combine geoscientists’ experience with the analytics.”


Should the Petrel virtualisation project be successful, Moore expected to host the virtualised Windows environments on an additional 12 servers in its Adelaide server room.

But Moore said the company could pull the plug on any further consolidation if geoscientists experienced any performance issues with the virtualised Windows environments.

“If the performance of virtualising those Windows environments and serving it out to Brisbane and Perth is as good as what a geoscientist has on his desktop today, then we’ll do it,” he said.

“If it’s slower, we won’t,” he said, highlighting bandwidth limitations in Santos’ wide-area network.

“If you provide the data [to geoscientists] in little bits at a time, then that thought process is broken, they get frustrated and the juices don’t flow.

"Finding oil and gas is an art as much as it is a science," he said.
