US researchers have proposed new benchmarks to test supercomputers' data handling capabilities rather than speed.
Designed to complement the popular Top500 list, Graph500 ranked computers on their ability to perform the complex, data-intensive analytics involved in today's medical research and social networks.
It ranked computers across six input categories - huge (1.1PB), large (140TB), medium (17TB), small (1TB), mini (140GB), and toy (17GB) - and measured performance in traversed edges per second (TEPS), the rate at which data transfers took place.
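The traversed-edges-per-second idea can be illustrated with a minimal breadth-first-search sketch. This is a toy illustration under assumed simplifications, not the official Graph500 reference code: the real benchmark traverses enormous synthetic graphs, while this counts edge traversals over a small in-memory adjacency list.

```python
import time
from collections import deque

def bfs_teps(adj, source):
    """Breadth-first search over an adjacency list, counting traversed
    edges and reporting traversed edges per second (TEPS)."""
    visited = {source}
    queue = deque([source])
    edges_traversed = 0
    start = time.perf_counter()
    while queue:
        node = queue.popleft()
        for neighbour in adj.get(node, ()):
            edges_traversed += 1  # every edge examined counts as a traversal
            if neighbour not in visited:
                visited.add(neighbour)
                queue.append(neighbour)
    elapsed = time.perf_counter() - start
    teps = edges_traversed / elapsed if elapsed > 0 else float("inf")
    return edges_traversed, teps

# Tiny example graph; a benchmark-scale run would use inputs ranging
# from 17GB ("toy") up to 1.1PB ("huge").
graph = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
edges, teps = bfs_teps(graph, 0)
```

Because the metric rewards how fast a machine can chase pointers through memory rather than how fast it can multiply numbers, it stresses interconnects and memory systems instead of floating-point units.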
When researchers tested nine supercomputers against the new benchmark, none were able to handle problems in the huge or large categories.
Argonne National Laboratory's BlueGene/P, dubbed Intrepid, topped the Graph500 list last week.
Intrepid was ranked 13th on the Top500, which used the Linpack benchmark to measure supercomputers' speed in solving a basic numerical problem.
"Top500 has really succeeded in getting computer manufacturers to care about FLOPS [floating-point operations per second]," said lead researcher Richard Murphy of Sandia National Laboratories.
"We'd love to influence the same group, many of whom are on our steering committee, to care more about data movement."
Murphy explained that while traditional supercomputing examined data - for example, simulating how chemical compounds interact - data-intensive supercomputing would discover relationships within a data set.
Instead of testing a hypothesis, data-intensive supercomputing was about "asking the computer to find hypotheses for us", he told iTnews.
He expected data-intensive computing to come to the fore in cybersecurity, medical informatics, data enrichment, social networks and symbolic networks in the coming years.
"Think of trying to keep track of every piece of cargo moving around the planet, examining huge medical research databases to find patterns, or simulating the human cerebral cortex," he said.
"These are very different problems from looking at how chemical compounds interact or crash-testing a car.
"Many of us on the Graph500 steering committee believe that data intensive problems will dominate the application set over the next decade."
No Australian supercomputers were tested against the Graph500 benchmark, which favoured machines with strong interconnection networks and memory systems.
The rankings were:
Copyright © iTnews.com.au. All rights reserved.