New hardware to enable a computational brain


Supercomputer algorithms too expensive to run, neuroscientist claims.

Computational neuroscientist Terrence Sejnowski has called for more power-efficient, parallel computing architecture to support future robots that could keep up with the human brain.

Delivering the Graeme Clark Oration in Melbourne this week, the Salk Institute director described a future where the brain was more than just “a part of the body”.

But today’s computing architecture was unlikely to support that vision, he said, describing the brain as “the most complex device in the known universe”.

Although electrons travelled far faster than signals between neurons in the brain, today's sequential algorithms were often difficult to write and expensive to run.

Sejnowski highlighted the Chinese Tianhe-1A supercomputer, which performed 2.57 quadrillion calculations a second and last November was ranked the most powerful supercomputer in the world.

“Just the power to keep that going requires four megawatts – a small power plant,” he said. “The power bill is millions of dollars a year.”
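A quick back-of-envelope check bears those figures out. The sketch below assumes an electricity rate of roughly US$0.10 per kilowatt-hour, which the article does not state:

```python
# Rough annual power bill for a 4 MW machine running continuously.
# The $0.10/kWh rate is an assumption, not a figure from the talk.
POWER_MW = 4.0
HOURS_PER_YEAR = 24 * 365
RATE_PER_KWH = 0.10  # assumed rate

energy_kwh = POWER_MW * 1000 * HOURS_PER_YEAR  # kW x hours
annual_bill = energy_kwh * RATE_PER_KWH
print(f"~${annual_bill / 1e6:.1f} million per year")
```

At that rate the bill comes to roughly $3.5 million a year, consistent with the "millions of dollars" Sejnowski cites.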

He compared Tianhe to a honeybee, which had fewer than one million neurons to control how it foraged for food, travelled long distances, and communicated with others in its colony.

“That supercomputer cannot see as well as the bee, it cannot fly as well as the bee, and – as far as I know – that supercomputer has never reproduced,” Sejnowski said.

“There seems to be a disconnect between what computers can do, and what nature can do.”

Sejnowski and his colleagues aimed to reverse engineer natural principles to reveal how technology may match the capabilities of the 100-billion-neuron, 20-watt human brain.

Logic-based artificial intelligence had so far trumped humans in arithmetic, chess, and theorem proving. But computers still fell behind in vision, motion, and "common sense".

Sejnowski explained that although neural signals spent “many milliseconds” in transit, the brain performed many computations in parallel.

Brains also functioned probabilistically, performing only the calculations that were most likely to produce the desired outcome, he explained.
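One way to picture that probabilistic strategy is approximation by sampling: rather than exhaustively examining every datum, inspect a small random fraction and accept an answer that is very likely close enough. This is an illustrative sketch only, not a model of the brain or of Sejnowski's work:

```python
import random

random.seed(0)

# Estimate the mean of a million values by sampling 0.1% of them,
# trading a little accuracy for a thousandfold reduction in work.
population = list(range(1_000_000))        # exact mean: 499999.5
sample = random.sample(population, 1_000)  # look at only 1,000 values
estimate = sum(sample) / len(sample)

error = abs(estimate - 499999.5) / 499999.5
print(f"estimate {estimate:.0f}, relative error {error:.1%}")
```

The sampled estimate lands within a few percent of the true mean while touching a tiny fraction of the data, which is the spirit of computing only what is most likely to matter.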

The traditional, iterative approach of computers could be expensive – especially in future, human-like robots that had to pronounce words, detect auditory signals and recognise human expressions.

“If you have a problem that can be put into a mathematical form – if you had a learning algorithm that let you learn through experience – [artificial] networks are very, very powerful,” he said.
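The "learning through experience" Sejnowski refers to can be sketched with the simplest possible artificial neuron, a perceptron, adjusting its weights each time it makes a mistake. This toy example, which learns the logical AND function, is vastly simpler than the networks he describes and is included only to show the mechanism:

```python
# A single artificial neuron learning AND from labelled examples.
# Illustrative sketch only -- not a method from the talk.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = [0.0, 0.0]   # connection weights
b = 0.0          # bias
lr = 0.1         # learning rate

for _ in range(20):                          # repeated exposure to examples
    for (x1, x2), target in data:
        out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        err = target - out                   # nudge weights on each mistake
        w[0] += lr * err * x1
        w[1] += lr * err * x2
        b += lr * err

preds = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for (x1, x2), _ in data]
print(preds)  # -> [0, 0, 0, 1]
```

No rule for AND is ever programmed in; the correct behaviour emerges from the examples alone, which is what makes such networks "very, very powerful" when a learning algorithm exists.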

“[But] we can’t afford supercomputers to run these algorithms. We have to build hardware that is able to run things cheaply, efficiently, and in a way that is very lightweight from the point of view of power.”

While fundamental physics and molecular biology dominated the past century's innovations, Sejnowski said the years between 2000 and 2050 would be the "age of information".

Data storage, retrieval and search would likely be key challenges of the years to come, he said.

Copyright © iTnews.com.au. All rights reserved.