This week, researchers from Stanford University reported they had created an artificial neural network with 11.2 billion simulated connections, 6.5 times more than Google's.
The remarkable part is that this was achieved with only 16 servers containing NVIDIA graphics processing units (GPUs) in addition to traditional CPUs. In fact, reproducing the same network as Google's took only three machines, compared with the 1,000 used in Google's experiments.
Google's network simulated 1.7 billion connections and taught itself to recognise cats, faces and bodies in stills taken from YouTube videos.
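A "connection" in these networks is simply a numerical weight linking two simulated neurons, which is why the counts climb into the billions so quickly: in fully connected layers, every neuron in one layer links to every neuron in the next. A minimal sketch, with hypothetical layer sizes rather than those of the Stanford or Google networks:

```python
def count_connections(layer_sizes):
    """Number of weights in a fully connected feed-forward network:
    each neuron in one layer connects to every neuron in the next."""
    return sum(a * b for a, b in zip(layer_sizes, layer_sizes[1:]))

# Hypothetical example: just four fully connected layers of 10,000
# neurons each already give 300 million connections.
print(count_connections([10_000, 10_000, 10_000, 10_000]))  # 300000000
```

Connection counts grow with the product of adjacent layer sizes, so modest increases in layer width produce very large networks.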
Although originally designed to handle the real-time, high-speed graphics required by games consoles and high-end PCs, graphics processors are increasingly used for specialised calculations that they can perform much faster than traditional CPUs.
Consequently, GPUs are increasingly finding their way into the world's most powerful supercomputers.
The Titan supercomputer, for example, the world's second-fastest machine, uses 18,688 NVIDIA GPUs in addition to its 299,008 CPU cores for its processing power.
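The reason GPUs suit this workload is that neural-network arithmetic is dominated by matrix multiplication, where every element of the output can be computed independently of the others. A CPU-side sketch using numpy (ordinary Python, not actual GPU code) shows the same computation written element by element, the way a single processor core would naively do it, and as one bulk operation, the data-parallel form that a GPU's thousands of cores can exploit:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((64, 32))
B = rng.standard_normal((32, 48))

# Element by element: each output entry computed in sequence.
C_loop = np.zeros((64, 48))
for i in range(64):
    for j in range(48):
        for k in range(32):
            C_loop[i, j] += A[i, k] * B[k, j]

# One bulk matrix multiplication: the data-parallel formulation,
# in which all 64 x 48 outputs are independent of one another.
C_bulk = A @ B

# Both forms compute the same result.
assert np.allclose(C_loop, C_bulk)
```

The two forms are mathematically identical; the difference is that the bulk form exposes the independence between output elements, which is exactly what parallel hardware needs.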
The significance of the Stanford announcement is that the researchers achieved a big leap in computing power using a relatively small number of commodity, off-the-shelf computers. They did so by solving a number of technical challenges in both software and hardware.
The end result is that this configuration of processors, software and communication technologies will give more people access to this sort of computing power, for both research and commercial purposes.
The machine learning techniques demonstrated in these experiments can also be applied to tasks such as speech recognition and natural language processing, the kind of technology that drives personal digital assistants such as Google's and Apple's.
Google has used a variant of neural networks, "deep learning", to achieve large improvements in its speech recognition.
A high-performance computer of this sort could also greatly accelerate password cracking, or sift through enormous amounts of data looking for connections and matches.
This is the sort of thing secret service agencies do when combing through communications gathered by programs such as the NSA's PRISM.
For companies taking advantage of the masses of data they collect through their everyday operations, the ability to apply commodity high-performance computing to this big data will prove invaluable.
The artificial neural network created by the Stanford researchers is still only a fraction of the size of a real human brain, which is estimated to have about 84 billion neurons and between 84 and 100 trillion connections.
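The gap can be made concrete with back-of-the-envelope arithmetic using the figures above:

```python
simulated = 11.2e9                      # connections in the Stanford network
brain_low, brain_high = 84e12, 100e12   # estimated human-brain connections

# Even against the lower estimate, the simulated network covers only
# about a hundredth of a percent of the brain's connections.
fraction = simulated / brain_low
print(f"{fraction:.4%}")  # 0.0133%
```

In other words, the largest artificial network reported here is roughly ten thousand times smaller than the brain it loosely imitates.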
Research into machine learning, however, continues to shed light on how the brain itself learns, moving us closer to true artificial intelligence.
For now, being able to recognise cats is a significant step.