Scientists raise quantum error threshold

New design allows up to a quarter of qubits to be lost.

Researchers have devised a theoretical quantum computer that could function even if one in four quantum bits (qubits) were missing.

With scientists struggling to build devices as large as three qubits, the new method could bring future applications closer by lowering the engineering requirements of a functional machine.

University of Queensland physicist Thomas Stace worked with Sean Barrett of Imperial College London to address two quantum information issues: decoherence and loss.

The former pertained to inaccuracies in the information carried by qubits. The latter dealt with the loss of qubits themselves.

Stace explained that quantum computers that used photons - particles of light - as qubits risked losing some of these particles as they were scattered or absorbed.

Some researchers had devised methods that could tolerate the loss of one in two qubits. Other schemes allowed for decoherence in one in a hundred qubits.

But until now, none tolerated both decoherence and loss to any great degree. Stace said the next most tolerant method, by Queensland physicists Michael Nielsen, Christopher Dawson and Henry Hasselgrove, tolerated 0.1 percent loss and 0.01 percent decoherence.

Stace and Barrett's method, detailed in this week's Physical Review Letters, was based on the work of the University of British Columbia's Robert Raussendorf.

While traditional machines manipulated bits sequentially, using a series of logic gates, Stace and colleagues suggested that quantum computations be performed by measuring qubits initially laid out in a complex pattern.

"As you measure a quantum state, you change it," he explained, referring to the Heisenberg Uncertainty Principle in quantum mechanics.

The universal initial state involved sets of entangled qubits arranged in a pattern that depended on the type of particle used as qubits - electrons, ions or photons.

Qubits would then be measured in an order defined by what a user wanted to achieve. Researchers already had a method to directly map these measurements to traditional logical operations, Stace said.
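
As a loose illustration of this measurement-driven approach, the Python sketch below simulates the smallest possible cluster-state computation: two qubits entangled by a controlled-Z gate, with the first then measured in the X basis. The measurement leaves the second qubit holding the input state transformed by a Hadamard gate, up to an outcome-dependent Pauli correction. The state values and variable names are purely illustrative and are not drawn from Stace and Barrett's paper.

```python
import numpy as np

# Single-qubit states and gates
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)
plus = (zero + one) / np.sqrt(2)
minus = (zero - one) / np.sqrt(2)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)

# Illustrative input state |psi> = a|0> + b|1>
a, b = 0.6, 0.8
psi = a * zero + b * one

# Two-qubit cluster state: entangle |psi> with |+> via controlled-Z
CZ = np.diag([1, 1, 1, -1]).astype(complex)
state = CZ @ np.kron(psi, plus)

# Measuring qubit 1 in the X basis drives the computation: qubit 2
# ends up holding H|psi>, after an outcome-dependent X correction.
for m, outcome in enumerate([plus, minus]):
    projector = np.kron(outcome.conj(), np.eye(2))   # project qubit 1
    remaining = projector @ state
    remaining = remaining / np.linalg.norm(remaining)
    corrected = np.linalg.matrix_power(X, m) @ remaining
    print(f"outcome {m}: qubit 2 holds H|psi>?", np.allclose(corrected, H @ psi))
```

Larger cluster states extend the same idea: each measurement, chosen adaptively in light of earlier outcomes, enacts one step of the computation.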

Up to one in four measurements could fail, thanks to an error-correcting code that used the context of the remaining qubits to decipher the information in those that had been lost.
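
A rough classical analogy (and only an analogy - not the authors' topological quantum code) is erasure correction with a parity check: in a four-bit block of known even parity, the three surviving bits pin down the value of one lost bit, just as measurements on surviving qubits constrain the information in lost ones. The sketch below is hypothetical.

```python
import random

def recover(block):
    """Reconstruct at most one erased bit (None) in a four-bit
    block of even parity from the surviving bits."""
    if block.count(None) > 1:
        raise ValueError("a single parity check cannot fix two erasures")
    parity = sum(bit for bit in block if bit is not None) % 2
    # Valid blocks have even parity, so the erased bit must restore it
    return [parity if bit is None else bit for bit in block]

codeword = [1, 0, 1, 0]                  # even-parity block
lost = random.randrange(4)               # erase one bit in four
damaged = [None if i == lost else bit for i, bit in enumerate(codeword)]
print(recover(damaged) == codeword)      # True: the lost bit is recovered
```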

And because operations hinged predominantly on the initial, entangled state, the system had fewer points of failure than most quantum computing models, and was thereby more robust.

Stace described it as a "divide and conquer" approach that could be easily restarted if too many measurements failed.

The researchers have discussed the method's potential with experimentalists from Yale and Sydney University.

But experiments with large-scale devices were still "easily a decade" away due to engineering difficulties, since even "elementary demonstrations" required "several tens" of qubits, Stace said.

"You could do some proof of principle with 20 qubits," he mused, noting that this may be sufficient for small, simple devices that could act as signal repeaters in quantum key distribution networks.

"I wouldn't be surprised if that happens in the next three to five years," he said.
