Google Claims Its D-Wave Quantum Computer Is the Real Deal—Sort Of

A study posted this week shows the two-year-old machine beating out a classical computer 100 million times over.

On Tuesday night, a team of computer scientists and physicists from Google posted a study to the arXiv open-access server containing a very bold claim: The quantum computer Google purchased from Canadian start-up D-Wave Systems in 2013 is in fact performing quantum computations as advertised.

Why shouldn't it be? Fair question. The D-Wave computer has been a source of controversy for years now, with many if not most quantum computing experts disputing D-Wave's claims of having achieved true quantum computing. Complicating matters further, the D-Wave machine is designed to carry out only one particular subset of quantum computing, known as quantum annealing, which makes it quite unlike the universal machines (universal in the sense that they can perform any computation a classical computer can) being worked on in laboratories around the globe.

In other words, it's not a universal computer or really anything close to it. It solves one sort of problem.

The Google paper hasn't been peer reviewed or otherwise accepted for publication in a proper journal, but Google obviously isn't your run-of-the-mill open-access crank either. The new results are based on head-to-head comparisons between the D-Wave machine and a classical single-core computer simulating the annealing functions of its quantum peer. The researchers found a speed-up by a factor of 10^8 (100 million) with the D-Wave computer, which is a whole lot of zeroes.

Quantum annealing is a way of finding optimal solutions for problems with a bunch of variables. The basic idea is that you start out with every possible solution together in a single superposed state and then watch as the system settles into lower and lower energy states, each one a new local minimum. Because quantum physics is really weird, it's possible for the system to tunnel out of its current local minimum and possibly find one of even lower energy thanks to natural random fluctuations.

In this way, a system can settle into a state that minimizes a cost function over its variables. This is hardly trivial: optimization algorithms aren't cheap, and artificial intelligence and machine learning schemes are mostly built on really big optimizations involving lots of variables. The benchmarking performed in the new paper is based on problems with roughly 1,000 binary variables.
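To give a feel for the kind of optimization being benchmarked, here's a minimal sketch of classical simulated annealing, the thermal cousin of quantum annealing, run over a handful of binary variables. The Ising-style cost function, the temperature schedule, and the tiny problem size are all illustrative assumptions for the sake of the sketch, not anything taken from the Google paper or the D-Wave hardware.

```python
import math
import random

def ising_energy(spins, couplings):
    """Energy of a toy Ising-style cost function: sum of J_ij * s_i * s_j."""
    return sum(J * spins[i] * spins[j] for (i, j), J in couplings.items())

def simulated_anneal(n, couplings, steps=10000, t_start=5.0, t_end=0.01):
    """Classical simulated annealing over n binary (+1/-1) variables."""
    spins = [random.choice([-1, 1]) for _ in range(n)]
    energy = ising_energy(spins, couplings)
    for step in range(steps):
        # Temperature schedule: hot early (explore widely), cold late (settle).
        t = t_start * (t_end / t_start) ** (step / steps)
        i = random.randrange(n)
        spins[i] *= -1  # propose flipping one variable
        new_energy = ising_energy(spins, couplings)
        delta = new_energy - energy
        if delta <= 0 or random.random() < math.exp(-delta / t):
            energy = new_energy  # accept the move (always if it lowers energy)
        else:
            spins[i] *= -1  # reject: flip the variable back
    return spins, energy

# Toy example: 8 variables with random couplings (the paper's problems use ~1,000).
n = 8
couplings = {(i, j): random.uniform(-1, 1) for i in range(n) for j in range(i + 1, n)}
best_spins, best_energy = simulated_anneal(n, couplings)
print(best_spins, best_energy)
```

The quantum version swaps the thermal hops for tunneling through energy barriers, but the shape of the problem, driving an energy over binary variables as low as possible, is the same.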

Neat. But there is one big problem: The D-Wave machine was tested only against a classical computer simulating the same quantum algorithm. Pit it against other classical optimization algorithms, and things start to look a whole lot different, as the Google researchers note: "Based on the results presented here, one cannot claim a quantum speedup for D-Wave 2X, as this would require that the quantum processor in question outperforms the best known classical algorithm."

Quantum annealing may then remain more a cool idea than a truly practical one, and, as for general-purpose quantum computing, physicists and engineers are still wrangling with the bare fundamentals. This fall's big news was the first successful implementation of a two-qubit quantum logic gate on a silicon substrate, which isn't all that far from the drawing board.