In a big step forward for quantum applications, big data could be used to fix quantum systems before they break.
Image: Erkan Utu/Pexels
A team of physicists at the University of Sydney has successfully demonstrated that big data and machine learning can be used to accurately predict the future of a quantum system. These predictions enabled the researchers to take corrective actions that prevented the quantum system from breaking down, a significant step toward practical applications of quantum technology.
In a classical (that is, non-quantum) system, predicting future behavior is a relatively straightforward affair because a particle occupies a single, specific position at any given time. In a quantum system, by contrast, a particle can occupy two different positions at the same time, a property known as superposition. Quantum computers leverage superposition to work on many different problems simultaneously, but the property also presents a host of technical difficulties that make it hard to exploit in any practical sense.
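As a rough illustration (not part of the Sydney experiment), a superposition can be written as a vector of amplitudes, and measurement picks a single outcome with probability given by the squared amplitudes:

```python
import numpy as np

# A qubit in an equal superposition of |0> and |1>:
# amplitude 1/sqrt(2) for each, so each outcome has probability 0.5.
state = np.array([1.0, 1.0]) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2

# Measuring collapses the superposition to one definite outcome.
rng = np.random.default_rng(0)
outcome = rng.choice([0, 1], p=probs)
print(probs, outcome)
```

Before measurement the state genuinely carries both amplitudes; afterward, only a single classical value remains, which is why measurement alone destroys the quantum behavior the article describes next.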
The primary difficulty is that superposition is incredibly difficult to maintain for any appreciable length of time. Simply measuring a quantum system causes it to collapse into a discrete state (so that the particle occupies one position or the other, instead of both at once). And because it is impossible to completely isolate a quantum system from its environment, even an unmeasured system will inevitably decay and lose its quantum properties as a result of environmental interference.
The decay of a quantum system is called decoherence, and it has the effect of entirely randomizing the quantum state, making it useless for practical purposes.
"Much the way the individual components in mobile phones will eventually fail, so too do quantum systems," said Michael J. Biercuk, an experimental physicist at the University of Sydney. "But in quantum technology the lifetime is generally measured in fractions of a second, rather than years."
The trick to making a better quantum system, one that can be harnessed for practical uses in quantum computers, is to figure out how to correct for environmental interference before decoherence happens. It's a bit like a phone that could monitor the wear and tear on its own parts and know in advance when it was finally going to break. Unlike the parts of a phone, however, which wear out according to regular, known laws, quantum systems decohere randomly.
"Humans routinely employ predictive techniques in our daily experience; for instance, when we play tennis we predict where the ball will end up based on observations of the airborne ball," Biercuk said. "But what if the rules changed randomly while the ball was on its way to you? In that case it's next to impossible to predict the future behavior of that ball. This situation is exactly what we had to deal with because the disintegration of quantum systems is random."
In other words, the researchers just had to guess when and how the system would decay, sort of like walking onto a tennis court blindfolded and starting to swing, hoping you'll hit a ball eventually.
For a normal human, this task would be impossible due to the sheer number of possible ways for the system to decohere. But thanks to machine learning, which can sift through unfathomably large data sets in a fraction of the time a human would need and make predictions based on that data, the team was able to accurately predict and prevent the decoherence of quantum states.
For their quantum model, the researchers used ions of the element ytterbium trapped in what Biercuk described to Motherboard as an "electromagnetic bottle." According to Biercuk, this device could hold small collections of the ions in a fixed position for periods ranging from several hours to several days.
"Ytterbium ions are an ultraclean quantum system allowing us to carefully test how our approach performs," Biercuk told Motherboard in an email. "They are the most advanced qubit technology, decades ahead of competing technologies such as semiconductor spins. Our work is part of a global community of ion trapping research at the cutting edge of the field."
These ions represented qubits, the quantum equivalent of the smallest unit of information in classical computing, a bit. Unlike a normal bit, which can either be a 1 or a 0, a quantum bit can be both at the same time. To train the machine learning algorithm used in the experiment, Biercuk and his colleagues fed it time-stamped measurements of qubits as they decohered.
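The paper's actual algorithm is more sophisticated, but the idea of learning from time-stamped measurements can be sketched with a toy autoregressive predictor: fit a model that forecasts the next noise sample from the last few (the data and model here are illustrative assumptions, not the team's method):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "decoherence record": a slowly drifting signal plus measurement noise,
# standing in for time-stamped qubit measurements taken as the system decays.
t = np.arange(500)
record = np.sin(0.05 * t) + 0.1 * rng.standard_normal(500)

# Build training pairs: predict each sample from the k samples before it.
k = 5
X = np.stack([record[i : i + k] for i in range(len(record) - k)])
y = record[k:]

# Fit a simple linear (autoregressive) predictor by least squares.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Forecast the next value from the most recent window of measurements.
prediction = record[-k:] @ coef
print(prediction)
```

Even though each individual sample looks noisy, the correlations across time carry enough structure for the fitted model to anticipate where the signal is headed, which mirrors the article's point that the seemingly random decay record was in fact predictable.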
What appeared to be purely random behavior as the system decayed actually contained enough information to allow the computer program to predict how the system would decay without observing it. Indeed, the predictions were so accurate that Biercuk and his colleagues were able to take corrective action and compensate for the decay, maintaining the system's quantum properties two to three times longer than it would otherwise have lasted.
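In spirit, this kind of correction is feedforward: predict the slow, learnable part of the error and apply an equal and opposite control before it accumulates. A cartoon of that idea (illustrative numbers only, not the team's control scheme):

```python
import numpy as np

rng = np.random.default_rng(2)

# Error on a qubit: a predictable slow drift plus fast, unpredictable noise.
t = np.arange(200)
drift = 0.02 * t                          # slow component a predictor can learn
fast = 0.05 * rng.standard_normal(200)    # fast component it cannot
error = drift + fast

# Feedforward correction: subtract the predicted drift before it does damage.
predicted_drift = 0.02 * t                # assume the predictor learned the trend
residual = error - predicted_drift

# Only the small unpredictable part remains, so coherence survives longer.
print(np.abs(error).max(), np.abs(residual).max())
```

The residual error is dominated by the fast noise alone, which is far smaller than the uncorrected drift; in the same way, cancelling the predictable part of the decoherence let the Sydney team stretch the system's useful lifetime.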
"We know that building real quantum technologies will require major advances in our ability to control and stabilise qubits," said Biercuk. "Our techniques apply to any qubit, built in any technology. We're excited to be developing new capabilities that turn quantum systems from novelties into useful technologies. The quantum future is looking better all the time."