


Single-Photon Quantum Computer Chips Are Scaling Up

A new "flip chip" technique slashes the error rate in optical quantum chip manufacturing.
Detector layered above light channel (black line). Image: MIT/Nature

A team of MIT engineers has developed a technique for creating integrated chip-mounted arrays of light detectors with single-photon sensitivity. Moreover, these sensors can be mounted on regular old silicon computer chips using regular old manufacturing processes, opening yet another door in the long hallway toward practical quantum computing.

To put the achievement into perspective, we need to look at what a photon even is. Sure, it's a particle of light, but it's not a "particle of" like anything else. In a sense, it's the particle; with no mass, it becomes the fastest thing in the universe, setting the cosmic speed limit that governs space and time itself. A photon has no size in any sense that we normally think of "size" (though it's a fraught question), nor does it have most of the properties we usually associate with subatomic particles.


Photons have energy, but are fundamentally neutral. They carry no charge themselves, yet they are what charges use to interact: photons are gauge bosons, force carriers, and the force they carry is electromagnetism. So, the task is this: detecting a single one of these particles on a computer chip in such a way that it might be used for information processing, particularly quantum information processing (or just quantum computing).

Photons, compared to other particles, are relatively stable when entangled with other photons. Quantum entanglement is the phenomenon in which multiple distinct particles share a single quantum state, so that none of them can be fully described on its own. Add this behavior to the concept of quantum bits, or qubits, in which a particle can occupy many states at once, and we have unimaginable, change-everything parallel computing potential.
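The "many states at once" claim has a concrete mathematical form. As a rough illustration (not from the article), the state of n qubits is a vector of 2^n complex amplitudes, which is why each added qubit doubles the computational state space; the Bell state below is a standard textbook example of entanglement, where two qubits only have a joint description:

```python
# Sketch of qubit state-space growth and a simple entangled state.
# Illustrative only; names here are our own, not from the paper.
import numpy as np

def n_qubit_state_size(n: int) -> int:
    """Number of complex amplitudes needed to describe n qubits."""
    return 2 ** n

# A single qubit in an equal superposition of |0> and |1>:
qubit = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Two entangled qubits (a Bell state): all the amplitude sits on
# |00> and |11>, so neither qubit has a definite state of its own.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

for n in (1, 2, 10, 50):
    print(f"{n} qubits -> {n_qubit_state_size(n)} amplitudes")
```

Even 50 qubits already require about 10^15 amplitudes to describe classically, which is the scaling behind the "change-everything" potential.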

Image: MIT/Nature

"Because ultimately one will want to make such optical processors with maybe tens or hundreds of photonic qubits, it becomes unwieldy to do this using traditional optical components," said Dirk Englund, coauthor of a new paper in Nature Communications describing the detection scheme, in a statement.

"It's not only unwieldy but probably impossible, because if you tried to build it on a large optical table, simply the random motion of the table would cause noise on these optical states. So there's been an effort to miniaturize these optical circuits onto photonic integrated circuits."



Currently, photon detection involves a bit of dumb luck. The hardware is temperamental: for every 100 detectors deposited onto a silicon chip, usually only a handful will actually work. At nanoscales, it doesn't take much of a defect to sink the whole thing.

"The central challenge when building systems with multiple [single-photon detectors] remains the low fabrication yield," the paper explains, "which is limited by defects at the nanoscale. This yield problem is exacerbated when such detectors are integrated onto photonic chips, which can require tens of additional fabrication steps of their own."

What Englund and his coauthors were able to do is come up with a process in which testing and fabrication can be done separately. That sounds like a kind of "duh" proposition (test a thing before installing it), but at these scales, components aren't built so much as grown, as is the case with lots of nanotechnology, so swapping bits and pieces around isn't such a feasible way of doing things. Lego, this is not.

So, the technique is basically to make the sensor array on another silicon substrate and then transfer it over using a special Silly Putty-like material. You just smoosh the transfer medium over the first substrate and then smoosh it back down onto the final one, leaving the tested and relatively defect-free array of sensors. This is the "flip-chip" process.

Flip-chip gives a success rate of about 20 percent (20 percent of the transferred sensors worked), which sounds pretty weak, but previous attempts have only managed about 0.2 percent. Even techniques that involve tediously and inefficiently installing the sensors one by one max out at 4 percent. Ninety percent is the goal for practical quantum computing, but 20 percent at least puts that target within sight. Crucially, the new technique is scalable.
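To see why those per-detector percentages matter so much for scaling, consider a back-of-the-envelope model (our illustration, not the paper's analysis): if each detector works independently with probability p, a chip that needs all N of its detectors working succeeds with probability p^N, so small gains in per-device yield compound dramatically:

```python
# Hypothetical yield model: assumes each detector fails independently,
# and a chip needs every one of its n detectors to work.
def array_yield(p: float, n: int) -> float:
    """Chance that all n independently fabricated detectors work."""
    return p ** n

# Per-detector yields mentioned in the article:
yields = [
    ("previous attempts (0.2%)", 0.002),
    ("one-by-one placement (4%)", 0.04),
    ("flip-chip (20%)", 0.20),
    ("quantum computing goal (90%)", 0.90),
]

for label, p in yields:
    print(f"{label}: 10-detector chip yield = {array_yield(p, 10):.2e}")
```

Under this toy model, a 10-detector chip at 0.2 percent per-device yield is essentially impossible to fabricate, while at 90 percent roughly a third of chips would come out fully working; that gap is the difference between a lab curiosity and a manufacturable processor.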

Finally, the MIT group concludes: "This demonstration opens the door to fully integrated, high-performance photonic processors for quantum information science." No less.