


Scientists Make a Light Switch Controlled By Individual Photons

The optical transistor might prove useful in quantum computing.

Researchers from MIT, Harvard, and the Vienna University of Technology have worked together to develop an optical transistor that is sensitive to light at a quantum level.

Transistors are the electrical “switches” that make both information storage and logical operations possible in computers. Normally, current flows through them at either a higher or a lower voltage, giving each a value of 1 or 0. Those 1s and 0s, called bits, combine with other bits in specific patterns to encode information, which software then manipulates step by step.
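As a rough illustration of that idea (a Python sketch written for this explanation, not anything from the research itself), a transistor can be modeled as a switch that outputs 1 or 0, and two such switches can be combined into a simple logic gate:

```python
# Illustrative sketch only: a transistor modeled as a voltage-controlled switch.
# It simply shows how 1s and 0s combine into a logical operation.

HIGH, LOW = 1, 0

def transistor(gate_voltage: int) -> int:
    """Conduct (return 1) when the gate is driven high, otherwise return 0."""
    return HIGH if gate_voltage == HIGH else LOW

def and_gate(a: int, b: int) -> int:
    """Two transistors in series conduct only when both inputs are high."""
    return transistor(a) & transistor(b)

for a in (LOW, HIGH):
    for b in (LOW, HIGH):
        print(f"{a} AND {b} -> {and_gate(a, b)}")
```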


This transistor uses light instead of electricity, and it functions according to quantum mechanics. Specifically, its design takes advantage of both the wave and particle descriptions of light as it interacts with two reflective mirrors. When light passes through the two mirrors, the transistor is "on" with a value of 1; when it does not, it is "off" with a value of 0.

To achieve the "on" state, light is shot directly at the first of the two mirrors. Photons, characteristic of the particle description of light, stop at the mirror. An electromagnetic field, characteristic of the wave description of light, passes through. If the distance between the two mirrors is calibrated to the light’s wavelength, the light resonates in the chamber, building in intensity until it ultimately passes through the second mirror and travels onward.
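Though the article doesn’t spell it out, the calibration it refers to is the standard optical-cavity resonance condition: the mirror spacing L must be a whole-number multiple of half the wavelength, L = m · λ/2 (m = 1, 2, 3, …), so that successive reflections reinforce one another and the field inside the chamber builds up.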

To make the transistor "off," the researchers send a "gate photon" into the chamber obliquely, so that it disturbs the supercooled cesium atoms held inside. Arriving at that unexpected angle, the gate photon excites the cesium atoms, and their heightened energy levels prevent incoming light from fully passing through the transistor.
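A rough toy model of that switching behavior, written in Python purely for illustration (the Lorentzian line shape is textbook cavity physics; the numbers are made up, and nothing below comes from the experiment itself):

```python
# Illustrative toy model: an optical cavity transmits light on resonance and
# mostly reflects it once the resonance has been shifted. All values invented.

def cavity_transmission(detuning: float, linewidth: float = 1.0) -> float:
    """Lorentzian transmission profile: ~1 on resonance, ~0 far off resonance."""
    return linewidth**2 / (linewidth**2 + (2 * detuning) ** 2)

# Transistor "on": no gate photon, so the cavity stays resonant with the light.
print(f"on  -> transmission ~ {cavity_transmission(detuning=0.0):.2f}")

# Transistor "off": the gate photon excites the cesium and shifts the resonance
# by several linewidths, so the same incoming light is now mostly reflected.
print(f"off -> transmission ~ {cavity_transmission(detuning=5.0):.2f}")
```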

Traditional transistors have served computers well, but their limitations are increasingly felt. Today’s complex calculations require them in greater numbers, which means increased energy consumption and a consequent risk of overheating. By addressing both problems, this transistor provides a light at the end of the tunnel, so to speak. As Jelena Vuckovic, a professor of electrical engineering at Stanford University, said in MIT’s announcement of the transistor, “You don’t have to spend a lot of energy for each bit. Your bit is essentially included in a single photon.”


Most tantalizingly, the researchers believe that the transistor’s precision might make it useful in quantum computing. Quantum computing is a nascent technology that relies on a quantum phenomenon called superposition to enable dramatically faster calculations for certain problems.

Superposition holds that "tiny particles of matter can be in mutually exclusive states simultaneously." Because of superposition, quantum bits, called qubits, can exist in more than just the two states of a traditional transistor. With more possibilities in play at once, computations should become both more efficient and more thorough.
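In the standard notation (not used in the article, but helpful for picturing this), a single qubit’s state is written |ψ⟩ = α|0⟩ + β|1⟩, with |α|² + |β|² = 1; measuring it yields 0 or 1 with those probabilities. A register of n such qubits is described by 2^n amplitudes at once, which is where the hoped-for speedups come from.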

Vladan Vuletić, who led the research, describes a “superposition state of the photon being there and not being there,” which could lead to “a macroscopic superposition state of the light being transmitted and reflected.”

If that sounds just too far out, the researchers do concede that this is only a proof of principle. Several kinks must be worked out for this technology to be implemented widely. Supercooling cesium, for instance, is not very feasible on a grand scale.

Still, the field of quantum computing is coming along. In recent weeks, researchers at USC came just short of proving that the D-Wave computer performs quantum computations, though they did rule out classical annealing as its sole modus operandi.

The more quantum computing enters the mainstream, the more we’ll be able to model real-world phenomena with unprecedented thoroughness. Its contributions to fields as disparate as climate science, biology, and artificial intelligence may be enormous.