This 'brain-to-text' device could help make the internet of brains possible.
When Stephen Hawking talks, a laser pointed at his face detects whether he's moving one of the few muscles he still controls while a computer cycles through individual letters of the alphabet, determining what he's trying to say. It's an excruciatingly slow process, and it can take several minutes for him to say anything at all. What if, instead, a computer could simply read his mind?
Scientists in Albany, New York, have just demonstrated for the first time that it's possible to turn a person's thoughts into a legible phrase using what they're calling a "brain-to-text" interface.
To be clear, these are the very early days of mind reading, if you want to call it that. The Albany study, carried out by researchers at the National Center for Adaptive Neurotechnologies and the State University of New York at Albany, took place in a very controlled environment and had to employ a couple of tricks to get the job done.
Here's the good news: Researchers' computers were able to decipher seven patients' "silent speech"—their unspoken thoughts—and translate the output as text.
And here's why you can't go grab a brain-to-text device and hook it up to the internet just yet: For one, the patients' skulls were split open and electrode sheets were attached directly to their brains. They were also asked to read aloud from various texts (the Gettysburg Address, JFK's inaugural address, Charmed fan fiction, and a children's story) to get a baseline of what their brains were doing while they were speaking. Finally, the "dictionary" of the brainwave recognition device was limited—it wasn't selecting words from everything in the English language.
Nonetheless, Peter Brunner and his colleagues have accomplished something pretty amazing, and he does say that internet-connected brains could be coming someday. For now, brainwaves are regularly read by machines, but interpreting their meaning remains difficult.
"I liken it to having a helicopter over a crowd of cheering people. If you're near them, you can hear them cheering, but you can't hear individual people. If you put a microphone on one or two people, you're able to hear them, but you can't hear the overall picture," Brunner told me. "But if you put electrodes all over the surface of the brain—giving microphones to groups of people cheering—you can figure out what those groups are cheering for."
Of course, it's not terribly easy to find people willing to have their skulls split open to try out a mind-reading device. But it turns out that there's already a clinical use for such technology. People with severe epilepsy will often wear a sheet of electrodes on their brain to determine which part of it is being overloaded during a seizure, so further treatment can take place. They wear these electrodes while they're in the hospital, waiting for a seizure to occur.
But in the meantime, there's no reason why other experiments can't take place (with their consent). Brunner said that his team "piggybacked" on their treatment.
"We were wondering if you could infer what someone was saying based on the signals sent from the surface of their brain—whether those signals are sent when they're speaking out loud or silently," Brunner, who published his research in Frontiers in Neuroscience, said. "It turns out you can, to a degree that's much better than chance."
Brunner says the research was limited by his time with the patients and was also limited by their conditions: Each patient had the electrodes placed on different regions of their brain, depending on the part expected to be causing seizures. With more time to "train" the interface and more targeted electrode placement (he believes that the superior temporal gyrus, in the temporal lobe of the brain, would be ideal), brain-to-text interfaces could become much better. He also hopes to make interfaces that don't need to be placed directly on the brain.
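To get a rough feel for what "better than chance" decoding over a limited dictionary means, here's a deliberately simplified sketch in Python. Everything in it is invented for illustration—the vocabulary, the 16-number "signatures," the noise level—and it bears no resemblance to the actual signal processing in the Albany study, which worked on real ECoG recordings. It only shows the general shape of the idea: learn a template per word in a small, fixed vocabulary, then match each noisy recording to its nearest template.

```python
import random

# Toy illustration only — NOT the study's actual decoding pipeline.
# The study's decoder likewise chose from a limited dictionary rather
# than all of English; that constraint is what makes the matching
# problem tractable.

random.seed(0)

VOCAB = ["four", "score", "and", "seven", "years"]  # tiny fixed dictionary
N_FEATURES = 16  # pretend each word evokes a 16-number brainwave "signature"

# Hypothetical ground-truth signature for each word in the vocabulary.
signatures = {w: [random.gauss(0, 1) for _ in range(N_FEATURES)] for w in VOCAB}

def record(word, noise=0.8):
    """Simulate one noisy recording of the signature for `word`."""
    return [x + random.gauss(0, noise) for x in signatures[word]]

def decode(signal):
    """Nearest-template decoding: return the vocabulary word whose
    signature is closest (squared Euclidean distance) to the signal."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(VOCAB, key=lambda w: dist(signal, signatures[w]))

# Decode 100 noisy trials; pure chance over 5 words would be 20%.
trials = [random.choice(VOCAB) for _ in range(100)]
correct = sum(decode(record(w)) == w for w in trials)
print(f"accuracy: {correct}%  (chance would be 20%)")
```

With distinct templates and moderate noise, this toy decoder scores far above the 20% chance floor—which is the same kind of benchmark Brunner uses when he says the real system decodes "much better than chance."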
"This could be relevant for people who suffer from ALS—their muscles don't work but there's no effect on the brain," he said. "You would probably want to have a person train the interface before they're fully locked in."
Brunner's device wasn't connected to the internet, but he said there's no reason why it couldn't be, which would allow real-time brain-connected communication. Maybe that whole internet of brains isn't actually so far off.