Researchers Are One Step Closer to Mind Reading in Real Time

Analyzing brain waves transmitted via electrodes lets researchers see what a person is perceiving.
This image shows the broadband response (black line) from an electrode (blue disk) in the temporal lobe, while patients were shown images of faces (blue bars) and houses (pink) in 400ms flashes. Image: Kai Miller/Stanford University

Researchers have moved a step closer to reading people's minds.

In a study published Thursday in the journal PLOS Computational Biology, researchers from the US used computational software to decode brain signals and predict what their subjects were seeing in real time.

"Researchers have been able to decode images that you see for something like 15 years with some degree of accuracy, but they always told their algorithms ahead of time when their test subjects were shown some pictures," said Kai Miller, paper lead author and a neuroscientist at Stanford University, over the phone.

"Our technique is really new in the sense that it can read from the brain signals in real time, without information on how often it should be looking for some perceptual event. It decodes the brain signals continuously," Miller added.

For their study, the researchers worked with seven epilepsy patients who'd had electrodes implanted in multiple locations in their temporal lobes. Subjects were shown images of human faces, houses, and blank gray-scale images in brief 400-millisecond flashes, and were told to watch out for an image of an upside-down house. The electrodes were connected to software that digitized the brain signals 1,000 times per second and extracted two different types of brain signals: event-related potentials (generated when neurons are acting in sync) and broadband spectral changes (generated when neurons are acting out of sync).
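
A rough illustration of those two signal types, not the authors' code: the sketch below assumes the 1,000 Hz sampling rate mentioned above, uses synthetic data in place of real recordings, and picks an assumed 65-175 Hz band for the broadband measure, to show how an event-related potential and a broadband power response could be computed for a single electrode.

```python
# Illustrative sketch only: extract the two signal types described above
# from one electrode's recording, here faked with random numbers.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000                      # samples per second, as in the study
n_trials, trial_len = 50, 400  # 400 ms flashes -> 400 samples per trial

rng = np.random.default_rng(0)
trials = rng.standard_normal((n_trials, trial_len))  # stand-in voltage traces

# Event-related potential: average the raw voltage across trials, so only
# activity time-locked to the stimulus (neurons acting in sync) survives.
erp = trials.mean(axis=0)

# Broadband spectral change: band-pass well above the ERP range (65-175 Hz is
# an assumption, not a figure from the paper), then take the analytic amplitude
# as a proxy for asynchronous population activity.
b, a = butter(3, [65 / (fs / 2), 175 / (fs / 2)], btype="band")
high_freq = filtfilt(b, a, trials, axis=1)
broadband = (np.abs(hilbert(high_freq, axis=1)) ** 2).mean(axis=0)
```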

"With appropriate [brain] recordings you could learn extremely fine details of a person."

The software analyzed the brain signals to determine which electrode locations responded when a subject saw a particular image, and which combination of electrode locations and signal types correlated best with what each subject actually saw in real time.
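
One way to picture that selection step, as a hedged sketch rather than the paper's actual method: treat each electrode/signal-type combination as one feature column, score each column by how well it separates the image classes (a one-way ANOVA F-statistic is used here purely as an assumed criterion), and keep the top-ranked features.

```python
# Hedged sketch of ranking (electrode, signal type) features by how well they
# separate the image classes; the scoring criterion is an assumption.
import numpy as np
from scipy.stats import f_oneway

def rank_features(features, labels):
    """features: (n_trials, n_features) array, one column per electrode/signal-type
    combination; labels: array of class names per trial ('face', 'house', 'gray').
    Returns feature indices ordered from most to least discriminative."""
    classes = np.unique(labels)
    scores = [f_oneway(*(features[labels == c, j] for c in classes)).statistic
              for j in range(features.shape[1])]
    return np.argsort(scores)[::-1]
```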

What differentiates this study from previous research is that instead of revealing what a single neuron does, the researchers analyzed how a group of neurons works together when processing different images, in this case houses and faces.

According to Miller, the algorithm that he developed within the computational software "learns the expected pattern of brain response" from initial brain recordings gleaned when researchers showed their subjects a random sequence of 200 images composed of gray scale, houses, and faces.

Miller explained that this stage allowed them to account for the timing differences that arise as different aspects of a viewed object are processed in different brain areas, and to work out an average response for each type of image: houses, faces, and gray scale. This data was then fed to the algorithm and used to predict what would happen when subjects were presented with a further 100 images. The algorithm processed the brain signal as the person viewed a given object and, in most cases, managed to predict whether and when an image had been shown.
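
Put together, the flow Miller describes could look something like the sketch below: build an average response template per image class from the training presentations, then slide a 400-millisecond window along the continuous recording and report a detection whenever some template matches well enough. The correlation measure, the threshold, and the bookkeeping are all assumptions made for illustration, not the published algorithm.

```python
# Sketch of continuous, template-based decoding; all parameters are assumptions.
import numpy as np

def learn_templates(train_windows, train_labels):
    """train_windows: (n_trials, window_len) feature traces from the training
    images; returns a dict mapping class name -> average trace (the template)."""
    return {c: train_windows[train_labels == c].mean(axis=0)
            for c in np.unique(train_labels)}

def decode_stream(stream, templates, window_len=400, threshold=0.5):
    """Scan a continuous 1-D feature stream; report (time, class) whenever the
    correlation with some class template exceeds the threshold."""
    events = []
    for t in range(window_len, len(stream) + 1):
        window = stream[t - window_len:t]
        best_class, best_r = None, threshold
        for c, template in templates.items():
            r = np.corrcoef(window, template)[0, 1]
            if r > best_r:
                best_class, best_r = c, r
        if best_class is not None:
            events.append((t, best_class))
    return events
```

In practice, runs of consecutive detections of the same class would be merged into a single perceptual event, which is roughly what predicting whether and when an image was shown amounts to.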

In the future, the researchers want to expand their study by integrating other areas of the brain to help them predict what a person sees when presented with a more complex picture of reality: for example, the face of someone a subject knows and therefore has an emotional connection with, versus the face of a stranger.

"Nobody else has done this spontaneous type of brain decoding before so we picked a simple task—we're just showing two general classes of things (faces and houses)," explained Miller. "We hope that in the future that we could not only explore a wide variety of things that you might see, but also try to understand what happens when people compose scenes together."

They also want their research to be used to help rehabilitate people who have lost movement owing to an injury, or who have lost the power of speech.

"You can learn a lot of details about a person's intended actions or perception just by looking at their brain signals even though they can't show any behaviour," said Gerwin Schalk, paper co-author and neuroscientist at the Wadsworth Institute in New York, over the phone.

"With appropriate [brain] recordings you could learn extremely fine details of a person. For example, where are they moving, what objects do they see when they look somewhere. What are they literally thinking," he added. "This is really quite profound if you think about it."