How Hackers Could Get Inside Your Head With ‘Brain Malware’
Image: Shaye Anderson

Brain-computer interfaces offer new applications for our brain signals—and a new vector for security and privacy violations.

Hackers have spyware in your mind. You're minding your business, playing a game or scrolling through social media, and all the while they're gathering your most private information directly from your brain signals. Your likes and dislikes. Your political preferences. Your sexuality. Your PIN.

It's a futuristic scenario, but not that futuristic. The idea of securing our thoughts is a real concern with the introduction of brain-computer interfaces—devices controlled by brain signals, such as those measured by EEG (electroencephalography), which are already used in medical scenarios and, increasingly, in non-medical applications such as gaming.

Researchers at the University of Washington in Seattle say that we need to act fast to implement a privacy and security framework to prevent our brain signals from being used against us before the technology really takes off.

"There's actually very little time," said electrical engineer Howard Chizeck over Skype. "If we don't address this quickly, it'll be too late."

I first met Chizeck and fellow engineer Tamara Bonaci when I visited the University of Washington Biorobotics Lab to check out their work on hacking teleoperated surgical robots. While I was there, they showed me some other hacking research they were working on, including how they could use a brain-computer interface (BCI), coupled with subliminal messaging in a videogame, to extract private information about an individual.

Tamara Bonaci (right) and the author in the University of Washington Biorobotics Lab. Image: Motherboard

Bonaci showed me how it would work. She placed a BCI on my head—which looked like a shower cap covered in electrodes—and sat me in front of a computer to play Flappy Whale, a simple platform game based on the addictive Flappy Bird. All I had to do was guide a flopping blue whale through the on-screen course using the keyboard arrow keys. But as I happily played, trying to increase my dismal top score, something unusual happened. The logos for American banks started appearing: Chase, Citibank, Wells Fargo—each flickering in the top-right of the screen for just milliseconds before disappearing again. Blink and you'd miss them.

The idea is simple: Hackers could insert images like these into a dodgy game or app and record your brain's unintentional response to them through the BCI, perhaps gaining insight into which brands you're familiar with—in this case, say, which bank you bank with—or which images you have a strong reaction to.

Bonaci's team has several different Flappy Whale demos, using logos from local coffee houses and fast food chains, for instance. You might not care who knows your weak spot for Kentucky Fried Chicken, but you can see where it's going: Imagine if these "subliminal" images showed politicians, or religious icons, or sexual images of men and women. Personal information gleaned this way could potentially be used for embarrassment, coercion, or manipulation.
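
To make the attack concrete, here is a rough sketch of how a compromised game loop might flash stimuli and keep the timing data an attacker would need. The logo filenames, flash duration, and rendering callbacks are hypothetical stand-ins, not the researchers' actual demo code.

```python
import random
import time

# Hypothetical sketch (not the researchers' demo code): a compromised game loop
# that briefly flashes brand logos and logs their onset times, so each flash can
# later be lined up against the recorded EEG stream.

LOGOS = ["bank_a.png", "bank_b.png", "bank_c.png"]   # placeholder stimulus images
FLASH_MS = 30            # on screen for tens of milliseconds; blink and you miss it
stimulus_log = []        # (onset timestamp, logo) pairs the attacker keeps

def maybe_flash_logo(draw_image, clear_image):
    """Occasionally flash a logo in a corner of the screen during normal gameplay."""
    if random.random() < 0.05:                       # small chance on any given frame
        logo = random.choice(LOGOS)
        stimulus_log.append((time.time(), logo))     # record onset for EEG alignment
        draw_image(logo)                             # game's own rendering call (assumed)
        time.sleep(FLASH_MS / 1000.0)                # simplification: a real game would use frame timing
        clear_image()
```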

The 'Flappy Whale' game. Images appeared in the top right. Image: Motherboard

"Broadly speaking, the problem with brain-computer interfaces is that, with most of the devices these days, when you're picking up electric signals to control an application… the application is not only getting access to the useful piece of EEG needed to control that app; it's also getting access to the whole EEG," explained Bonaci. "And that whole EEG signal contains rich information about us as persons."

And it's not just stereotypical black hat hackers who could take advantage. "You could see police misusing it, or governments—if you show clear evidence of supporting the opposition or being involved in something deemed illegal," suggested Chizeck. "This is kind of like a remote lie detector; a thought detector."

Of course, it's not as simple as "mind reading." We don't understand the brain well enough to match signals like this with straightforward meaning. But with careful engineering, Bonaci said that preliminary findings showed it was possible to pick up on people's preferences this way (their experiments are still ongoing).

"It's been known in neuroscience for a while now that if a person has a strong emotional response to one of the presented stimuli, then on average 300 milliseconds after they saw a stimulus there is going to be a positive peak hidden within their EEG signal," she said.

The catch: You can't tell what the emotional response was, such as whether it was positive or negative. "But with smartly placed stimuli, you could show people different combinations and play the '20 Questions' game, in a way," said Bonaci.

When I played the Flappy Whale game, the same logos appeared over and over again, which would provide more data about a subject's response to each image and allow the researchers to better discern any patterns.
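
As a rough illustration of why repetition helps, the sketch below averages EEG epochs time-locked to each stimulus onset and measures the amplitude around 300 milliseconds. The sampling rate, window length, and single-channel handling are assumptions made for the example, not details from the Washington team's experiments.

```python
import numpy as np

# Illustrative sketch only, not the team's analysis code. Assumes a single-channel
# EEG recording `eeg` sampled at `fs` Hz, plus the stimulus onset times (in seconds)
# that a logged Flappy Whale session might produce.

def average_evoked_response(eeg, fs, onsets, window_s=0.8):
    """Average EEG epochs time-locked to each stimulus onset."""
    n = int(window_s * fs)
    epochs = []
    for t in onsets:
        start = int(t * fs)
        if start + n <= len(eeg):
            epoch = eeg[start:start + n]
            epochs.append(epoch - epoch[0])          # crude baseline correction
    return np.mean(epochs, axis=0)                   # noise averages out; the evoked peak remains

def p300_like_amplitude(evoked, fs):
    """Peak amplitude around 300 ms post-stimulus, where a strong reaction shows up."""
    lo, hi = int(0.25 * fs), int(0.45 * fs)
    return float(evoked[lo:hi].max())

# Comparing this amplitude across different logos is, in effect, the '20 Questions'
# game: the stimuli that produce the biggest peaks are the ones the wearer recognises.
```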

"One of the cool things is that when you see something you expect, or you see something you don't expect, there's a response—a slightly different response," said Chizeck. "So if you have a fast enough computer connection and you can track those things, then over time you learn a lot about a person."

A brain-computer interface. Image: Motherboard

How likely is it that someone would use a BCI as an attack vector? Chizeck and Bonaci think that the BCI tech itself could easily take off very quickly, especially based on the recent sudden adoption of other technologies when incorporated into popular applications—think augmented reality being flung into the mainstream by Pokémon Go.

BCIs have already been touted in gaming, either as a novel controller or to add new functionality such as monitoring stress levels. It's clear that the ability to "read" someone's brain signals could also be used for other consumer applications: Chizeck painted a future where you could watch a horror film and see it change in response to your brain signals, like a thought-activated choose-your-own-adventure story. Or imagine porn that changes according to what gets your mind racing.

"The problem is, even if someone puts out an application with the best of intentions and there's nothing nefarious about it, someone else can then come and modify it," said Chizeck.

In the Flappy Whale scenario, the researchers imagine that a BCI user might download a game from an app store without realising it has this kind of subliminal messaging in it; it'd be like "brain malware." Chizeck pointed out that many fake, malware-laden Pokémon-themed apps appeared in the app store around the real game's release.

But hacking aside, Bonaci and Chizeck argued that the biggest misuse of BCI tech could in fact be advertising, which could pose a threat to users' privacy as opposed to their security.

"Once you put electrodes on people's heads, it's feasible"

You could see BCIs as the ultimate in targeted advertising: a direct line to consumers' brains. If you wore a BCI while browsing the web or playing a game, advertisers could potentially serve ads based on your response to items you see. Respond well to that picture of a burger? Here's a McDonald's promotion.

The researchers think there needs to be some kind of privacy policy in apps that use BCIs to ensure people know how their EEG data could be used.

"We usually know when we're giving up our privacy, although that's certainly become less true with online behaviour," said Chizeck. "But this provides an opportunity for someone to gather information from you without you knowing about it at all. When you're entering something on a web form, you can at least think for a second, 'Do I want to type this?'"

Brain signals, on the other hand, are involuntary; they're part of our "wetware."

The reason the University of Washington team is looking into potential privacy and security issues now is to catch any problems before the tech becomes mainstream (if indeed it ever does). In a 2014 paper, they argue that such issues "may be viewed as an attack on human rights to privacy and dignity." They point out that, unlike medical data, there are few legal protections for data generated by BCIs.

One obvious way to help control how BCI data is used would rely on policy rather than technology. Chizeck and Bonaci argue that lawyers, ethicists, and engineers need to work together to decide what it's acceptable to do with this kind of data. Something like an app store certification could then inform consumers as to which apps abide by these standards.

"There has to be an incentive for all app developers, programmers, manufacturers to do it," said Bonaci. "Otherwise why would they change anything about what they're doing right now?"

The Washington team has also suggested a more technical solution, which would effectively "filter" signals so that apps could only access the specific data they require. In their paper, they call this a "BCI Anonymizer" and compare it to smartphone apps having limited access to personal information stored on your phone. "Unintended information leakage is prevented by never transmitting and never storing raw neural signals and any signal components that are not explicitly needed for the purpose of BCI communication and control," they write.
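
The paper describes the concept rather than a specific implementation, but a minimal sketch of the filtering idea might look like the following, assuming the application only needs power in one frequency band to drive its controls; the band and the feature choice are illustrative assumptions, not the paper's specification.

```python
import numpy as np
from scipy.signal import welch

# Minimal sketch of the anonymizer idea, under this example's assumptions: the
# application only needs power in one frequency band (say 8-12 Hz) to drive its
# controls, so only that single number ever leaves the filter. The raw EEG samples,
# and whatever else they might reveal, are never passed on or stored.

def control_feature(raw_eeg, fs, band=(8.0, 12.0)):
    """Return only the band power the application needs, discarding the raw signal."""
    nperseg = min(len(raw_eeg), int(fs))
    freqs, psd = welch(raw_eeg, fs=fs, nperseg=nperseg)   # power spectral density
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(np.sum(psd[mask]))                        # one scalar, not the signal

def anonymized_stream(raw_chunks, fs):
    """Yield only derived control values to the untrusted application."""
    for chunk in raw_chunks:
        yield control_feature(np.asarray(chunk, dtype=float), fs)
```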

Chizeck said a student in the lab was currently running more tests to further characterise the type and detail of information that can be gleaned through BCIs, and to try a filtering method to see whether more sensitive data can be blocked from leaking out.

By doing this work now, they hope to nip future privacy and security concerns in the bud before most people have ever come into contact with a BCI.

"It's technically becoming feasible; once you put electrodes on people's heads, it's feasible," said Chizeck. "The question is, do we want to regulate it, can we regulate it, and how?"