Raw brainwave data used for neurogaming could be tapped to get a detailed read-out of your private thoughts.
Gadgets that read your brainwaves are reaching the consumer market. What then? Image: Emotiv
In the near future, companies, hell even the NSA, could be mining our brainwaves for data. It’s bad enough that private details about our lives are revealed in hoovered-up emails and phone calls; imagine if Big Brother were literally reading our minds. That’s some dystopian shit.
We're heading in that direction. Brainwave-tracking is becoming increasingly common in the consumer market, with the gaming industry at the forefront of the trend. “Neurogames” use brain-computer interfaces and electroencephalographic (EEG) gadgets like the Emotiv headset to read brain signals and map them to in-game actions, basically giving the player virtual psychic superpowers.
Now there’s a fear that we’re not doing enough to protect our raw thoughts from getting hacked with "brain spyware" or being tracked and gathered like the rest of our personal data. The concern was raised last month at the 2014 Neurogaming Conference in San Francisco, NPR reported.
“We may wake up in a few years and say, ‘Oh, we should have done something. We should have thought about the privacy of this data,’” Arek Stopczynski, a neuroinformatics researcher at MIT, told me in an interview.
EEG data is extremely rich, or “high-dimensional,” meaning a single signal can reveal a lot of information about you: whether you have a mental illness, whether you’re prone to addiction, your emotions, mood, and tastes.
Raw brainwave data uploaded to a server for gaming purposes could also be tapped to get a detailed read-out of your psyche. It’s possible to glean private information like PINs, credit card numbers, addresses, and birthdays "leaked" from brain signals, as researchers demonstrated in a 2013 paper on the privacy and security implications of brain-controlled consumer products.
And unlike your Facebook profile, EEG data is a unique biometric identifier, like a fingerprint. Researchers have demonstrated they can identify people based on their EEG data with 80 to 100 percent accuracy.
The greatest potential danger when it comes to brainwave data privacy, Stopczynski argued, is the possibility of linking EEG databases to other databases with information about finances or location. “If we don’t do something about it or start talking about it, we will end up with this big dataset of personal EEG data that no one will have proper control over,” he said.
If, let’s just say, the NSA began collecting brain data, it could theoretically match it with other datasets culled from online data mining to create a complete profile of an individual, one that goes far beyond what they divulge through posts and messages alone.
How can we stop this kind of invasive mining of our minds? The simple answer is that brainwaves can be protected just like any other personal data.
In a recent paper, Stopczynski and several colleagues outlined a security protocol for EEG data, called openPDS. The system marries two technologies: a smartphone app that reads EEG data and a generic data storage system that only releases the answers to specific queries “asked” by programs and services—not the raw data itself.
So before firing up a neurogame, the user would first install a company-provided module that can compute only the specific features of the EEG signal needed to generate in-game actions, without ever exposing the raw data itself.
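The query-answer idea can be sketched in a few lines of Python. Everything below is hypothetical and illustrative (the class and query names are my own inventions, not the actual openPDS API), but it shows the principle: the store holds the raw samples, and an outside service like a neurogame only ever receives the answers to a whitelist of approved questions.

```python
from statistics import mean

class PersonalDataStore:
    """Toy personal data store: raw EEG samples never leave this object."""

    def __init__(self, raw_eeg_samples):
        self._raw = list(raw_eeg_samples)  # kept private, never returned

    # Pre-approved queries: each returns a derived answer, not raw data.
    def _mean_amplitude(self):
        return mean(self._raw)

    def _blink_detected(self, threshold=100.0):
        # Crude stand-in for a real blink-artifact detector.
        return any(abs(v) > threshold for v in self._raw)

    ALLOWED_QUERIES = {
        "mean_amplitude": _mean_amplitude,
        "blink_detected": _blink_detected,
    }

    def answer(self, query_name):
        """Answer a whitelisted query; refuse everything else."""
        query = self.ALLOWED_QUERIES.get(query_name)
        if query is None:
            raise PermissionError(f"query {query_name!r} not permitted")
        return query(self)

# A neurogame asks questions; it never sees the samples themselves.
store = PersonalDataStore([12.0, -8.5, 150.0, 3.2])
print(store.answer("blink_detected"))   # the 150.0 sample exceeds the threshold
```

A request for anything outside the whitelist, say `store.answer("dump_raw")`, simply fails, which is the whole point: the service gets its in-game trigger, and the raw signal stays home.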
The goal is to prevent our brain data from being disseminated through cyberspace without our knowledge or say-so, the way personal information from the web is now. Data security experts want to make sure consumers retain total control over where their brainwaves go, and whose hands they end up in.
By ensuring that the raw EEG data is never released to another party, Stopczynski’s system would offer users control over their own neurodata. “Eventually, you should own the only copy of your raw data,” he said. “You should not have your data, especially your biometric data, duplicated in multiple places.”
Folks may have a cavalier attitude toward online privacy, even a willingness to exchange personal data for an all-access pass to the digital world. But I’m inclined to think that our brains are different.
Before neurogaming gives way to other brain-controlled services and products, we might want to make sure that won’t mean giving corporate giants and government snoops unfettered access to our private thoughts. Facebook and the NSA don’t have carte blanche access to our minds just yet, and we should probably keep it that way.