

Telepathic Melodies Are the Soundless Music of the Future

No more wires; no more earphones; no more play buttons. Just thoughts, and scans of those thoughts, meandering about your brain.

Zoltan Istvan is a futurist, philosopher, journalist, and author of the bestselling novel The Transhumanist Wager. He writes an occasional column for Motherboard ruminating on the future beyond natural human ability. 

Sound is commonly defined as vibrations that travel through the air or another medium and can be heard when they reach the ear. But as with many things in the coming transhumanist future, prepare to reassess your definitions.


For a philosopher of mind or a neuroscientist, the more accurate definition of "sound" is a particular set of neurons in the brain firing in an arrangement that creates the sensation of sound in the mind. The ear is just a vessel that converts vibrations from the outside world into data for the brain.

The key word there is data, because it's something that, in the coming future, microchips and brainwave-reading headsets will be able to process and make perfect sense of without the ear.

That's why the music concert of the future could very well be silent. Dead silent. Just as live music was replaced by vinyl record players, vinyl record players were replaced by Walkmans, and Walkmans were replaced by iPods, iPods and other music-playing devices could be replaced by low-cost EEG brainwave-reading headsets. No more wires; no more earphones; no more play buttons. Just thoughts, and scans of those thoughts, meandering about your brain.

Of course, with all this radical technology would likely come a new way to interpret music: You wouldn't be "hearing" it in the traditional way anymore, but rather sort of feeling it, or perhaps better explained, simply knowing it.


Such futurist speculation rests on developing brainwave technology and the devices and microchips that will use it. Some of that tech is already here. A number of companies, most of them startups, already sell brainwave-reading devices, such as the Muse headband or the Freer Logic BodyWave armband.


Some companies offer headsets that can turn on the lights in your house just by thinking about it, or help you play a video game on your iPhone using only thoughts. NeuroSky's MindWave can attach to Google Glass and allow you to take a picture and post it to Facebook or Twitter with your mind. Scientists have even flown helicopters using only thoughts and a brainwave headset.

The way the mind-reading tech currently works is via electroencephalography (EEG) sensors that pick up and monitor brain activity. However, it's mostly only a one-way technology. What will make this a paradigm shift in human communication, in music and language alike, is if headsets can not just receive but also fire back sophisticated impulses and signals to chip implants in the skull connecting directly to the brain.
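To give a rough, purely illustrative sense of what that one-way reading involves, here is a short Python sketch. It fakes a few seconds of EEG, estimates power in the alpha and beta frequency bands with an FFT, and compares them as a crude "thought switch" of the turn-on-the-lights variety. The sampling rate, the synthetic signal, and the threshold logic are all assumptions for illustration; no particular headset or vendor API is being described.

    # A toy sketch of what today's one-way EEG headsets do: turn raw voltage
    # samples into band-power features (alpha, beta, etc.) that an app can act on.
    # The signal here is synthetic; a real headset would stream it wirelessly.
    import numpy as np

    FS = 256                      # assumed sampling rate in Hz
    t = np.arange(0, 4, 1 / FS)   # four seconds of samples

    # Fake EEG: a 10 Hz "alpha" rhythm buried in noise.
    eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * np.random.randn(t.size)

    def band_power(signal, fs, low, high):
        """Estimate power in a frequency band via the FFT."""
        spectrum = np.abs(np.fft.rfft(signal)) ** 2
        freqs = np.fft.rfftfreq(signal.size, 1 / fs)
        mask = (freqs >= low) & (freqs <= high)
        return spectrum[mask].sum()

    alpha = band_power(eeg, FS, 8, 12)    # relaxed, eyes-closed rhythm
    beta = band_power(eeg, FS, 13, 30)    # active concentration

    # A crude "thought switch": alpha dominance toggles the lights.
    print("lights on" if alpha > beta else "lights off")

Consumer headsets do something conceptually similar: they stream voltage samples from scalp electrodes and boil them down to a handful of features that an app can react to, which is exactly the "receive-only" half of the equation.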

Scientists and technologists are already working on many forms of brain implants. For example, Matti Mintz, a scientist from Tel Aviv University, and his colleague Paul Verschure, from Universitat Pompeu Fabra in Barcelona, are working on an implantable brain chip that helps with learning new motor functions.

The New York Times reported on how brain implants improved brain function in primates. Scientists from the Wake Forest Baptist Medical Center and the University of Southern California successfully used a device that improved brain functionality by improving communication between neurons. The Pentagon has also actively sought and funded research to use implants to improve memory.


All these technologies rely on one thing: interfacing with the incredible complexity of the human brain and its roughly 100 billion neurons. And every year scientists make breakthroughs that bring humans and machines closer together, until the long-sought point when every human thought will be clearly understood and revealed via a brainwave device or chip implant.

Then the human being will truly be a cyborg. Then the definition of "sound" will indubitably be rewritten.

Brainwave-reading technology is in its infancy, but it's the start of what could easily become a trillion-dollar industry. It could be used by nearly everyone, with hundreds of different applications, similar to cell phone use today.

Speaking at the 2014 World Future Society conference last month in Florida, Singularity University Professor Jose Cordeiro said, "Spoken language could start disappearing in 20 years. We'll all talk with each other using thoughts scanned and projected from our headsets and maybe even implants. This will radically increase the speed and bandwidth of human communications."

Twenty years is not that far off. And if that's the case, Google Translate will also eliminate the need for a second or third language. It has crossed my mind that I should drop my four-year-old daughter's Spanish classes.

However, it's the field of music where I'm most fascinated with the impending brainwave tech. I'm a longtime rhythm guitarist and freestyle pianist, and I try to play daily. And I know it's possible an instrument won't be needed at all anymore: its tones and melodies and tweaks could simply be imagined and constructed in the mind, creating orchestras and symphonies like we've never heard.

It's not that strange, really. In a way, Mozart and many other composers have done this for centuries, penning whatever they could create and hear in their heads on sheet music. After all, most of us often have songs playing in our heads. And a B, G, or C chord is simply a bunch of neurons firing a certain way. That same chord played on a piano creates a different pattern of firing neurons than it would on a guitar. Training electronic devices to understand this will spawn an entirely new way of knowing sound. The music industry had better pay attention. It's not going to be in Kansas anymore in some 20 years.
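To make that concrete, here is a deliberately simplified sketch, with entirely made-up data, of how a device might be "trained" to recognize which chord someone is imagining: learn one average activity pattern per chord, then label a new reading by whichever learned pattern it most resembles. Everything here, from the feature vectors to the nearest-centroid approach, is an illustrative assumption, not a description of any existing system.

    # A toy, hypothetical "chord decoder": given labeled brain-activity feature
    # vectors recorded while someone imagines B, G, or C chords, learn one average
    # pattern per chord and classify new activity by nearest centroid. The data
    # below is random stand-in; real decoding would need far richer features.
    import numpy as np

    rng = np.random.default_rng(0)
    chords = ["B", "G", "C"]

    # Pretend training data: 30 feature vectors (say, band powers per electrode)
    # for each imagined chord, each drawn around a chord-specific pattern.
    true_patterns = {c: rng.normal(size=16) for c in chords}
    train = {c: true_patterns[c] + 0.3 * rng.normal(size=(30, 16)) for c in chords}

    # "Training": store the mean activity pattern per chord.
    centroids = {c: x.mean(axis=0) for c, x in train.items()}

    def decode(activity):
        """Return the chord whose learned pattern is closest to this activity."""
        return min(centroids, key=lambda c: np.linalg.norm(activity - centroids[c]))

    # A new reading taken while the listener imagines a G chord.
    new_reading = true_patterns["G"] + 0.3 * rng.normal(size=16)
    print(decode(new_reading))   # most likely prints "G"

The point of the toy is only this: if a chord really is a repeatable pattern of neural activity, then recognizing it is, at bottom, a pattern-matching problem, the kind of thing machines already do well.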

Go a bit deeper into the future and "live concerts" could be virtual shows, where listeners use full-body haptic suits, virtual reality goggles, and implants or brainwave headsets to participate and tune in to their favorite musicians. The twist is that the musicians themselves will be plugged in too, and you'll be listening to the music they play in their minds.