How Your Ears Help You to See

New research reveals that the brain's visual cortex processes sounds as well as sights.

It's easiest to think of the human body's various sensing mechanisms as more or less discrete tools. There is a thing called sound: you have ears to hear it and a special part of the brain to process sound information. There is a thing called smell: you have a nose to detect it, and a special part of the brain for "smells." And so forth. With the possible exception of taste and smell, which are widely recognized as tightly interconnected phenomena, relating the experiences of touch, sight, and sound becomes a bit mind-bending, the growing ranks of self-proclaimed synesthetes notwithstanding.

But nothing is that easy in the brain. The perils of completely cordoning off different portions of the brain for different tasks and experiences are by now well known, so this week's Current Biology study highlighting a connection, previously unobserved in humans, between neurological sound and image processing shouldn't be taken as earth-shattering. If anything, the future holds more, not fewer, discovered connections between the brain's different zones.

But it's still interesting to consider. The current study, courtesy of researchers at the University of Glasgow, took 10 subjects (a small number, mind you) and sent them through a series of five experiments. First, the group was blindfolded and asked to listen to three common nature sounds: birdsong, traffic noise, and a talking crowd. Using only fMRI images of the subjects' early visual cortices, the researchers' decoding algorithm was able to determine which sound was being listened to. A second experiment looked at this cortex as subjects imagined pictures in the absence of actual visual or auditory input, finding that the brain's visual processor handles even completely abstract inputs (i.e., those carrying no actual sensory information).
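The paper doesn't publish its decoding pipeline here, but the general idea behind this kind of fMRI classification can be sketched simply: each sound category evokes a characteristic pattern of voxel activity, and a classifier trained on labeled scans can recover the category from a new scan. The sketch below uses synthetic voxel data and a nearest-centroid classifier; the category names are from the study, but everything else (voxel counts, noise levels, the classifier itself) is an illustrative assumption, not the authors' method.

```python
# Illustrative sketch only: synthetic "voxel" data and a nearest-centroid
# classifier, standing in for the study's actual decoding algorithm.
import numpy as np

rng = np.random.default_rng(0)
N_VOXELS = 50
CATEGORIES = ["birdsong", "traffic", "crowd"]

# Hypothetical "true" mean activity pattern evoked by each sound.
templates = {c: rng.normal(0, 1, N_VOXELS) for c in CATEGORIES}

def simulate_scan(category, noise=0.5):
    """One noisy fMRI activity pattern evoked by a given sound."""
    return templates[category] + rng.normal(0, noise, N_VOXELS)

def fit_centroids(scans, labels):
    """Training step: average the scans belonging to each label."""
    return {c: np.mean([s for s, l in zip(scans, labels) if l == c], axis=0)
            for c in set(labels)}

def decode(scan, centroids):
    """Classify a scan as the category whose centroid correlates best."""
    return max(centroids, key=lambda c: np.corrcoef(scan, centroids[c])[0, 1])

# Train on 20 simulated scans per category, then decode fresh scans.
train_scans, train_labels = [], []
for c in CATEGORIES:
    for _ in range(20):
        train_scans.append(simulate_scan(c))
        train_labels.append(c)
centroids = fit_centroids(train_scans, train_labels)

print(decode(simulate_scan("traffic"), centroids))
```

The point of the toy model is only that category information is linearly recoverable from distributed activity patterns, which is what the fMRI result demonstrates for real visual-cortex data.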

These observations held up even as the team taxed the subjects' attention, finding that the observed encodings stayed robust in the face of "mild manipulations of attention and working memory." But as the manipulations grew more intense, requiring more and more cognitive focus, the senses split back into their more conventionally understood processing regions. This variability under pressure makes the finding more interesting, not less: it suggests that multisensory perception is a higher-level process layered on top of the visual cortex's base, raw functioning.

"In [the] future we will test how this auditory information supports visual processing, but the assumption is it provides predictions to help the visual system to focus on surprising events which would confer a survival advantage," said the study's lead author, Lars Muckli, in a statement. "This might provide insights into mental health conditions such as schizophrenia or autism and help us understand how sensory perceptions differ in these individuals."

The higher-level coding hinted at by the study is more interesting than it might initially seem. It's quite possible that we're not seeing a mere mixing of different sorts of sensory information across the brain, but rather the action of a sort of sense beyond senses, or a sense that exists before senses. Just as there is raw data behind a digital picture (not a shade of color or a degree of intensity, but a simple voltage measurement), there may be some raw form of sensory data that's portable across all of the brain's sensory (and memory) centers.

In an email, Muckli explains it as such: "We think of categorical templates of some sort. Let's say your eyes are closed and you hear the sound of breaking waves: this would trigger some kind of a shoreline expectation. Depending on the sounds it might be a beach or a cliff coastline. The higher-level abstract expectation might be a template of the beach without the fine-grained details. An expectation of a vehicle approaching is another example of a higher-level abstract expectation without the details of the exact brand and colour. It could mean that some sensory information of vertical contrast could be triggered by beaches or some other activation pattern we have not yet understood."

This data, even in the absence of an actual image, is able to capture both the content and category of what the image should be. "This suggests that information feedback is unlikely to be caused by an exact pictorial representation but instead contains abstract and possibly semantic information," the authors wrote. That's a pretty weird idea, and the suggestion is that it's a predictive mechanism. The brain in essence "primes" its processing centers with data suggesting likely sensory inputs gleaned from other senses and/or, perhaps, memory.
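One common way to formalize this "priming" idea (my own illustration, not a model from the paper) is Bayesian: treat the sound-derived expectation as a prior over visual scene categories and combine it with an ambiguous visual likelihood via Bayes' rule. All the numbers and category names below are invented for illustration.

```python
# Toy Bayesian illustration of sensory "priming": a sound-derived prior
# reshapes the interpretation of an ambiguous visual input.
import numpy as np

def posterior(prior, likelihood):
    """Bayes' rule over discrete categories: normalize prior * likelihood."""
    p = np.asarray(prior, dtype=float) * np.asarray(likelihood, dtype=float)
    return p / p.sum()

CATEGORIES = ["beach", "cliff", "street"]

flat_prior = [1 / 3, 1 / 3, 1 / 3]  # silence: no expectation either way
wave_prior = [0.60, 0.35, 0.05]     # breaking waves: shoreline expected

# A visually ambiguous scene that fits "beach" and "street" about equally.
likelihood = [0.4, 0.2, 0.4]

print(posterior(flat_prior, likelihood))  # beach and street tie
print(posterior(wave_prior, likelihood))  # the sound tips it toward beach
```

On this reading, the "priming" the study hints at is the visual system being handed a sharper prior before the image ever arrives, so that expected scenes are resolved faster and surprising ones stand out.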