A Whistled Turkish Dialect Is Challenging Theories of Language and the Brain

A group of tweeting Turkish villagers is challenging notions of the brain and language.
Onur Güntürkün conducts his research with villagers from Kuskoy. Image: Onur Güntürkün

In a remote hilltop village above Turkey's Black Sea coast, villagers still whistle to communicate over long distances. Alongside speech, they use a whistled form of Turkish that, according to researchers, challenges previous theories of language processing by engaging both sides of the brain, not just the left.

Whistled Turkish is a dialect of Turkish that keeps the words and syntax of Turkish but transforms the language into a series of whistles of different pitches and melodies. It is endemic to a remote hilltop village called Kuskoy ("Bird Village") in northeastern Turkey, and is one of several whistled languages in the world.

"Before the invention of telephone and mobile phones, whistled Turkish was an extremely handy way to communicate over vast distances in this extremely mountainous region where travelling is really very difficult," Onur Güntürkün, study lead author and professor of behavioral neuroscience at Ruhr-University Bochum in Germany, told me.

In a study published today in the journal Current Biology, the German research team argues that whistled Turkish tests the notion that language processing and encoding is a largely left-brained activity, one independent of the physical form a language takes (speech, writing, or sign language, for example).

"I strongly believe that the processes related to brain asymmetries start at the very early level of signal encoding," said Güntürkün. He chose whistling Turkish to test this hypothesis as its physical characteristics (melody, pitch, and frequency) are predominantly processed by the right-hemisphere of the brain.

According to Güntürkün, the right hemisphere dominates prosodic analysis of language, in which different meanings emerge through the tone and speed of a word or sentence.

"The right hemisphere extracts slow modulations of the acoustic signal," explained Güntürkün. Whistled Turkish, he said, just happens to be "acoustic information in slow-mo."

"Here's a language where we have all the grammatical structures and words of Turkish translated into a physical structure that consists of an acoustic signal that slowly varies over time," explained Güntürkün.

A man speaking whistled Turkish. Image: Onur Güntürkün

In their study, the researchers investigated how the brains of 31 participants (whistle-speakers) processed spoken and whistled Turkish, using a dichotic listening method, in which participants "simultaneously hear via headphones either same (homonymic) or different syllables (dichotic condition) on left and right ears." The study reports that "dichotic listening tests reveal that usually right ear input is perceived, which is related to left hemisphere speech sound processing," while whistles are processed by both sides of the brain.

The researchers asked the participants to listen to spoken and whistled syllables delivered to the left or right ear through headphones. Participants perceived spoken Turkish mostly through their right ear, consistent with the left hemisphere's dominance in speech processing, while whistled syllables were perceived equally well by both ears.
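
For readers curious about the arithmetic behind an "ear advantage," here is a minimal sketch of how dichotic listening results are conventionally scored; the function and all counts are illustrative assumptions, not data from the study:

```python
# Minimal sketch: the conventional laterality index used to score
# dichotic listening experiments. Values near +1 indicate a right-ear
# (left-hemisphere) advantage; values near 0 indicate that both ears,
# and hence both hemispheres, contribute roughly equally.

def laterality_index(right_ear_correct: int, left_ear_correct: int) -> float:
    """Classic ear-advantage score: (R - L) / (R + L)."""
    total = right_ear_correct + left_ear_correct
    if total == 0:
        raise ValueError("no correctly reported syllables")
    return (right_ear_correct - left_ear_correct) / total

# Hypothetical counts of correctly reported syllables per condition:
spoken_li = laterality_index(right_ear_correct=38, left_ear_correct=22)
whistled_li = laterality_index(right_ear_correct=30, left_ear_correct=29)

print(f"spoken Turkish:   {spoken_li:+.2f}")   # +0.27, right-ear advantage
print(f"whistled Turkish: {whistled_li:+.2f}") # +0.02, no ear advantage
```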

Güntürkün said his findings show that language asymmetries in the brain are indeed shaped by the physical structure of a language. Contrary to previous research describing left-hemisphere-dominant language processing, people who use whistled Turkish rely on both sides of the brain to process it.

"The right hemisphere is suddenly pitching in and contributing to this task because it has the means to encode this slow change of the acoustic signal over time. But in the end, after it has processed the syllable, it's language, so the left hemisphere is also needed. The two hemispheres are equally involved in this task," said Güntürkün.

Next up, Güntürkün said he wanted to use EEG (electroencephalography) to gain deeper insight into the brain's electrical activity when it processes whistled and non-whistled language. But he pointed to an even "more tantalizing" line of investigation.

"Think of patients with left hemispheric strokes […] They would not be able to understand or produce spoken Turkish, but if these patients were able to speak whistled Turkish, would it be an alternative route to the conscious understanding of the subject?" he asked.

Whistled Turkish, said Güntürkün, might hold the key to communicating with such patients, since the language would still engage the right side of the brain. Asked how whistled Turkish could be processed without an active left hemisphere, Güntürkün acknowledged that the experiment would be a critical test of his initial assumption: "that the equal left and the right hemispheric contribution is due to the left hemisphere understanding, and the right hemisphere processing the auditory input."

"This is an idea of mine, but obviously I do not know if this is the right explanation. These kinds of patients could be a means to test such assumptions," he said.