How to Teach a Baby Robot to Talk With Its Hands

New research suggests that hand gestures are an integral part of human communication—what about machines?
The iCub childlike learning robot. Image: Jiuguang Wang/Flickr

Raising a baby robot is weirdly similar to nurturing a human one. For instance, the toddler-sized iCub had to be taught how to crawl and grasp things by its creators in its fledgling years. But unlike us, our android friends aren’t handily conversing by the time they're toddlers. The ongoing challenge is how to help robots understand human communication.

The answer may be body language. A new study by researchers at the International School for Advanced Studies (ISAS) found that gesticulating helps listeners understand speech, making gestures an integral part of human communication beyond words alone.


The study is the first to explain why we use gestures in speech, and its findings have implications for the development of artificial intelligence and robot cognition.

“One aspect of language that hasn’t been studied so far is that some aspects of speech are really mapped to gestures,” Bahia Guellai, one of the study’s authors, told me in an interview. As a developmental psychologist, Guellai undertook the study to explore how gestures play a part in the early stages of language acquisition.

The researchers focused on “prosody”—the patterns of intonation in speech that convey nonverbal meaning—and sought to understand whether the gestures that punctuate speech serve the same prosodic function. In other words, Guellai and her colleagues wanted to find out whether talking with your hands is as much a part of communicating as the words themselves.

“What is interesting about prosody is that it drives some important aspects of speech, especially the non-intelligible aspects of it; for example, emotion, etcetera,” Guellai explained. “This is also supposed to be the part that infants, from the in-utero stage and from birth, are very, very sensitive to.”

So, what does all this have to do with robotics? Guellai and her colleagues at ISAS recently agreed to collaborate with a team at the Italian Institute of Technology (IIT) in Genoa, which is using the iCub to mimic human cognition and communication.

The iCub, first developed in 2004 as an open-source testbed for robotics research, has since been used for artificial intelligence development based on “embodied cognition”—the idea that our physical body’s movements, which develop alongside our language skills, play a large role in how we think and communicate.

By building a robot that can move like a toddler, researchers at IIT hope to speed along its ability to think and communicate like one, too.

“To imagine humanoids as close to humans as possible, you really need to understand what we first rely on to communicate, to interact, etc.,” Guellai said. “Because we are more and more sure that language is multimodal, this baby robot can use more of its body. But for this, we need to have more studies like the one that we have published to really understand the correlations between the body and auditory speech.”

Gesticulating robots may one day be flailing more than Slavoj Žižek on a tear about capitalism and Batman. Hopefully we’ll be able to understand them better, though.