The development of Artificial Intelligence over the years has been a bit like the development of a baby. A newborn baby pops out and it looks more alien than human. It stares blankly and malfunctions a lot. Its face is mostly expressionless, and it doesn’t really interact with humans yet in a reciprocal way. Over time, the baby’s misshapen head starts to look a little less weird, the parts of its face shift into place. It gets cute. We see a spark in its eye. We smile, the baby smiles back. We frown, chide it not to pull the cat’s tail: The baby frowns back, distressed.
Are we still talking about babies and robots? Yes. And also baby robots.
Researchers at the University of California San Diego's Machine Perception Lab have worked on several exciting projects in recent years that use facial recognition technology and computational brain modeling to, among other things, build robots that interact with humans in exceptionally convincing ways. Their latest is a robot baby named "Diego-San," seen interacting with its creators in the new video above—the perfect symbol of a science still in its infancy but coming into its own.
Hanson Robotics, creators of a robotic Einstein face that made the cover of Wired in 2006, built Diego-San's hyper-realistic face and mounted it atop a robot body built by Kokoro. High-def cameras fixed inside the baby robot’s “eyes” allowed it to “see” its human parents; the lab’s AI technology enabled it to learn from and mimic human facial expressions similar to the way a real baby would.
AI, you’ve come a long way, baby.
The practical applications have as much to do with understanding human development as they do with developing AI. “Its main goal is to try and understand the development of sensory motor intelligence from a computational point of view,” said the lab’s head, Dr. Javier Movellan, in an article by Gizmag.
“It brings together researchers in developmental psychology, machine learning, neuroscience, computer vision and robotics. Basically we are trying to understand the computational problems that a baby’s brain faces when learning to move its own body and use it to interact with the physical and social worlds.”
According to Gizmag’s reporting, the robot baby stands about 4 feet, 3 inches tall and weighs 66 pounds. The head alone contains 27 moving parts. (A human face, by comparison, has two moving eyes, a moving jaw and as many as 43 muscles—though, oddly, some human faces have fewer.)
For as realistic as the baby is, it still dwells decidedly in the uncanny valley—a concept proposed by robotics professor Masahiro Mori in 1970 to describe the phenomenon whereby a robot feels more human the more human traits it assumes, up to a point. Past that point, things that are almost human but not quite tend to provoke revulsion. A Real Doll might be one example (though they’re clearly not uncanny enough to keep lots of weird dudes from wanting to have sex with them); a zombie is another; Michael Jackson’s nose, yet another.
David Bowie is another. At least he is in his new video, in which his face is superimposed on a weird little doll body, strangely resembling the uncanny baby above. The resemblance was just too—well, uncanny—not to include the video in this post. Leave it to Ziggy to, like the baby, place himself at the threshold between making us feel a little exhilarated and a little uncomfortable.