Students at New York University have created a working prototype of an app that uses machine learning and augmented reality to enable hearing people to understand sign language, and turns spoken words into sign language for the deaf.
Image: Courtesy Zhongheng Li
We’ve made great strides when it comes to technology’s ability to translate foreign languages for us. Still, there’s one language that hasn’t been as easy for technology to translate: sign language. A new prototype augmented reality app from students at New York University translates sign language for hearing people and turns spoken words into sign language for the deaf.
The students created the project as part of Verizon and NYC Media Lab’s third Connected Futures challenge. Zhongheng Li, one of the project leads, told me that he was inspired by a friend who has two deaf parents.
“Her family moved from Hong Kong to the US, and she explained to me that there’s no universal sign language, so they were having trouble communicating,” Li said at a Demo Day in New York Friday.
In fact, while American Sign Language (ASL) may be the one you’re most familiar with, hundreds of different sign languages are currently in use around the world, and translating between sign languages, or from a spoken language to sign, is still very difficult.
Li and his colleague used machine learning to create the app, which is currently limited to a narrow use case: making an appointment at a medical clinic. It can detect and translate only a small set of phrases. They hope to expand it to serve patients who don’t have other ways of communicating at small clinics, where sign language interpreters aren’t always available (state-run hospitals are required to have interpretation services for deaf patients).
There have been other prototypes for translating sign language released in recent years, including wearables that read sign language gestures, and 3D camera devices that track movements and convert the signs into text. But a full-blown, Google-translate-style app for all of the sign languages on the planet is still a long way away, Li told me.
“There are limitations,” Li said. “Sign language is complex and includes facial expressions. The word order is different from English; it’s a lot for the computer to learn.”