The VR Controller of the Future Could Be Your Own Hands

Microsoft's Handpose lets you use your hands to interact with virtual worlds.
David Sweeney demonstrating Handpose. Image: Victoria Turk/Motherboard

One of the most disconcerting things about virtual reality, motion sickness aside, is the inability to see your own hands. You're dropped into a world so immersive that you feel like you're really there, but hold a hand up in front of your face, or reach out to touch something, and there's nothing: the illusion of presence shatters instantly.

A group of Microsoft researchers hope their hand-tracking tech, named Handpose, could help people interact more naturally not just with virtual environments in VR and AR, but also with computers and other connected devices.

"It starts off with the realisation that when we're interacting with the physical world, our primary mode of interaction would be our hands," Design Technologist David Sweeney told me at Microsoft Research's offices in Cambridge, UK. "It's what we're used to; we don't even think about it, it's intuitive."

This demo shows the raw data from the depth sensor being translated into hand gestures. Image: Victoria Turk/Motherboard

The Handpose project has been in the works since 2014, and advances in its computer vision were unveiled earlier this year. It's still a research project, but Microsoft invited me to see it in action.

"It eats depth information and it spits out hands," Sweeney summarised as I moved my hands in meatspace to see them replicated on a screen in front of me, able to grasp a bunch of virtual strings and toggles created in Unity. It's a similar idea to the Leap Motion.

Read More: I Stuck My Hands into a Virtual Reality Interface and Felt the Future

A Kinect sensor captures the depth information, which the Handpose software turns into a cloud of data points and then fits to a mesh model of a hand (there are several models to suit different hand sizes). Essentially, each depth point is mapped onto the surface of the model hand, so the model twists into different poses as your real hands do. Machine learning then interprets those movements as gestures.
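
Microsoft hasn't released Handpose's code, but the fitting step described above resembles a classic iterative closest point (ICP) loop: alternate between matching depth points to model vertices and solving for the transform that pulls the model onto the data. The sketch below is a deliberately simplified, hypothetical illustration in Python/NumPy. It performs only a rigid alignment, whereas the real system fits an articulated, deformable hand mesh in real time, and every name in it is invented for this example.

```python
import numpy as np

def fit_hand_model(depth_points, model_vertices, iterations=20):
    """Iteratively align a template point set to a cloud of depth samples.

    depth_points:   (M, 3) array of 3D points from the depth sensor.
    model_vertices: (N, 3) array of template hand-model vertices.
    Returns an aligned copy of the model vertices.
    """
    aligned = model_vertices.copy()
    for _ in range(iterations):
        # Correspondence step: pair each depth point with its nearest
        # model vertex (brute force here; a real system would use a
        # spatial index such as a k-d tree).
        dists = np.linalg.norm(
            depth_points[:, None, :] - aligned[None, :, :], axis=2)
        matched = aligned[np.argmin(dists, axis=1)]

        # Alignment step: solve for the rigid rotation and translation
        # that best map the matched vertices onto the depth points
        # (the Kabsch algorithm, via SVD of the covariance matrix).
        p_mean, m_mean = depth_points.mean(axis=0), matched.mean(axis=0)
        H = (matched - m_mean).T @ (depth_points - p_mean)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:  # guard against a reflection solution
            Vt[-1] *= -1
            R = Vt.T @ U.T
        aligned = (aligned - m_mean) @ R.T + p_mean
    return aligned
```

Each pass tightens the fit a little, which is why the loop converges on the pose your real hand is making; the published Handpose work adds per-finger articulation and machine-learned initialisation on top of this basic idea.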

When I tried it, it got a bit glitchy when more than one person was in range or when I moved my hands too close together, but it reliably tracked where each part of my hands was, in impressive detail. That's the main challenge of hand tracking: our hands are made up of many moving parts, and even the smallest movement can carry meaning as a gesture.

Applied to VR, the effect is naturally more immersive; using Handpose with an Oculus Rift headset, I could see my hands as I prodded and poked a bunny that moulded to the touch like Blu-Tack and interacted with a series of virtual controllers that mimicked analogue dials, buttons, and sliders.

The advantage of using your own hands, as opposed to a handheld controller such as Oculus Touch or the HTC Vive's wands, is that the movement is intuitive: come across a virtual lever and, even if you've never seen it before, you know what to do with it.

"These are millions of years of evolution; they're fine-tuned with a huge amount of subtlety and precision," said Sweeney. "It's a shame to throw all that away and have to carry around a device with batteries in it."

Interestingly, Sweeney suggested that haptic feedback (being able to feel what you're interacting with) wasn't really necessary when you're interacting with in-game objects that have physics, like the bunny demo, and offer visual feedback. "I can kind of feel the material interactions between my hands, psychologically," he said. That matched my own experience.

The Handpose team prides itself not just on the software's speed and accuracy but also on its efficiency: it's designed to use minimal computational resources and to run only on the CPU, keeping the graphics card free to render the rich virtual worlds you interact with.

Sweeney wouldn't say when Handpose might be available to the public. But he hinted at applications beyond VR and AR, such as using your hands as a "low energy remote control" to interact with Internet of Things devices. Imagine turning on a light switch just by pointing at it, perhaps in combination with voice recognition.

A Minority Report-style gestural interface could be on the horizon—but thankfully it might not require those gloves.
