Controlling Robots In Virtual Reality Could Make Work Safer For Everybody

Could all those years playing MechWarrior actually translate into a real job?

Man and machine may eventually stroll happily into the singularity-tinted sunset hand in mechanized hand. But for now, the two don't make the best coworkers. Toiling alongside gigantic semi-intelligent machines puts workers at great risk of bodily harm. And even in the cases where humans aren't crushed quite so easily, they still face other detrimental effects like the "increased risk of mental illness" (among other things) that warehouse employees face due to the strain of being forced to behave like machines.

Most of the solutions that engineers and researchers have begun to advance focus on improving the behavior and sensitivity of the robots. Attaching sensors to human workers could give robots a better sense of contextual awareness, for instance. And developing robust networks for robots might help them communicate with one another to better avoid the pitfalls of navigating a workspace.

Ideas like these are promising. But they rely on advances in artificial intelligence and sensor technology, which might not keep ahead of the market forces pushing robotics to the forefront of economic life. So what if we could take human workers out of the equation? Not by stripping them of their jobs, but safely ensconcing them in another part of a factory that isn't within reaching distance of all the heavy machinery?

That's the idea behind some new research coming out of Johns Hopkins's Computational and Interactive Robotics Laboratory called the "Immersive Virtual Robotics Environment," or IVRE for short. By its description, the system "enables a user to instruct, collaborate and otherwise interact with a robotic system either in simulation or in real-time via a virtual proxy."   

In other words: control the robots from a safe distance using virtual reality programs. Or a less immersive computer program, I suppose. But come on; what kind of MechWarrior fan doesn't get excited by the thought of controlling a real-life robot with an Oculus Rift?

IVRE developer Kel Guerin showed off a demo of the IVRE prototype in a recent video posted on the project's website. It's still in its early stages, so the person wearing the Oculus Rift isn't stepping into a full-blown rendition of the environment in which the robot is located and issuing commands in real time. And he's not embodying the robot either, so to speak. Rather, the user steps into a virtual rendition of the room holding the robot and controls it the way anyone operating heavy machinery would—only he's not actually there. In the video, the user tests out a number of different motions to determine the best course of action for a robot claw picking up a series of boxes.

"Once we've decided that the trajectory is acceptable, it can be played back on the real robot without any errors," Guerin says at the end of the demo as the real claw starts to move.

If it proves effective, a system like this could yield benefits in a number of different areas. First, there's the obvious one I've already identified: keeping humans from being gored by their mechanized coworkers. But it could also bolster safety by letting humans stay farther away from hazardous environments altogether. Robots have already proved useful for dangerous tasks like cleaning up the radioactive detritus of the Fukushima-Daiichi nuclear-power plant. But as a Wall Street Journal article from earlier this month showed, the robots being deployed there aren't exactly perfect.

"They cost millions of dollars each and most can perform only one or a few tasks," the Journal reported. "Some of them need other robots to do preparation work for them. They also have accidents, and at least one had to be written off as lost after it got stuck inside the plant."

Getting a better handle on guiding and steering these robots could help them perform more effectively and safely.

But there's another possible benefit of VR that could have profound implications for the labor market: it alleviates some of the pressure to acquire new skills that displaced workers face. The IVRE system that the Johns Hopkins researchers showed off used an Oculus Rift prototype, a gadget initially developed for video games. And playing games doesn't require anything more than a suspension of disbelief and a degree of patience.

Much of the discourse around technological unemployment has centered on investing in education for specialized fields like engineering and programming. Could virtual reality offer a different solution for the short term by emulating the form and function of various lower-tech professional tasks? Obviously many jobs have already been taken over completely by robots. But systems like IVRE could help retrain certain types of workers at a lower cost than, say, trying to teach them all to code.