Berlin's Preeminent VR Artists Explain Why Augmenting Reality Matters

We chatted about how 3D camera technology is going to be a part of your life sooner than you think.

When motion capture technology came out, everyone started making their own versions of the big video boards from Minority Report. Martin Förtsch and Thomas Endres, on the other hand, decided to try to make the Terminator. Living in Berlin, they don't sound exactly like Arnie. Not only is their accent different, they're really cheerful and friendly guys who are enthusiastically looking forward to the dawn of the 3D camera era. Part of that is going to be controlling robots in more intuitive ways, but another part of their dream is to see the world like the T-1000.

Skyping with the Deutschland duo, we chatted about how 3D camera technology is going to be a part of your life sooner than you think, how the technology can be applied at nuclear power plants, and even used to treat a rare cognitive condition. We also talked about using an Oculus Rift to give you Terminator-vision.

So who are you guys? What do you do?
Martin: Our main profession is working as software consultants at TNG Technology Consulting; we both studied computer science. And in our free time we write the blog http://parrotsonjava.com/.

The fun part comes with 3D cameras, gesture control, Intel® RealSense™, and, as you may have already seen, the Oculus Rift project "See the World Through the Eyes of a Terminator." We build devices and do stuff with 3D cameras.

Thomas: We started by controlling objects in the real world with our bare hands. For example, we have a robot you can control that will be exhibited at a German museum. Then we have this quadrocopter drone that you can fly around using your bare hands only, via these 3D cameras. We also have some new ideas, like games for the Oculus Rift.

How did you get started doing that?
Martin: Originally, we wanted to control the drones with LeapMotion, and after that we switched to RealSense™ technology. It's a 3D camera too, but it's different from the LeapMotion unit. At any rate, the idea was to have a quadrocopter or drone controlled by bare hands, and of course the robot we can control with our bare hands as well.

So why do you want to control with just your hands? Why is that appealing?
Thomas: I was controlling the drone with a joystick at first. In 2011, I was controlling a Parrot AR.Drone with a keyboard, but it never really felt right. Then I tried a Wii balance board, and that was where it started to make sense and became fun. When we got our hands on these gesture control cameras we thought, "This is how you do it. This is how you want to control these real-world objects." That's where we started.

Martin: Most of the people who started working with motion capture made stuff like Minority Report: standing in front of a glass wall, moving information around and controlling the GUI, the graphical user interface, with their bare hands. Our idea was that we didn't want to do the same thing; we wanted to control real-world objects.

We started by controlling objects in the real world with our bare hands

So what is it that you make?
Thomas: It's mainly the software. We're building an interface between the 3D camera and the object itself. It starts with grabbing the information from the RealSense™ camera and ends with commands to, for example, the quadrocopter. There's a little more hardware work that we've done for the German museum project. We also built a driving robot that can move around.

The robot itself is just a little microcontroller that you can program for motion. To connect to and steer the robot, we then use the RealSense™ 3D camera.
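
To make that pipeline concrete, here is a minimal sketch in Java (the language they mention prototyping in) of the glue layer Thomas describes. The HandFrame type and the mapping constants are hypothetical stand-ins, not the actual RealSense™ SDK API:

```java
// A minimal sketch of gesture-to-command mapping. HandFrame is a
// hypothetical stand-in for whatever hand-tracking data the 3D-camera
// SDK delivers; only the mapping logic is the point here.
public class GestureSteering {

    // Normalized hand position: x (left/right) and y (forward/back), in [-1, 1].
    record HandFrame(double x, double y) {}

    // Differential-drive command for a small robot: wheel speeds in [-1, 1].
    record DriveCommand(double left, double right) {}

    static final double DEAD_ZONE = 0.15; // ignore jitter near the neutral pose

    static double deadZone(double v) {
        return Math.abs(v) < DEAD_ZONE ? 0.0 : v;
    }

    static double clamp(double v) {
        return Math.max(-1.0, Math.min(1.0, v));
    }

    // Hand forward/back sets the speed; left/right steers by slowing
    // one wheel and speeding up the other.
    static DriveCommand toDriveCommand(HandFrame hand) {
        double speed = deadZone(hand.y());
        double turn = deadZone(hand.x());
        return new DriveCommand(clamp(speed + turn), clamp(speed - turn));
    }

    public static void main(String[] args) {
        // Hand pushed forward and slightly to the right.
        System.out.println(toDriveCommand(new HandFrame(0.3, 0.8)));
    }
}
```

The dead zone is the unglamorous detail that makes this usable: without it, sensor jitter around the neutral hand position would keep the robot twitching.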

Martin: The Deutsches Museum is the world's largest science and technology museum, so we're building this exhibition robot with the idea that, using an Intel® RealSense™ camera, someone will be able to control the robot just by using their hands.

Thomas: The cameras only started to be reliable in the last few years. There was gesture control and 3D control before, but the tech behind it was still missing, I think. For a few years now we've had these time-of-flight cameras that can measure how long it takes for light to bounce off an object and return to the camera. That wasn't possible in, say, 2008. It started with the RealSense™ in 2013 and then with the Kinect version 2. Before that, you had to do really difficult stuff just to manage something as simple as tracking hands.
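
The physics behind time-of-flight is compact enough to show: light travels at a known speed, so half the measured round-trip time gives the distance to the object. A minimal illustration in Java:

```java
// Time-of-flight in a nutshell: distance = (speed of light * round-trip time) / 2.
public class TimeOfFlight {

    static final double SPEED_OF_LIGHT = 299_792_458.0; // meters per second

    // Round-trip time in nanoseconds -> distance in meters.
    static double distanceMeters(double roundTripNanos) {
        return SPEED_OF_LIGHT * (roundTripNanos * 1e-9) / 2.0;
    }

    public static void main(String[] args) {
        // Light returning after ~6.67 nanoseconds puts the object ~1 meter away.
        System.out.printf("%.3f m%n", distanceMeters(6.67));
    }
}
```

The catch is visible in that example: the round trips last only a few nanoseconds, which hints at why sensors precise enough to be reliable took a while to arrive.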

Martin: Maybe you remember the Wiimote? The Wii remote has an infrared camera in its front that tracks the sensor bar, and with this you can determine the relative position of the handheld device. Another approach uses straight-up RGB cameras, but with those you can't really get a 3D capture; it's really imprecise. There were different cameras, the Kinect from Microsoft, for example. Now it's getting really interesting, because the cameras are getting smaller and lighter.

The Kinect is useful for capturing the whole body, but the LeapMotion is more of a specialized camera: it can be used to capture specific gestures and different angles.

This is the very new Intel R200 camera. It's really, really small, so small you could attach it to a pair of glasses, and you can imagine what that means. They're getting so small. It's awesome.

What does a smaller camera allow you to do?
Martin: The Oculus device, which we call the Augmented Rift. For this we used the Intel® RealSense™ F200 camera.

Thomas: You can put it on glasses and build your own augmented reality.

What kind of augmented reality?
Martin: The original idea was that we wanted to see the world through the eyes of a Terminator. Maybe you know the Terminator's red vision, where he walks down the street, faces get identified, and names are written into his view?

Thomas: He had some extra elements that he used to do his tasks too….

Martin: Even showing the pulse rate. And the cool thing is, with the F200 camera we know this is possible. You can identify faces, you can identify emotions, you can measure the heart rate, and so on.

Thomas: And all the landmarks on the face—it can see where my nose is, where my mouth is. It can see me and do all the things that the Terminator did back in the 1980s.

Martin: In real time! So what we've done is we took the Oculus Rift and connected two cameras on it.

Martin: You can see we have two cheap camera modules that record the real world as a stereo capture, and then we add the extra information from the F200 camera. That's where we gather everything: the landmarks, pulse rate, emotions.
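
As a rough sketch of what gathering that information might look like in code: fold the per-face data into the text annotations the display will draw. FaceData and its fields are hypothetical stand-ins, not the actual F200 SDK output:

```java
import java.util.List;

// Sketch of composing HUD annotations from face-analysis results.
// FaceData is a hypothetical stand-in for the camera SDK's output.
public class TerminatorHud {

    record Point(int x, int y) {}
    record FaceData(String name, String emotion, int pulseBpm, Point nose) {}

    // Build the text lines to render next to the detected face.
    static List<String> annotations(FaceData face) {
        return List.of(
            "MATCH: " + face.name(),
            "EMOTION: " + face.emotion(),
            "PULSE: " + face.pulseBpm() + " BPM",
            "NOSE @ " + face.nose().x() + "," + face.nose().y()
        );
    }

    public static void main(String[] args) {
        FaceData sample = new FaceData("UNKNOWN", "NEUTRAL", 72, new Point(320, 210));
        annotations(sample).forEach(System.out::println);
    }
}
```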

Thomas: Then we composite this onto the image and display it along with the camera output. And the last thing: we render the whole thing in red, so we have real Terminator vision.
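
That red pass is the easiest step to reproduce at home. Here is a minimal sketch using only standard Java imaging classes, assuming each frame arrives as a BufferedImage: collapse every pixel to its luminance, then write that value back into the red channel alone.

```java
import java.awt.image.BufferedImage;

// Minimal sketch of the red-tint pass: grayscale each pixel, then keep
// only the red channel so the whole frame reads as Terminator vision.
public class RedVision {

    static BufferedImage tint(BufferedImage frame) {
        BufferedImage out = new BufferedImage(
                frame.getWidth(), frame.getHeight(), BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < frame.getHeight(); y++) {
            for (int x = 0; x < frame.getWidth(); x++) {
                int rgb = frame.getRGB(x, y);
                int r = (rgb >> 16) & 0xFF;
                int g = (rgb >> 8) & 0xFF;
                int b = rgb & 0xFF;
                // Standard luminance weights, written into red only.
                int luma = (int) (0.299 * r + 0.587 * g + 0.114 * b);
                out.setRGB(x, y, luma << 16);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        BufferedImage test = new BufferedImage(2, 2, BufferedImage.TYPE_INT_RGB);
        test.setRGB(0, 0, 0xFFFFFF); // a white pixel becomes pure red
        System.out.printf("%06X%n", tint(test).getRGB(0, 0) & 0xFFFFFF);
    }
}
```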

Martin: This is the fun part, but there is a serious part as well. If you think further ahead, you could use a device like this for people who are blind, for example. At this point the device is a little too big because of the Oculus, but remember how the cameras are shrinking. A transparent display could be integrated into a pair of glasses soon enough, and this kind of thing could become normal in just a few years.

Thomas: The first prototype for this was built in one day! At TNG we have a winter retreat where we can do some hacking, and we always use it for hardware hacking. Last year we made a panoramic camera, and this year it was the Augmented Rift. It was still using Java, but it was built over the winter retreat. You can do it in one day! That's the amazing thing about all this 3D camera, Oculus, and augmented reality stuff: it doesn't take much effort.

Martin: Earlier in the history of IT, of augmented reality, it would've taken years to build such prototypes, but nowadays, with a Raspberry Pi and an Oculus and everything, you can just connect those things, do some development, and you're hopefully fine!

How do you see this tech being used? Not just the Augmented Rift, but the hand-based control?
Thomas: There was one client from an energy company who told us he needed to inspect the cooling towers at nuclear plants. Normally that would be a big construction project, but using drones you could do it much more easily; you wouldn't need all of that money to build a bunch of ladders and everything. We didn't end up doing it, though.

Martin: Another energy company had trouble with people working with high voltage. It's dangerous, so they wear special gloves. The problem is, when they have to operate a laptop, they pull off a glove, type in what they need, and then reach back and touch the high-voltage equipment. This is a really big problem, so the idea was: what if you could use gesture control so there's no need to pull off the glove? Another possibility is to use gesture control for medical devices. There is already a medical device from the company GestSure; they built a prototype that has gone commercial. With this device you can scale and zoom in and out of medical images, from X-rays, or CT, or whatever. The idea is that the doctor can manipulate the data as he wants without touching anything, so he remains sterile, for example.
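
As a rough illustration of the touchless-zoom idea (a sketch under assumed values, not GestSure's actual implementation), the core is just mapping the distance between two tracked fingertips to a zoom factor:

```java
// Sketch of touchless zoom: map the thumb-to-index fingertip distance
// reported by a hand-tracking camera to an image zoom factor.
// The coordinates and the 2 cm / 12 cm range are assumed sample values.
public class TouchlessZoom {

    record Point3(double x, double y, double z) {} // meters, camera space

    static double distance(Point3 a, Point3 b) {
        double dx = a.x() - b.x(), dy = a.y() - b.y(), dz = a.z() - b.z();
        return Math.sqrt(dx * dx + dy * dy + dz * dz);
    }

    // A 2 cm pinch maps to 1x zoom, a 12 cm spread to 4x, linearly between.
    static double zoomFactor(Point3 thumbTip, Point3 indexTip) {
        double t = (distance(thumbTip, indexTip) - 0.02) / 0.10;
        t = Math.max(0.0, Math.min(1.0, t));
        return 1.0 + 3.0 * t;
    }

    public static void main(String[] args) {
        Point3 thumb = new Point3(0.00, 0.00, 0.40);
        Point3 index = new Point3(0.07, 0.00, 0.40);
        System.out.printf("zoom: %.2fx%n", zoomFactor(thumb, index)); // 2.50x
    }
}
```

Because the camera does all the sensing, the gloved hand never has to touch the machine, which is the whole point in both the high-voltage and the sterile-surgery cases.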

We want to build showcases where people say, "This is astonishing. This is crazy," and we want to show what you can do. Like the game we're working on!

The important thing is that 3D cameras are getting more and more interesting because they're already being built into, for example, Hewlett-Packard laptops. The F200 and the R200 cameras are integrated into tablets and all-in-one PCs. So what happens? People will get their hands on this technology and ask, "Hey, what can I do with this?" That's what developers are going to be tasked with answering.

The Intel® Software Innovator program supports innovative independent developers who display an ability to create and demonstrate forward-looking projects. Innovators take advantage of speaking and demo opportunities at industry events and developer gatherings.

The Intel® Developer Zone offers tools and how-to information to enable cross-platform app development through platform and technology information, code samples, and peer expertise in order to help developers innovate and succeed. Join communities for the Internet of Things, Android*, Intel® RealSense™ Technology, Modern Code, Game Dev and Windows* to download tools, access dev kits, share ideas with like-minded developers, and participate in hackathons, contests, roadshows, and local events.