In the future, computers are environments rather than objects.
How's this for a bad-ass future? "Interactive self-levitating programmable matter." This is how researchers at Queen's University's Human Media Lab are describing their new virtual reality scheme, dubbed BitDrones, set to be unveiled Monday at the ACM Symposium on User Interface Software and Technology in Charlotte, North Carolina.
The floating interface is enabled by swarms of nano quadcopters (the drones of BitDrones), of which there are three varieties. "PixelDrones" come equipped with a single LED and a small dot-matrix display; "ShapeDrones," which are intended to form the building blocks of 3D models, are built around a 3D-printed geometric frame covered in a fine mesh; and, finally, "DisplayDrones" are fitted with a curved flexible high-resolution touchscreen, a forward-facing video camera, and an Android smartphone board.
All three varieties are also fitted with reflective markers, allowing them to be tracked in real time using motion-capture technology.
It turns out that the concept of a physical 3D computer interface has been around for a long time. Even when computers still consisted of room-sized monoliths, engineers were busily dreaming up machines capable of reshaping the physical environment itself: computers as spaces.
"In 1965, [computer scientist Ivan] Sutherland envisioned the 'Ultimate Display' as a room in which the computer controlled the existence of matter," the Queen's engineers write in a paper describing their BitDrones system. "According to Toffoli and Margolus, such programmable matter would consist of small, parallel, cellular automata nodes capable of geometrically shaping themselves in 3D space to create any kind of material structure. Since then, there has been a significant amount of research conducted towards this goal under various monikers, such as Claytronics, Organic User Interfaces, and Radical Atoms."
As the Queen's researchers note, however, much of this work has been theoretical rather than immediately buildable. The chief problem, they say, has to do with how Catoms, the discrete units of self-actuating hardware behind such a system, cope with gravity. Methods envisioned to levitate and control Catoms in 3D space, such as ultrasound technology and magnetic levitation, come with some pretty severe limitations when it comes to movement.
Enter the nano quadcopter drones. "In BitDrones, each drone represents a Catom that can hover anywhere inside a volume of 4m x 4m x 3m in size," the Queen's group writes. "Drones are safe for users, who can walk around the interaction volume and interact with each drone by touch. A drone can be used for input, for output, or for both at the same time."
"Simple atomic information can be displayed by a single drone," they continue, "while more complex 3D data displays can be constructed using several drones, providing the rudiments for a voxel-based 3D modeling system capable of representing sparse 3D graphics in real reality."
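The voxel idea above can be pictured with a short sketch. This is an illustrative assumption, not the researchers' actual code: names like `voxels_to_drones`, the 0.5 m grid resolution, and the `Drone` structure are hypothetical. Each occupied voxel in a sparse 3D model is assigned to one drone, which hovers at that voxel's center to "render" it in real space.

```python
from dataclasses import dataclass

VOXEL_SIZE = 0.5  # meters; assumed grid resolution inside the 4m x 4m x 3m volume


@dataclass
class Drone:
    drone_id: int
    target: tuple  # (x, y, z) hover position in meters


def voxels_to_drones(occupied_voxels):
    """Assign one drone per occupied voxel, hovering at the voxel's center.

    With only a handful of drones available, the model must stay sparse:
    the number of occupied voxels cannot exceed the number of drones.
    """
    drones = []
    for i, (vx, vy, vz) in enumerate(sorted(occupied_voxels)):
        center = ((vx + 0.5) * VOXEL_SIZE,
                  (vy + 0.5) * VOXEL_SIZE,
                  (vz + 0.5) * VOXEL_SIZE)
        drones.append(Drone(drone_id=i, target=center))
    return drones


# A three-voxel vertical "column" rendered by three hovering drones:
column = {(0, 0, 0), (0, 0, 1), (0, 0, 2)}
assignments = voxels_to_drones(column)
```

The sparsity constraint is the crux: a drone-per-voxel display scales in cost with the number of *occupied* voxels, not the volume, which is why the researchers describe it as representing "sparse 3D graphics."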
The system as it currently exists is pretty limited, consisting of only three drones. They're controlled using MultiWii software running on an iMac, while the system tracks markers on the user's hands, allowing for gesture-based input. A persistent problem revealed by the prototype is turbulence: the airflow each drone generates disturbs its neighbors, limiting how closely the drones can operate to one another.
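The touch interaction described above can be sketched as a simple tracking loop. Everything here is an assumption for illustration (the grab radius, the function names, the follow-the-hand behavior); the actual MultiWii-based controller is not public in this article. The idea: motion capture reports positions for both hand markers and drones, and a drone within reach of the hand has its hover setpoint dragged along with it.

```python
import math

GRAB_RADIUS = 0.15  # meters; assumed threshold for counting as a "touch"


def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def update_setpoints(hand_pos, drone_positions, setpoints):
    """Return new hover setpoints for one tracking frame.

    Any drone whose tracked position is within GRAB_RADIUS of the hand
    marker has its setpoint moved to the hand, so it follows the user's
    gesture; all other drones keep their current setpoints.
    """
    new_setpoints = list(setpoints)
    for i, pos in enumerate(drone_positions):
        if distance(hand_pos, pos) <= GRAB_RADIUS:
            new_setpoints[i] = hand_pos  # "dragged" drone tracks the hand
    return new_setpoints
```

Run once per motion-capture frame, this gives the drag-a-voxel interaction the researchers describe, with the flight controller left to fly each drone toward its current setpoint.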
Still, the BitDrones system offers a first step toward Sutherland's "ultimate display" dream. In his 1965 paper, he summarized a goal that persists over a half-century later: "If the task of the display is to serve as a looking-glass into the mathematical wonderland constructed in computer memory, it should serve as many senses as possible."