Greg Chamitoff, Michael Fincke, and Richard Garriott pose with SPHERES. Image: NASA
Hey, so you’ve seen Star Wars: A New Hope. Remember that free-floating training orb that shoots lasers at a blindfolded Luke Skywalker? The robot inspired Luke to use the Force, obviously, but it also left quite an impression on the non-fictional MIT professor David Miller. So much so that in 1999, he told a class of his undergraduate students to build him one.
That is the origin story of the ingenious robots known as SPHERES (Synchronized Position Hold, Engage, Reorient, Experimental Satellites). Miller’s class built five prototypes of the sci-fi-inspired satellites, and in 2006, three of them were launched to the International Space Station.
The orbs are about the size of a volleyball, and propel themselves around the space station using 12 carbon dioxide thrusters. Over the last eight years, they have primarily been used as test beds for emerging technologies, such as automated docking, satellite servicing, repair techniques, and other experiments that require a free-floating platform.
But as practical as these tests have been, the SPHERES program is about to receive a novel upgrade that could dramatically boost its usefulness. No, it’s not a laser gun for Jedi training (we’ll get there one day). It’s the Project Tango prototype Android, cooked up by Google’s Advanced Technology and Projects Division.
As Motherboard’s Joseph Cox reported back in February, Project Tango is a phone and tablet platform that can generate instant 3D models of all types of environments. “The goal of Project Tango is to give mobile devices a human-scale understanding of space and motion,” said Google in its February 2014 announcement of the project.
“What if you could capture the dimensions of your home simply by walking around with your phone before you went furniture shopping?” the announcement mused. “What if you never again found yourself lost in a new building?”
The delivery of two Project Tango smartphones to the ISS last Wednesday raises a new hypothetical: What if you could create free-floating satellites that “knew” their orientation in 3D space? What if the ISS’s astronauts were helped out in their daily tasks by autonomous orbs decked out with advanced mapping techniques?
Interview with Smart SPHERES project manager Chris Provencher. Via NASA/YouTube.
These what-ifs will be ironed out as the new “Smart SPHERES” are tested by the ISS crew. The planned experiment has two phases to be carried out over the coming weeks. The first will be to have the Smart SPHERES generate a complete 3D model of the space station's interior using Project Tango's cameras.
After the map is created and coordinate points are logged, the SPHERES will move around different waypoints in the station, recording their progress and feeding it back to flight controllers at NASA Ames’ Intelligent Robotics Group.
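To make the second phase a little more concrete, here is a minimal, purely illustrative Python sketch of that mapping-then-waypoints loop: a robot steps toward each logged coordinate point and records its progress for controllers. All names here (`WaypointNavigator`, `go_to`, the step and tolerance values) are hypothetical; this is not NASA's or Google's actual software.

```python
# Illustrative sketch only -- names and parameters are invented, not NASA's API.
import math


class WaypointNavigator:
    """Moves toward logged waypoints and records progress for telemetry."""

    def __init__(self, start, step=0.1):
        self.position = list(start)   # (x, y, z) in meters, station frame
        self.step = step              # max distance moved per control cycle
        self.log = []                 # progress record fed back to controllers

    def _distance(self, target):
        return math.dist(self.position, target)

    def go_to(self, target, tolerance=0.05):
        """Step toward the waypoint each cycle until within tolerance."""
        while self._distance(target) > tolerance:
            d = self._distance(target)
            scale = min(self.step, d) / d
            self.position = [p + (t - p) * scale
                             for p, t in zip(self.position, target)]
            self.log.append(tuple(self.position))
        return self.position


# Traverse two waypoints logged from the (hypothetical) 3D map.
waypoints = [(1.0, 0.0, 0.0), (1.0, 1.0, 0.0)]
nav = WaypointNavigator(start=(0.0, 0.0, 0.0))
for wp in waypoints:
    nav.go_to(wp)
```

The real system, of course, closes this loop with the Tango phone's visual odometry rather than perfect position knowledge; the sketch only shows the waypoint-following structure.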
The ultimate goal is to create satellites that can actually take over some of the housekeeping duties of the astronauts, such as video surveys or air quality and flow measurements.
"By connecting a smartphone, we can immediately make SPHERES more intelligent,” said lead engineer DW Wheeler in a NASA statement. “With the smartphone, the SPHERES will have a built-in camera to take pictures and video, sensors to help conduct inspections, a powerful computing unit to make calculations, and a Wi-Fi connection that we will use to transfer data in real-time to the space station and mission control."
In the mere 15 years since David Miller challenged his undergrads to build him a Jedi laser-ball, the concept has evolved into an intelligent helper-bot for the ISS crew. Clearly, Miller’s next challenge should be building a hyperdrive.