A NASA model of a pulsar via Wikimedia Commons.
The humble home or office computer—now so commonplace that it's being phased out for the novelty and convenience of tablets—is capable of amazing things. And not just in an "if you brought one back to World War II, you'd be king" sort of way. You see, a network of home and office PCs just surveyed the Milky Way and found 24 pulsars. Stick that up your Google Glass.
The Einstein@Home project combines the computing power of everyday PCs (and, more recently, Android devices) from all around the world, each owned by a volunteer willing to "donate" unused computing cycles to search archival data for the weak astrophysical signals of spinning neutron stars, known as pulsars. Together, the roughly 200,000 participating computers deliver 860 teraflops of computing power, which puts the network at supercomputer speed.
The international team of researchers, led by scientists at the Max Planck Institute for Gravitational Physics in Hannover, employed this mighty cloud to sift through data collected from 1997 to 2001 by CSIRO's Parkes radio telescope in southeastern Australia.
The team was specifically looking for pulsars, the remnants of the explosions of massive stars. These dense, highly magnetized neutron stars rotate rapidly and emit a beam of radio waves along their magnetic axis, like the beam of a lighthouse. Each time the beam sweeps past Earth, the pulsar can be detected as a brief radio pulse.
Six of the 24 pulsars discovered with Einstein@Home are in highly prized binary systems. According to the study, "pulsars in binary systems are of particular interest to the astronomers. That is because these objects allow insights into their formation history and because they can be used as testbeds for Einstein's general theory of relativity." The resulting study was published in The Astrophysical Journal.
Utilizing a bunch of idle civilian computing power is an elegant solution to one of the inherent complexities of so-called "database astronomy."
In astronomy, as in physics, researchers are collecting more information than they have the time or resources to search through. The Large Hadron Collider at CERN produces about one petabyte of data every second during collisions, but researchers have to filter out most of it, so CERN "only" stores some 25 petabytes each year—which is still the equivalent of 1,000 years of DVD-quality video.
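That DVD comparison holds up to a quick back-of-envelope check. Here is a sketch in Python; the 25-petabyte figure comes from the article, while the bitrate of "DVD-quality video" (roughly 2–3 GB per hour) is my assumption, not a number from the source:

```python
# Back-of-envelope: does 25 petabytes per year ≈ 1,000 years of DVD video?
PETABYTE_GB = 1_000_000        # 1 PB = 10^6 GB (decimal units)
stored_pb = 25                 # CERN's stored data per year, per the article
hours_per_year = 365.25 * 24

# Assumption: DVD-quality video at ~2.9 GB per hour of footage
gb_per_hour = 2.9

video_years = stored_pb * PETABYTE_GB / gb_per_hour / hours_per_year
print(f"{video_years:,.0f} years of video")  # on the order of 1,000 years
```

Anywhere in the 2–3 GB/hour range lands the answer near a millennium of continuous video, so the article's comparison is reasonable.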
While NASA’s Kepler spacecraft is no longer operable, scientists have so much data to sort through from the four years of observations that they’re confident the mission’s “most interesting discoveries are still to come.”
Terrestrial radio telescopes, like Parkes and the British radio telescopes used to confirm the findings, generate huge amounts of data too. In April 2012, IBM announced that it was developing computing systems for the Square Kilometre Array, which would be the world's largest radio telescope and would generate one exabyte of data daily—in the realm of the total daily traffic on the entire internet.
Even with this much power, it took Einstein@Home eight months to search through all the data. That seems like a long time—I think my computer's five minutes of booting up feels like a lifetime—until you consider that doing the same task on a single CPU core would have taken about 17,000 years. Further proof that a long time is relative.
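Those two figures imply an effective speedup you can work out in one line. A back-of-envelope sketch in Python, using only the 17,000-year and eight-month numbers from the article:

```python
# Back-of-envelope: how much faster was the volunteer network
# than a single CPU core? Both input figures are from the article.
single_core_years = 17_000   # estimated runtime on one CPU core
network_months = 8           # actual runtime on the Einstein@Home network

network_years = network_months / 12
speedup = single_core_years / network_years
print(f"Effective speedup: {speedup:,.0f}x")  # prints "Effective speedup: 25,500x"
```

Notice that the effective speedup (about 25,500×) is well below the roughly 200,000 machines in the network, which makes sense: volunteer PCs contribute only their idle cycles, and a typical home machine is far slower than a dedicated compute node.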