This Composer Made Music Out of Gravitational Waves

Arthur Jeffes collaborated with NASA astrophysicists to turn billion-year-old star signals into music.
Jeffes aimed to capture and reinterpret the gravitational waves' different sonic textures. Image: NASA

Since the LIGO announcement last week, gravitational waves—ripples in the curvature of spacetime—have been in the limelight. Now one composer has reimagined the star signals from over a billion years ago as music.

Arthur Jeffes, a London-based composer, has taken the gravitational wave data and turned it into something akin to chill space dub.

Collaborating with Samaya Nissanke—a NASA astrophysicist whose team detected the gravitational waves—Jeffes took the "chirps," the sonic expression of this phenomenon, and crafted his own music around them.

Jeffes, whose friendship with Nissanke goes back several decades, started planning the project with her 18 months ago. "We had this idea where we thought it would be great to work together and model the chirps," Jeffes told me over the phone.

Jeffes has always been fascinated with space sounds. Back in 2012, he worked with Nelly Ben Hayoun, a French designer, to turn the Wow! signal—a strong narrowband radio signal—into music.

This time around, Jeffes took data from the two black holes colliding and used the audio editing software Logic to chop, stretch, and layer his own music over the sounds. The waveforms, according to Jeffes, are like exponential curves that acquire a higher pitch as they peak. To obtain his piano lines, Jeffes took these curves and mapped them into MIDI patterns.

"If you stretch them (the waveform models) out, you get other waves inside them. […] You can get the computer to just track the shape of the waveform—and that's how I was getting all the piano melodies," he said.

In addition to his gravitational wave project, Jeffes is also working with another NASA astrophysicist, Jean-Michel Desert, to create an algorithm that turns data from exoplanets into music.

"We wrote an algorithm where you take eight characteristics of any mean planet—things like the gravity of the surface compared to Earth's, for example," said Jeffes.

"The idea is that when you get a planet that is very like Earth, then the melody the algorithm generates is very nursery rhyme-like with small intervals, but if you get a planet like Jupiter with different characteristics, the sounds will turn out bizarre."

Up next, Jeffes, Nissanke, and Desert will be working with Marshmallow Laser Feast to bring their music to life audio-visually. A virtual reality experience will, for example, allow users to feel like they are directly seeing and hearing the gravitational waves. "We want to make it work on an interactive level," said Jeffes.