Terraform

A New Study Suggests There Could Have Been Intelligent Life on Earth Before Humans

Looking for aliens across deep space is great, but have we looked hard enough in our own terrestrial backyard—here on Earth?

Becky Ferreira

The artist Surian Soosay offers a playful depiction of 'the Silurian Hypothesis'

One author of the new study, leading climatologist Gavin Schmidt, wrote a work of fiction to explore its findings. Read 'Under the Sun', which we published at Terraform alongside the following piece.


The human yearning to connect with other intelligent life-forms runs deep, and it has become the driving force behind a dazzling range of scientific pursuits. From the SETI Institute’s radio sweeps of the sky, to the discovery of liquid water on neighboring worlds, to the thousands of exoplanets detected over the past two decades, there have been major gains in chasing one of the ultimate cosmic mysteries—whether or not we are alone in the universe.

But in our rush to search for life by peering into deep space, have we overlooked the merits of looking for it in deep time? Earth is the only planet we know for certain can support a technologically advanced species, yet little thought has been given to the possibility that, over its 4.5-billion-year lifespan, our own world might have produced more than one industrialized civilization.

Outside of a few science fiction stories and a speculative paper by Penn State astronomer Jason Wright, the idea that humans are not the first species to build an advanced civilization in the solar system's history has received little serious attention.

“It actually hasn’t been explored that much,” climatologist Gavin Schmidt, director of the NASA Goddard Institute for Space Studies in New York, told me over the phone. “It never gets brought up as a potential thing that you want to look for.”

So, Schmidt paired up with University of Rochester physicist Adam Frank to co-author a paper entitled “The Silurian Hypothesis: Would it be possible to detect an industrial civilization in the geological record?” The hypothesis borrows its “Silurian” title from the fictional reptilian species depicted in the science fiction franchise Doctor Who—these scaly Silurians flourished on Earth many millions of years before the dawn of our own society.

Published this month in the International Journal of Astrobiology, the work outlines what kind of signature a technologically adept species might leave behind. Schmidt and Frank use the projected footprint of the Anthropocene, the current era in which human activity is influencing planetary processes like climate and biodiversity, as a guide for what we might expect from other civilizations.

“There’s lots of things that are going well for [human civilization], but there’s a big price that’s being paid in the ecology and biology,” Schmidt told me. He emphasized that many of these consequences can seem to be “out of sight, out of mind” due to conveniences like sewage infrastructure and garbage relocation. But when considered in totality, anthropogenic activities really add up, and impact the geological record. “All of the waste and footprint is being hidden from us, but it isn’t hidden from the planet,” he said.

It’s unlikely that any massive telltale structures would remain preserved through tens of millions of years of geological activity—that holds true for both human civilization and any potential “Silurian” precursors on Earth.

Instead, Schmidt and Frank propose searching for more subtle signals, such as byproducts of fossil fuel consumption, mass extinction events, plastic pollution, synthetic materials, disrupted sedimentation from agricultural development or deforestation, and radioactive isotopes potentially caused by nuclear detonations.

“You really have to dive into a lot of different fields and pull together exactly what you might see,” Schmidt said. “It involves chemistry, sedimentology, geology, and all these other things. It’s really fascinating.”

In his spare time, Schmidt also wrote a short story called “Under the Sun,” which Motherboard is publishing alongside this article in Terraform, and which dramatizes some of the paper's key ideas. The story follows Stella, an environmental scientist who stumbles on evidence of past intelligent life in sediments dating from the Paleocene–Eocene Thermal Maximum (PETM). This warming period occurred approximately 55 million years ago, when “something triggered a massive shift in the global carbon cycle” and “all the environmental indicators went haywire.”

Stella’s fascination with the PETM reflects Schmidt’s own ruminations about this mysterious period of climate change, when average global temperatures were about 8°C higher than they are today. Some 15 years ago, he was discussing the geological impact of the PETM with colleagues when he realized it would be somewhat analogous to the predicted aftermath of the Anthropocene.

In “Under the Sun,” that connection between the PETM and the Anthropocene is made explicit—it’s the nuclear fingerprint that gives Stella and her colleagues their Eureka moment. Despite the thrill of this discovery, the story hints at the ominous consequences of detecting radioactive fallout from a past society, even as nuclear sabre-rattling continues unchecked in human civilization.

In this way, Schmidt’s paper and his short story both relate the Silurian hypothesis to the Drake equation, astronomer Frank Drake’s probabilistic framework for estimating the number of detectable civilizations in the Milky Way.

One of the equation’s key variables is the length of time that civilizations are capable of transmitting detectable signals. A proposed reason that we have not achieved contact with an alien species is that this “length of time” variable may be extremely short—either because technologically advanced civilizations self-destruct, or because they learn to live sustainably on their home worlds.
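The Drake equation itself is just a product of factors, which makes the role of the longevity term easy to see. The sketch below uses hypothetical placeholder values for every factor (none are from Schmidt and Frank's paper); the point is only how strongly the detectable-lifetime term L dominates the result.

```python
def drake(r_star, f_p, n_e, f_l, f_i, f_c, L):
    """Drake equation: expected number of currently detectable civilizations.

    r_star: rate of star formation in the galaxy (stars/year)
    f_p:    fraction of stars with planets
    n_e:    habitable planets per star that has planets
    f_l:    fraction of habitable planets that develop life
    f_i:    fraction of those that develop intelligent life
    f_c:    fraction of those that produce detectable signals
    L:      years a civilization remains detectable
    """
    return r_star * f_p * n_e * f_l * f_i * f_c * L

# Identical guesses for every factor except L (all values hypothetical):
long_lived = drake(1.0, 0.5, 2, 0.5, 0.1, 0.1, 1_000_000)  # ~5000 civilizations
brief_burst = drake(1.0, 0.5, 2, 0.5, 0.1, 0.1, 200)       # ~1 civilization
```

With these made-up inputs, shrinking L from a million years to a 200-year burst collapses the expected count by the same factor of 5,000, which is the intuition behind the "short detectable window" resolution of the equation.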

“It might be the detectable period of a civilization is much shorter than its actual longevity, because you can’t last a long time doing the kinds of stuff that we’ve been doing,” Schmidt explained. “You either stop, because you’ve messed it all up, or you learn not to do it. Either way, the burst of activity, wastefulness, and massive footprints is actually a very short amount of time.”

“Maybe it’s happened a billion times in the universe,” he added, “but if it only lasted 200 years every time, then you’d never see it.”
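Schmidt's point can be made concrete with a one-line calculation. The 200-year window is his figure; the billion-year span is an assumed round number for how long a habitable planet might host such a burst at some point in its history.

```python
# Back-of-the-envelope: chance that a randomly timed "look" at a planet
# catches a civilization during its brief detectable burst.
detectable_years = 200   # Schmidt's hypothetical burst of activity
habitable_years = 1e9    # assumed span in which that burst could occur

p_catch = detectable_years / habitable_years
print(f"{p_catch:.0e}")  # prints 2e-07
```

A two-in-ten-million chance per planet per glance, which is why even a universe full of short-lived industrial civilizations could look empty.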

The same logic holds for any previous civilizations that may have flourished on Earth, only to collapse in ruin or scale back the activities that threatened their longevity. There are some not-so-subtle lessons humans can take from this forked path, which is, after all, an industrial version of the age-old evolutionary mantra: adapt or die.

That, for Schmidt and Frank, is one of the core themes of the Silurian hypothesis. If we can mull over the possibility that we are not the first Earthlings to have produced a technologically advanced civilization, perhaps we can better appreciate the precariousness of our current situation.

“Our thinking about our place in the universe has been this progressive distancing of ourselves from the study,” Schmidt told me, citing outdated beliefs like the geocentric model of the universe. “It’s like a stepwise retreat from a total self-centered view, and [the Silurian hypothesis] is really just one extra way of doing that.”

“We do need to be objective and open to all sorts of possibilities,” he added, “if we’re going to be able to see what the universe actually has to offer us.”
