The Year in Speculations
Image: Wikimedia

Assessing and discussing the top speculations of 2014.

Before the new year rolls around and we all find ourselves subjected to rampant speculations about What Will Happen in 2015, maybe we should take stock of all the predictions that were bandied about last year.

Science, business, journalism, governance—much of what we do is based primarily on speculation. How people think the world works, what people think other people need and need to know, what people think their lives should look like. Speculation is fundamental to being human.


Of course, in the age of mass media, some of those speculations are apt to get stuffed into the megaphone and broadcast above all others—sometimes for the betterment of society (a hurricane's coming), sometimes for the worsening of it (the future is Google Glass). So, since Motherboard is ultimately a magazine about the future, I thought it'd be worth examining some of the more prominent speculations made over the course of 2014: What better group of folks to guide you through predictions about the future than those who spend a weird amount of time thinking about it?

Without further ado, here's the year in speculations.

Facebook is dead.

This prediction seems to come out every year now. "Facebook will undergo a rapid decline in the coming years, losing 80 percent of its peak user base between 2015 and 2017," according to a study from Princeton's department of mechanical and aerospace engineering that came out in January of 2014.

This study applies patterns of disease epidemics to social networks, looking at metrics like "infectious recovery dynamics such that contact between a recovered and infected member of the population is required for recovery." That is a lovely idea, but like all the other "Facebook is dead" studies, this one seems flimsy in the face of the social network's continued dominance: 1.35 billion active users as of September, which makes Facebook more popular than English.
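For the curious, the paper's model is a riff on the classic SIR equations from epidemiology, with the twist that "recovery" (quitting the network) spreads by contact with people who have already quit. Here's a minimal sketch of that idea in Python; the equations follow the "infectious recovery" setup as I understand it, but the parameters and starting populations below are made up purely for illustration:

```python
# A minimal sketch of the study's "infectious recovery" SIR variant, as I
# understand it: people adopt the network through contact with active users,
# but can only abandon it through contact with users who have already quit.
# All parameters and initial conditions are invented for illustration.
import numpy as np
from scipy.integrate import odeint

def irsir(y, t, beta, nu, N):
    S, I, R = y                      # susceptible, infected (active), recovered (quit)
    adoption = beta * S * I / N      # susceptibles join via contact with active users
    recovery = nu * R * I / N        # active users quit via contact with former users
    return [-adoption, adoption - recovery, recovery]

N = 1.0                              # normalized population
y0 = [0.90, 0.09, 0.01]              # mostly susceptible, some active, a seed of quitters
t = np.linspace(0, 60, 600)
S, I, R = odeint(irsir, y0, t, args=(0.8, 0.5, N)).T

print(f"peak active share: {I.max():.2f}")
print(f"active share at the end: {I[-1]:.2f}")
```

Notice that this setup practically bakes in collapse: seed it with any pool of quitters and the active population eventually withers, which is part of why extrapolating it to an 80 percent decline in real user counts deserves the side-eye.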


-Adrianne Jeffries, managing editor

"The internet of things won't work because things won't work."

This is speculation from John Welsh, a writer at a blog that I have been unable to learn basically anything about, and it didn't get much attention, but I think he hit on something. We've heard so much about the internet of things and what it'll do and how it's going to make everything work better together and how our toasters will talk to our smartphones, which will talk to our cars, which will talk to each other, which will talk to the traffic grid. Except we can't get any of this shit working correctly right now. My phone won't connect to my Bluetooth speaker. My PlayStation has to update every time I want to do anything. My router breaks regularly. How do we expect all this stuff to work together? I think Welsh really nails it here:

"You walk up your front steps with your hands full, furious that the door isn't unlocking. You put your stuff down and enter the 6 digit passcode on your phone that manually overrides the lock. The fridge door is covered in blinking alerts: TURKEY LOW; FRIDGE LEAKING; ERROR MESSAGE 7: UPDATE REQUIRED. You grab the turkey out of the fridge, and as you're eating it, you close out of all the alerts. ERROR MESSAGE 7 seems to be stubborn, so you investigate further, and the fridge is telling you an update is required, because the egg counter only counts white eggs for some reason, but you got brown ones, and this update will fix that. You update the software, but in doing so, it reboots, and you have to manually enter everything that's in the fridge again."


Sounds about right.

-Jason Koebler, staff writer

Illustration: Pixabay

We'll see malicious AI in "10 years at most"

It's spectacular, in that special "watching a train wreck" kind of way, to watch a public figure like Elon Musk mix wild speculation, sensationalism, and a certain eye-sparkling panache into a jug of internet Kool-Aid absolutely irresistible to certain segments of the media. This year, Musk called artificial intelligence development "summoning the demon," which would be rad as hell if it wasn't so off-base.

He also predicted that we'd see strong AI within 10 years. Movie-like AI may indeed be a reality one day in the far, far future, but 10 years is some truly insane shit, especially given that Musk had previously predicted self-driving cars would be ready in five. The terrifying, and more likely, reality is that we don't need to have a real-life Terminator for technology to totally ruin our lives.

-Jordan Pearson, writer

"Actually it's about ethics in gaming journalism."

Oxford Dictionaries defines "speculation" as "the forming of a theory or conjecture without firm evidence," which clearly characterises the attempted justification for GamerGate's hate campaign against women in the gaming community (albeit much more politely than I would have put it). The internet furore was fuelled by conspiracy theory-style speculation that the integrity of gaming journalism had been sullied by the dastardly charms of power-hungry women intent on corrupting the gaming community in pursuit of their feminist agenda, one immorally-obtained non-existent review at a time … or something like that.


The initial allegations didn't hold any water and the whole thing was clearly ridiculous from the start, a shameful bid to excuse misogynistic behaviour, from doxing female game developers to threatening a school shooting at a feminist critic's lecture. It would be laughable, except for the fact that the abuse rationalised by this speculated conspiracy is real.

-Vicki Turk, UK editor

"Content" is dead

It's been lamented for some time now, but 2014 was the year that the pushback against "content"—a word used as a means to an end by folks looking to order 10 "pieces of content," whatever they may be—gained wider traction than ever, and that's good!

It's a word representative of the commodification of journalism, of storytelling, of authenticity, and a dozen other words that also sound cliché thanks to their hijacking by the almighty Lord Content and that hellbeast's many followers. It's all in the pursuit of dragging eyeballs to pages, not for the sake of learning or discussion or community or the wider dissemination of human knowledge, but simply to say that said eyeballs had arrived, with the quality or point of the destination a mere afterthought to be figured out later by the writerly types. This myopic view of content production—of doing journalism, of writing stories, of making comics, or whatever else—is depressing as hell, and I hope it dies forever.

But it hasn't yet! At the end of 2014, the content concept is as alive as ever. Why? Because no one's figured out how to rebrand it. Rebrand, gross as it is, is the appropriate word, I assure you. Just look at some of the folks who pushed hard for the end of "content" this year: the very people who created it in the first place.


Take this Medium post that argues that the word "content" is killing the publishing industry. True! Except who's it written by? A "media entrepreneur and innovation consultant" who says we should replace the content model with buzzy-ass buzzphrases like "capture the power of uniqueness" and "the wrong way is the right way." So instead of thinking about making a certain amount of content, the goal is to make whatever we want, as long as it's different! Stuffing the content pigeon in a different hole isn't real change.

Or here's another post from this year, which also calls for the end of the word "content." Hey, guy, you've got my attention! Let's kill this word, which is rooted in the soulless, empty idea that any old blog post is as good as another, as long as we make enough of them to satisfy a variety of quotas. So what should we do instead?

"We need to review the content strategy for next quarter" becomes, "We need to review the 10 blog posts, five short-form instructional videos, an eBook, and an infographic for next quarter."

This level of details makes the strategist uncomfortable because this description prompts the practical "get it done" people to ask important questions.

I'm actually frowning at my computer right now. The reason the word "content" embodies everything terrible about modern media is *most definitely not* that it isn't specific enough about commodifying editorial products. That goes further in the wrong direction, away from a model that supports journalism and storytelling in a variety of forms—longform, blogging, lists, whatever—because of a fundamental understanding that each story is different and thus requires a specific approach to be most effective. Saying "well, our audience will probably engage most with our brand if we publish 10 blog posts of X length on X days" is asinine because it forces stories to fit a mold, not the other way around. Getting that backwards ends in failure nearly every time.


-Derek Mead, editor-in-chief

"We're finally ready to embrace wearable computing" and Google Glass will become the "must-have tech gadget of the year."

I don't know if this is a "wrong" or a "not yet"—I guess that's true of any prediction about the future—but the Washington Post was confident that "the consumer technology market is finally ready to embrace wearable computing as a full-on trend in 2014" and that Google Glass was en route to becoming the "must-have tech gadget of the year."

Even if Sergey Brin thinks it's emasculating, I'm pretty sure that the must-have tech gadget of the year was the iPhone 6. I haven't seen anyone wearing a smartwatch out in nature, and I'm pretty sure I haven't seen anyone wearing Google Glass since I went to San Francisco in February, and even then it was just a dude in the airport.

I'm sure that wearable tech is eventually going to win out, and maybe in Washington DC's fancier circles it has, but it seems like there are a lot of barriers yet to be overcome—from legal challenges, to being extremely expensive, to looking like a total fucking dick when you wear them. Come to think of it, I can see how they'd be a huge success in Washington DC's fancier circles.

-Benjamin Richmond, contributing editor

We're going to do so much damage to our planet that we have to leave.

Interstellar created a lot of fodder for debate over the film's artistic, scientific, and cultural merits. The world disagreed on whether it was 2001: A Space Odyssey for the modern age and whether it's theoretically possible to travel through a wormhole like that. But most people were willing to accept the premise of the film: that mankind will eventually fuck up the Earth so badly that we'll have to seek a new home on another planet. That's probably because there's already plenty of evidence that that's exactly what we're doing.


-Kaleigh Rogers, staff writer

Science is ready for the prime time again

In the weeks leading up to the reboot of Carl Sagan's Cosmos this spring, there was a lot of speculation about whether a science documentary was capable of mainstream success. Fortunately, the series was one of the most anticipated shows of the year, premiering on March 9 in 181 countries and 45 languages, with an introduction from President Obama. A second season is reportedly in the works now, too.

If the public response to Cosmos is any indication, there is an enormous and largely unmet appetite for science narratives out there. The trend was further corroborated this fall by two high-profile biopics of scientists: The Theory of Everything, about Stephen Hawking, and The Imitation Game, about Alan Turing. The scientist's archetypal "hero's journey" seems to be increasing in popularity, accompanied by a renewed romanticization of science as a whole.

It's not as if this is a 21st-century novelty or anything—public fascination with scientists has been waxing and waning for centuries. Regardless, it does seem like we are entering a new renaissance of mass engagement in scientific inquiry, and not a moment too soon. Case in point: when Kim Kardashian tried to break the internet with her butt last month, she was summarily beaten out by coverage of the Rosetta mission's historic landing on a comet. Well done, everyone. Here's to more diversity in science narratives in 2015.


-Becky Ferreira, contributing editor

"A drone will crash into a plane"

Many, many people have said this. Right now, it's the number one fear associated with drones. And, yeah, it definitely could happen. Maybe next year, maybe in 10 years. The question is: Do we give up on an entire technology because we're scared something might happen?

-Jason Koebler

"If we release a fraction of Arctic carbon, we're fucked"

When I saw a respected, card-carrying climatologist tweet that out in the middle of summer, my jaw dropped. It's hard to get my jaw to do that, because in the years I've been reporting on climate change, weariness and cynicism have fused it shut. When I interviewed the scientist to confirm that he meant what he said, and he did, it dropped again.

Now, there's a lot of debate over how great a problem methane release in the Arctic really is, and a lot of work left to be done to understand how and why it happens. But the fear is that methane, a much more powerful heat-trapping gas than carbon dioxide, will start leaking out of the Arctic as it begins to melt. The more it melts, the more methane will spew, and the hotter it will get, and the more it will melt. Call it the world's most vicious cycle.
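As a cartoon of why that loop worries people, here's a toy iteration with entirely made-up constants; it's a sketch of a positive feedback, not anything resembling a real climate model:

```python
# A toy positive-feedback loop with entirely made-up constants; a cartoon
# of a self-reinforcing cycle, not a climate model.
warming = 1.0                  # arbitrary units of Arctic warming
methane = 0.0                  # cumulative methane released, arbitrary units
MELT_TO_METHANE = 0.5          # hypothetical: methane released per unit of warming
METHANE_TO_WARMING = 0.3       # hypothetical: extra warming per unit of methane

for step in range(10):
    released = MELT_TO_METHANE * warming       # the more it melts, the more methane spews
    methane += released
    warming += METHANE_TO_WARMING * released   # the more methane, the hotter it gets
    print(f"step {step}: warming={warming:.2f}, methane={methane:.2f}")
```

Because each pass through the loop feeds the next, the numbers compound rather than level off.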

Regardless, it was an especially harrowing warning about the heating of the planet's climate from an informed source, in a year full of them.

-Brian Merchant, senior editor