Uninformed Tech Regulations Will Either Kill Innovation or Risk Human Lives

The US can't even figure out how to maintain a free and open internet—how's it going to regulate new technologies that let people enhance and clone themselves, create synthetic organisms, and cheat death?

The United States can't even figure out how to make regulations to maintain a free and open internet—how's it going to regulate new technologies that let people enhance and clone themselves, create synthetic organisms, and, perhaps, cheat death?

The reality is that aging politicians and slow-moving legislatures probably won't be able to. In the absence of real rules, countries will increasingly have to rely on the communities of people developing and using future tech to regulate themselves. It has long been true that technology moves faster than culture (and much faster than politicians), but it's one thing when that technology is, say, a tablet computer, and another thing entirely when it's DIY neuroscience.

"For the first time in hundreds of thousands of years, our technologies are not aimed outwards at modifying our environment—increasingly, they're aimed inward, at modifying our minds, our memories, our metabolisms, our personalities, our kids," Joel Garreau, a Future Tense fellow at the New America Foundation, said at a Washington DC discussion about future tech regulations. "If you can do all that, you're in the stunning position of being the first species to take control of your own evolution—not in some distant science fiction future, but right now, on our watch."

And with that comes the unenviable job of figuring out whether it's wise to let humans play with the building blocks of life to make synthetic organisms, whether it's OK for people to implant computers into themselves, and whether it's OK to make designer babies. And, perhaps more immediately pressing, whether it's OK for people to drive with Google Glass or fly drones wherever they want, and whether internet service providers should be allowed to discriminate against certain types of traffic.

"They're talking about spending 5-10 years to regulate technologies that are already 5-10 years old."

"If we're waiting for the House Judiciary to solve our problems, we're toast, because the gap keeps getting wider," Garreau said. "They're talking about spending 5-10 years to regulate technologies that are already 5-10 years old. I see nothing good coming from that … I think that [kind of system of regulation] is doomed."

Garreau says that, instead of making laws at the federal level, we have to start looking at bottom-up self-regulation from the people who know the technology best. Basically, people get into a specific type of technology, enjoy doing it, and don't want to see their hobby and their innovations killed by a government that doesn't fully understand it. So smaller groups, and the hobby as a whole, condemn the few bad actors who behave recklessly.

We're already seeing this type of regulation in the drone hobby world and among groups of people who do DIY biology. In those communities, there are forums, working groups, and, in the case of DIY bio, laboratories that either encourage or, in some cases, require safe practices before one can be admitted. Whenever someone does something stupid with a drone, for instance, I check one of several forums to see how people react. Uniformly, members worry that a few clueless pilots will give the entire hobby a bad name.

Similarly, in DIY bio, most community laboratories teach safe practices and won't allow most people to work with dangerous organisms or perform certain types of experiments.

"You see these ethics and morals evolve fast, from the bottom up—I've got a lot more optimism about the people actually doing it coming up with ways to make sure we don't destroy the human race than I do in the FDA," Garreau said.

We've also seen this with human cloning: Garreau says there has been "so much revulsion to human cloning" that a de facto ban exists even without regulation (though there's been some of that, too).

The question then becomes: is self-regulation enough?

Regulate too harshly, and you kill industries before they even get a chance to start. Stay too hands-off, and we might end up with a country where drone-airliner crashes become common, a country where genetically-engineered athletes are the norm, or a country where synthetic life beats out naturally occurring life in the ecosystem.

Even if these emerging industries and hobbies initially govern themselves well, that self-governance doesn't always scale. Eventually, nascent tech trends become too mainstream, too lucrative, or too scary for Washington to ignore completely. Once politicians look to the "industry" to make formal regulations, they tend to turn to the major players—the Googles and Facebooks and Comcasts of the world—and those companies are looking out for their bottom line.

We've already seen inklings that drone rules might be written by those who dominate the industry, and an FDA e-cigarette rule proposal that sets the barrier to entry so high that only established firms can afford to stay in the marketplace.

It's not hard to imagine a company that makes DIY bio labs expanding throughout the country and then helping to write regulations that say you can only start a new lab if you have a certain amount of money or demonstrated expertise in the field. Or a biohacking chip manufacturer writing rules that prohibit people from modifying themselves unless they use an FDA-approved chip that costs millions of dollars to get approved.

And by the time Congress looks to the industry to help it write actual regulations, it doesn't turn to the hobbyist groups to nail down the legalese; it turns to those bigger players, who often help write the laws themselves. That's why Michael A. Rogers, the New York Times' former "futurist in residence," said it's not always best to reach out to the industry itself when regulating, especially once the industry has matured.

"It's not the little guys [helping write regulation], it's the big guys," Rogers said at the event. "Gigantic lobbying interests are doing their best to keep the internet from being regulated. That's why the best thinking about privacy and data retention is going on in Europe, where they aren't as hindered by [corporate interest]."

Basically, it's hard to say which approach is best, but the current system of regulating individual technologies only after they've become huge isn't going to work. Our culture and our politicians are going to have to find a way to keep up with technology far better than they have been, so they aren't sitting around at Congressional hearings waving an e-cigarette and asking why it has a plug, years after the devices have become popular.

"The question I'm asking," Garreau said," is 'Can we figure out a way to accurate the curve of human response in an identical way that DARPA accelerates things like technology?'"

Maybe it'll take a biohacking chip implant in each of us to do that—if we're allowed to get them, that is.