Is Computer Security Becoming a Hardware Problem?

The co-author of the TLS standard has some concerns about the complexity of our machines.

In December of 1967 the Silver Bridge collapsed into the Ohio River, killing 46 people. The cause was traced to a single 2.5-millimeter defect in a single steel bar. Some credit the Mothman for the disaster, but to most it was an avoidable engineering failure, and a rebuttal to the design philosophy of substituting high-strength, non-redundant building materials for lower-strength but layered and redundant ones. A partial failure is much better than a complete failure.

In a new piece for the Communications of the ACM, Paul Kocher, chief cryptographer at semiconductor firm Rambus, argues that current computing devices are similarly vulnerable: "Today's computing devices resemble the Silver Bridge, but are much more complicated. They have billions of lines of code, logic gates, and other elements that must work perfectly. Otherwise, adversaries can compromise the system. The individual failure rates of many of these components are small, but aggregate complexity makes vulnerability statistically certain."

To Kocher, this is a scaling problem. While the complexity of our machines increases exponentially, the development of new, reliable security schemes has not kept pace. Instead, security engineers take comfort in the complexity itself: we make claims about the strength of our (weak) digital defenses based simply on not knowing which elements may actually fail. We can build a solid-looking bridge, jump up and down on it a few times, and call it safe, but that isn't nearly the same thing as going through the bridge piece by piece and stress-testing every member individually. To engineers, that kind of exhaustive testing might just look futile.

In 1996, Kocher co-authored the SSL v3.0 protocol, which would become the basis for the TLS standard. TLS is the difference between HTTP and HTTPS, and it provides much of the security that makes the modern internet possible. He argues that, barring some abrupt and unexpected advance in quantum computing or something yet unforeseen, TLS will continue to safeguard the web and do a very good job of it. What he's worried about is hardware: untested linkages in digital bridges.


"Most computing devices running SSL/TLS are riddled with vulnerabilities that allow adversaries to make an end-run around the cryptography," he writes. "For example, errant pointers in device drivers or bugs in CPU memory management units can destroy security for all software on a device. To make progress, we need another building block: simple, high-assurance hardware for secure computation."

What does that actually mean? Kocher's solutions lie in dedicated, quarantined, and simple hardware. If a computer must perform a cryptographic calculation, that calculation should happen somewhere separate, shielded from the complex goings-on of the rest of the system.

In the mid-'90s, Kocher and colleagues developed statistical techniques that crack security protocols by monitoring variations in CPU power consumption or radio-frequency emissions (what's generally known as a side-channel attack). Things haven't improved much.
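Kocher's actual techniques (differential power analysis) are far more sophisticated, but a toy timing leak shows the flavor of a side channel: below is a sketch, assuming a hypothetical password check that exits at the first wrong byte, so the amount of work an attacker can observe grows with the length of the correct prefix. Everything here (`naive_compare`, the secret) is invented for illustration.

```python
import hmac

def naive_compare(secret: bytes, guess: bytes):
    """Byte-by-byte comparison that exits at the first mismatch.
    Returns (match, work), where `work` counts loop iterations --
    a stand-in for the execution time an attacker would measure."""
    work = 0
    for s, g in zip(secret, guess):
        work += 1
        if s != g:
            return False, work
    return len(secret) == len(guess), work

secret = b"hunter2"

# The observable "timing" leaks how much of the guess is correct:
_, w_wrong = naive_compare(secret, b"zzzzzzz")   # wrong first byte -> 1 step
_, w_close = naive_compare(secret, b"huntezz")   # 5 correct bytes  -> 6 steps
assert w_close > w_wrong

# The standard software fix is a constant-time comparison:
assert hmac.compare_digest(secret, b"hunter2")
assert not hmac.compare_digest(secret, b"huntezz")
```

An attacker who can repeat guesses and measure time recovers the secret one byte at a time; Kocher's point is that leaks like this arise from hardware (power draw, RF emissions), where no software patch fully closes them.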

"If the computation is moved to isolated hardware, the private key's security depends only on the logic of a comparatively simple hardware block," Kocher writes. "The amount of security-critical logic is reduced by many orders of magnitude, turning an unsolvable security problem into a reasonably well-bounded one."
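The idea of an isolated hardware block can be sketched in software as a narrow interface: the private key is generated inside and never leaves; the rest of the (large, untrusted) system can only ask for signatures. This is a minimal analogy, not Kocher's design; the class name and API are invented, and Python name-mangling is of course no substitute for physical isolation.

```python
import hmac
import hashlib
import secrets

class IsolatedSigner:
    """Toy software analogue of a dedicated hardware security block:
    key material stays inside; the only interface is sign/verify."""

    def __init__(self):
        # Generated inside the boundary, never exported.
        self.__key = secrets.token_bytes(32)

    def sign(self, message: bytes) -> bytes:
        # HMAC-SHA256 stands in for whatever the hardware implements.
        return hmac.new(self.__key, message, hashlib.sha256).digest()

    def verify(self, message: bytes, tag: bytes) -> bool:
        return hmac.compare_digest(self.sign(message), tag)

hsm = IsolatedSigner()
tag = hsm.sign(b"firmware-update-v1.2")
assert hsm.verify(b"firmware-update-v1.2", tag)
assert not hsm.verify(b"firmware-update-v1.3", tag)
```

The security-critical surface is now just this one small block rather than every driver and memory-management unit on the device, which is exactly the reduction "by many orders of magnitude" Kocher describes.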

The key to making it all work is keeping these new hardware security building blocks simple enough for real-world engineers to fully understand. That is, complexity is itself an inherent danger in crafting tools to fight complexity.

The progression of computing systems is a total crush of new features: more cores, deeper memory architectures, sensors for everything. Every new interaction between elements within a system represents a new security risk, and, as Kocher explains, risks can in the worst case grow as the square of the number of components. A system with 100 components has on the order of 10,000 potential interactions to secure.
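The quadratic growth is easy to check: the number of distinct pairs among n components is n(n-1)/2, which is on the order of n². The helper below is a hypothetical illustration of that count.

```python
from itertools import combinations

def pairwise_interactions(n_components: int) -> int:
    """Count distinct pairs of components; each pair is a potential
    interaction, and each interaction is a potential vulnerability."""
    return sum(1 for _ in combinations(range(n_components), 2))

# n(n-1)/2: doubling the components roughly quadruples the pairs.
assert pairwise_interactions(100) == 4950    # ~half of 100**2
assert pairwise_interactions(200) == 19900   # ~4x the count for 100
```

So each new feature doesn't add one new risk; it adds a risk for every existing component it might touch.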

That's just a mess.