
When Transistors Cost a Buck Apiece

The history of computing in a single statistic.
Image: Marcin Wichary/Wikimedia

The story of computing is told through transistors. Armed as we are with one or probably several supercomputers (by the standards of not really all that long ago), whirring along at clock speeds of several billion cycles per second, we're well placed to appreciate the scales involved in transistor evolution. They are a thing to behold; they are the thing to behold. And they're still not good enough.

For the current issue of IEEE Spectrum, which is a whole big celebration of Moore's Law on its 50th anniversary, semiconductor industry analyst Dan Hutcheson summarizes the history of transistors in a quick wallop of staggering statistics. For example, 2014 saw the production of 250 billion billion transistors. Every single second of that year produced, on average, about eight trillion transistors, which is roughly 25 times the number of stars in the Milky Way and 75 times the number of galaxies in the known universe.
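
The per-second figure is just division, and the astronomical comparisons follow from it. Here's a back-of-envelope sketch in Python, assuming rough circa-2015 published estimates for the star and galaxy counts:

    # Back-of-envelope check on Hutcheson's 2014 production figure.
    transistors_2014 = 250e18            # 250 billion billion transistors
    seconds_per_year = 365 * 24 * 3600   # ~31.5 million seconds

    per_second = transistors_2014 / seconds_per_year
    print(f"{per_second:.1e}")           # ~7.9e12: about eight trillion per second

    # Rough astronomical yardsticks (assumed estimates, circa 2015).
    stars_in_milky_way = 3e11            # ~300 billion stars
    galaxies_in_universe = 1e11          # ~100 billion galaxies

    print(per_second / stars_in_milky_way)    # ~26, i.e. "25 times" the stars
    print(per_second / galaxies_in_universe)  # ~79, i.e. "75 times" the galaxies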

Meanwhile, in a Q&A, Carver Mead, the Caltech professor who actually coined "Moore's Law," remembers when transistors cost a buck apiece. "We were working with really cheap transistors that were about a dollar apiece in the stockroom," Mead recalls. "For a student to shell out that for [a device] that might burn out on the first experiment was not easy." So, figure that your computer right now is probably cooking along with a transistor count of around a billion and a half, and you can do the math.
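
If you'd rather not do that math in your head, here's a one-line sketch, taking the article's billion-and-a-half ballpark and Mead's dollar-apiece stockroom price at face value:

    # What a circa-2015 CPU would cost at 1960s stockroom prices.
    transistors_per_chip = 1.5e9   # the article's ballpark transistor count
    dollars_per_transistor = 1.0   # Mead's recollection: a buck apiece

    print(f"${transistors_per_chip * dollars_per_transistor:,.0f}")  # $1,500,000,000

A billion and a half dollars for one chip's worth of transistors, in other words.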

Which is fucking nuts.