New MIT Tool Boosts Computing's Analog Fringe

Also: what the hell analog computing even is.

Computing is overwhelmingly dominated by digital machines. This is obvious enough that one could feel safe assuming that computing, generally, implies digital information. Transistors, the fundamental hardware unit of nearly every contemporary digital computer, maintain states as binary values of "on" or "off," 1 or 0. This is what it is to be digital: existing as discrete values that either are or are not, with naught in between. It's hard to even conceive of what the alternative to that might be.

But an alternative does exist: analog computing.

The term analog computer probably conjures up images of a computing prehistory dominated by difference engines and slide rules, but that couldn't be further from the truth. In reality, analog computing is a contemporary, useful technology, however esoteric. Its great strength is in modeling natural systems that are defiantly undigital, that can only be described imperfectly by discrete computations. Analog computing offers a solution, but programming an analog machine is a deep challenge in itself. There is no analog JavaScript, to put it mildly.

Computer scientists at MIT have developed a new tool, named Arco, that could make analog computing much more accessible. While analog programming currently means directly configuring and manipulating the computing hardware itself, Arco serves as a compiler layer bridging high-level instructions and low-level circuit specifications. It can take a notoriously difficult class of equations, known as differential equations, and translate them into voltages and current flows on an analog chip.

First off, what exactly do we mean by "analog" here anyway? Essentially, it's a machine built around computations involving continuous values and functions. When it comes to phenomena such as waves, digital computing can only ever make approximations, sampling those waves at regular, discrete intervals. An analog computer takes those same waves and represents them almost literally in computer hardware. If you were representing some equation in an analog machine, you would be able to find a direct mapping between that equation and the physics of the machine itself. This is, I think, kind of mystifying and unlike how we're taught to think of computers.
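
To make the sampling point concrete, here's a toy Python sketch (mine, not from the article or from Arco) showing the two approximations a digital representation imposes on a wave: sampling at discrete moments and quantizing each sample to a finite number of levels. An analog machine avoids both by representing the wave directly in its physics.

```python
import math

# A continuous signal. An analog machine would represent sin(t) directly
# in its physics, e.g. as a voltage varying smoothly through time.
def wave(t):
    return math.sin(t)

SAMPLE_INTERVAL = 0.25  # time between discrete samples
BITS = 4                # quantization depth, exaggerated to make the error visible

def quantize(x, bits):
    """Snap a value in [-1, 1] to the nearest of 2**bits discrete levels."""
    levels = 2 ** bits - 1
    return round((x + 1) / 2 * levels) / levels * 2 - 1

for n in range(8):
    t = n * SAMPLE_INTERVAL
    exact = wave(t)
    print(f"t={t:4.2f}  continuous={exact:+.4f}  digital={quantize(exact, BITS):+.4f}")
```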

The upshot is that variables and the other abstractions we're used to in computer programming are difficult to even think about here, because they don't exist in the analog computing world. Changing the value of some parameter in an analog computation may mean twisting a literal potentiometer (a dial, basically) on a console full of dials representing different values to be computed in different ways. Rather than the 1s and 0s/yeses and nos/is and isn'ts stored by configurations of transistors, we may instead find voltages representing continuously changing values; instead of a voltage reaching a certain threshold and a value flipping from 0 to 1, the value can change from some number between 0 and 1 to some other number between 0 and 1. That is roughly the distinction between discrete mathematics and continuous mathematics.
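
A toy sketch of that last distinction (with an illustrative threshold and voltages, not drawn from any particular hardware): a digital read collapses a voltage to a bit by thresholding, while an analog read keeps the continuous value.

```python
# Digital: a voltage is collapsed to a single bit by comparing it to a threshold.
def digital_read(voltage, threshold=0.5):
    return 1 if voltage >= threshold else 0

# Analog: the voltage itself is the value; nothing in between is thrown away.
def analog_read(voltage):
    return voltage

for v in (0.10, 0.49, 0.51, 0.90):
    print(f"voltage={v:.2f}  digital bit={digital_read(v)}  analog value={analog_read(v):.2f}")
```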

To go a little further, you can imagine an analog computer as a computer that does not have a fixed architecture, a property that much of the computing world is built around. In this sense, an analog computer is by definition not a general-purpose computer, but one that exists for some problem (or physical model) and that problem alone. In fact, the "analog" in the name reflects that such a machine necessarily acts as an analog of the system to be solved or modeled. If there's anything like it in the digital world, it would be dataflow programming tools like Simulink or Pure Data.
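
Classic analog computers were built from op-amp integrator circuits wired together much like blocks in those dataflow tools. As a rough software stand-in (a minimal sketch, assuming crude Euler stepping rather than real circuitry), here's one integrator "block" solving dy/dt = -k*y:

```python
import math

def integrator(dydt, y0, t_end, dt=0.001):
    """Euler stepping: a crude software analog of an op-amp integrator,
    which accumulates its input the way a capacitor accumulates charge."""
    y, t = y0, 0.0
    while t < t_end:
        y += dydt(y) * dt
        t += dt
    return y

k = 2.0
y_final = integrator(lambda y: -k * y, y0=1.0, t_end=1.0)
print(f"y(1.0) ~= {y_final:.4f}  (exact: e**-k = {math.exp(-k):.4f})")
```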

Anyhow, the idea behind Arco is that it's fed differential equations of varying complexity and, in response, it spits out analog computer configurations. Owing to the difficulty of this sort of equation, which broadly describes how dynamic systems change through time, this takes a fair amount of computational effort. With the simplest set of equations the MIT group tested, consisting of just four equations commonly used in biological research, the tool took less than a minute; with 75 equations, that went up to an hour of computation.

Still, an hour is much better than attempting to work things out by hand.
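
For a sense of what those inputs look like: the article doesn't name the four biological equations the group tested, but the Lotka-Volterra predator-prey model below is a textbook example of the kind of coupled, continuous-time system involved. A digital solver (here SciPy's) steps through it numerically, whereas an analog chip compiled by a tool like Arco would embody the equations in its circuitry.

```python
from scipy.integrate import solve_ivp

# Lotka-Volterra predator-prey dynamics: a standard biological ODE system,
# used here only as a stand-in for the equations the MIT group compiled.
def lotka_volterra(t, state):
    a, b, c, d = 1.0, 0.1, 1.5, 0.075  # illustrative rate constants
    prey, predators = state
    return [a * prey - b * prey * predators,
            -c * predators + d * prey * predators]

sol = solve_ivp(lotka_volterra, t_span=(0, 15), y0=[10.0, 5.0], max_step=0.01)
print(f"after 15 time units: prey ~= {sol.y[0, -1]:.2f}, predators ~= {sol.y[1, -1]:.2f}")
```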

"'Digital' is almost synonymous with 'computer' today, but that's actually kind of a shame," notes Adrian Sampson, an assistant professor of computer science at Cornell University, in an MIT news release. "Everybody knows that analog hardware can be incredibly efficient—if we could use it productively."

"This paper is the most promising compiler work I can remember that could let mere mortals program analog computers," Sampson said. "The clever thing they did is to target a kind of problem where analog computing is already known to be a good match—biological simulations—and build a compiler specialized for that case. I hope [the researchers] keep pushing in this direction, to bring the untapped efficiency potential of analog components to more kinds of computing."