
Intel Bets $16.7 Billion on the Massively Parallel Future of Computing

Why its acquisition of field-programmable chip-maker Altera matters.
Image: gatech.edu

On Monday, Intel completed its largest-ever acquisition, paying $16.7 billion for chip-maker and recent Intel partner Altera. The move, the culmination of seven months of negotiations, represents an all-in bet on what most of us already assume to be the computing future: data, the Internet of Things, intelligent systems. At the heart of this bet is a change-everything chip technology known as the field-programmable gate array (FPGA).

"We will apply Moore's Law to grow today's FPGA business, and we'll invent new products that make amazing experiences of the future possible—experiences like autonomous driving and machine learning," Intel CEO Brian Krzanich said in a statement.

So, what is an FPGA anyhow? Sure, it's a "programmable chip," but so is every other chip, right? I write code that gets compiled into machine instructions specific to a given piece of hardware, and I usually just call that "programming." I have programmed a chip to accomplish a new task. But that's only true in a sense.

There is another level of programming: programming actual hardware. At this level, I, as a typical programmer, don't have much to do. Mostly, I'm not even thinking about the vast arrangements of wires and gates that make a chip do chip stuff, nor would I know where to even begin changing all of that chip stuff and then writing new programs for my new chips. Seems like a good way to have your head explode.

But that's just what an FPGA allows: the retooling of physical hardware via software. The fundamental technology has been around since the mid-1980s (in a primitive form) and it's employed nowadays mostly in high-performance math-heavy roles: radar systems, missile guidance, MRI machines. Here, an FPGA can be very, very fast.

Generally, this speed boost comes via parallelism. Parallel processing is a very different world from the one we're used to, which is based on sequential processing. Our everyday computers step through instructions one at a time, always waiting on the result of some prior calculation before they can do the next one. For example, if a statement is true, the computer will do one thing, and if it's false, the computer will do another. Sequential computing can involve a lot of waiting around.
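
To make the contrast concrete, here's a minimal sketch in Python (our illustration; the transactions scenario and variable names are invented for the example). Every step both branches on and updates the running balance, so no step can begin until the previous one has finished:

```python
# Sequential computing in miniature: each iteration depends on the
# result of the one before it, so the steps cannot be reordered.

transactions = [100, -20, 35, -50]

balance = 0
for amount in transactions:
    if balance + amount < 0:   # if the statement is true, do one thing...
        print("declined:", amount)
    else:                      # ...and if it's false, do another
        balance += amount      # this result feeds the next iteration

print("final balance:", balance)  # prints: final balance: 65
```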

Image: llnl.gov

Parallel computing, not so much. Here, many pieces of data can be computed at one time, so long as they're not interdependent. A classic example is image editing, where software might apply a single change to all of the thousands of pixels in an image at once. Because the operation is applied to each pixel independently, it makes sense to do them all simultaneously. This is the whole point of a GPU: massively parallel computation.
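
As a rough sketch of the idea (ours, using NumPy on a CPU rather than an actual GPU), the brightening step below is written as one whole-array operation. Because no pixel depends on any other, the same work could be handed off to thousands of parallel processing elements:

```python
import numpy as np

# A fake 1080p grayscale image: ~2 million independent pixel values.
image = np.random.randint(0, 200, size=(1080, 1920), dtype=np.uint16)

# Brighten every pixel at once; no pixel's result depends on another's.
brightened = image + 50

# The sequential equivalent visits one pixel per step:
# for y in range(1080):
#     for x in range(1920):
#         brightened[y, x] = image[y, x] + 50
```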

An FPGA is, as the name would imply, a big old array of gates: AND gates, NOT gates, XOR gates… the whole family. Picture a line of digital turnstiles stretching around the entire globe. These gates are designed to do something to a piece or pieces of data, and they're able to do it all at once. This is in contrast to a normal sequential processor, which would instead iterate through that array of gates, doing one operation at a time.
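
For a toy version of that contrast (our sketch, not anything FPGA-specific), a bitwise operation on an 8-bit word acts like eight gates switching at once, while the loop below plays the sequential processor, visiting one "gate" per step:

```python
a = 0b10110100
b = 0b11010110

# Eight gates "firing" at once, one per bit position:
and_out = a & b          # eight AND gates
xor_out = a ^ b          # eight XOR gates
not_out = ~a & 0xFF      # eight NOT gates (masked back to 8 bits)

# The sequential caricature: one gate per loop iteration.
and_seq = 0
for i in range(8):
    bit = ((a >> i) & 1) & ((b >> i) & 1)
    and_seq |= bit << i

assert and_seq == and_out  # same answer, eight steps instead of one
print(f"AND={and_out:08b} XOR={xor_out:08b} NOT={not_out:08b}")
```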

So: the parallel FPGA just takes a single step through the whole line of gates, while the sequential processor has to walk around the whole globe, going one turnstile at a time.

The present is prime time for parallel computation, so Intel's move makes all kinds of sense. (It had already been collaborating with Altera.) For one thing, FPGAs can handle a tremendous number of input/output operations at once, sometimes across thousands of I/O pins on a single chip, which is good news for Internet of Things applications. It's also good news for processing large amounts of information generally, and if the near future of computing promises anything, it's lots and lots of data about absolutely everything, from stunning video game worlds to the finest structures of deep space.