Happy 200th to George Boole, the Grandfather of Digital Logic

Almost a century before computers, Boole realized the universe can be computed.
Image: Michael Coghlan/Flickr

George Boole gets left out of most timelines of computer history. Babbage, Lovelace, Turing—sure, they're all there. Only rarely does the grandfather of digital logic get his due. Perhaps it's because his creation just seems like nature itself, a fundamental feature of information and so of the universe. Boole is a discoverer among inventors.

Computer scientists get acquainted with Boole and his namesake early on. Boolean, or "bool," is a data type built into most programming languages. It can hold one of two values: true or false. Nothing else. A Boolean might be a variable holding that true or false, or a function or subroutine, a block of code we can ask to spit out either a true or false result. Booleans are especially helpful as flags: switches that tell a program to start or stop doing something.
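
Here's what that looks like in practice, as a minimal Python sketch (the names are illustrative, not from any particular program):

    # A Boolean-returning function: code we can ask for a true-or-false answer.
    def is_even(n: int) -> bool:
        return n % 2 == 0

    running = True  # a flag: a switch that tells the loop to keep going
    count = 0
    while running:
        count += 1
        if count >= 3:
            running = False  # flip the switch and the loop stops

    print(is_even(count))  # False, since count ended at 3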

But there is so much more to Boole's new world than a high-level programming data type. Boolean logic is digital logic. A system of mathematics based on trues and falses is one of yeses and nos and, ultimately, of 1s and 0s. There and not there. This is computing.

Formal logic itself existed before Boole, of course; its founder is usually taken to be Aristotle. Much later, in the 1600s, Gottfried Wilhelm Leibniz took the idea further, arguing that logical thought could be reduced to symbols and therefore examined as a computational system. If geometric problems can be solved using symbols and numbers, Leibniz believed, then all reasoning could be reduced the same way.

Leibniz forms a bridge of sorts between Aristotle and, centuries later, Boole. Boole's idea was, similarly, that if logical arguments and the relations between them can be reduced to symbols and rules, as in proper algebra, then a whole universe of thought could be computed mathematically. The result is what became known as Boolean algebra: a set of symbols and connectives allowing for the simplification of logical expressions. Formal truth-finding.
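
One standard example of that simplification, offered here as a sketch rather than anything from Boole's own text, is De Morgan's law, which rewrites NOT (A AND B) as (NOT A) OR (NOT B). A few lines of Python can confirm it by checking every input:

    # Verify De Morgan's law on all combinations of two Boolean inputs.
    from itertools import product

    for a, b in product([True, False], repeat=2):
        assert (not (a and b)) == ((not a) or (not b))
    print("NOT (A AND B) = (NOT A) OR (NOT B) for every input")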

As plus and minus signs are fundamental to arithmetic, Boolean algebra has the operations AND, OR, and NOT (to start). Each of these operations takes one or two Boolean values (trues and/or falses) and spits back a single true or false in response. Computing "true AND true" results in true, for example, while NOT simply returns the opposite, or inverse, of its input: true becomes false and false becomes true.
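
Most programming languages expose these operations directly; in Python, for instance:

    print(True and True)   # True: AND is true only when both inputs are true
    print(True and False)  # False
    print(True or False)   # True: OR is true when at least one input is true
    print(not True)        # False: NOT inverts its single input
    print(not False)       # True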

All of this translates immediately to switches and gates within a physical computer. A NOT gate takes a 1 bit and outputs a 0; an OR gate takes in a 0 and a 1, and outputs a 1.
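
One way to picture that hardware view is to model gates as Python functions over bits, an illustration rather than a circuit simulator:

    # Gates as functions over bits: 1 for a high signal, 0 for a low one.
    def NOT(a):    return 1 - a
    def AND(a, b): return a & b
    def OR(a, b):  return a | b

    print(NOT(1))    # 0: a NOT gate takes a 1 bit and outputs a 0
    print(OR(0, 1))  # 1: an OR gate takes in a 0 and a 1, and outputs a 1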

The relationships are all defined in truth tables, which you can see below.

Image: SparkFun
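
The same tables are easy to generate in code; here is one short Python version, with 1 standing for true and 0 for false:

    # Print the truth tables for AND, OR, and NOT over bits.
    from itertools import product

    print(" A  B | AND | OR")
    for a, b in product([1, 0], repeat=2):
        print(f" {a}  {b} |  {a & b}  | {a | b}")

    print(" A | NOT")
    for a in [1, 0]:
        print(f" {a} |  {1 - a}")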

You can see Boole's accomplishment in most any circuit involving computation:

An elevator control circuit. Image: Cameron Steiger/Penn State

In his 1847 paper "The Mathematical Analysis of Logic," Boole wrote: "What may be the final estimate of the value of the system, I have neither the wish nor the right to anticipate. The estimation of a theory is not simply determined by its truth. It also depends upon the importance of its subject, and the extent of its applications; beyond which something must still be left to the arbitrariness of human Opinion."

The applications of Boolean logic wouldn't be realized until almost a century later. At MIT in 1937, the mathematician and engineer Claude Shannon showed that Boole's system could serve as the basis for electrical switching circuits, like the one in the diagram above. That insight became the foundation of digital circuit design and, soon enough, of computing itself; Shannon went on to found information theory a decade later.

Boolean algebra breaks down in quantum computing, for several reasons, though not completely. Many of the algebraic rules of quantum logic remain the same, but there are some pretty ugly differences. Boole won't be obsolete anytime soon, but his system may wind up looking quite different.