
What Binary-Based Money Would Look Like

The decimal world is a relic. Let's go binary.

Computers haven't really changed how we count. While our machines use a simple, logical counting system consisting of a mere two values (1 or 0, true or false), we still think in terms of tens. Our numbers consist of 10 numerals (including 0), with which we build bigger numbers using different "slots." In the right-hand slot, a numeral gets multiplied by just 1, and moving further left, we multiply by 10, 100, 1,000, and so forth.


That's all our numbers are: an array of numerals with each having a value of that numeral multiplied by a different power of 10. For, say, 666: we multiply 6 by 1, 6 by 10, and then 6 by 100, and we have a number. This is beyond intuitive, of course, but if we were to sort out, say, a hexadecimal number, which is based on multiplying 16 different numerals by different powers of 16, the actual guts of counting become more apparent for the simple reason that, for most of us, we need to do more than just glance at a hex number to get its value. We have to do the whole process of multiplying and adding, multiplying and adding.
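To make that multiply-and-add process concrete, here's a quick Python sketch (my own illustration, nothing from the original piece) that expands a list of digits in any base. Decimal 666, its hex spelling 29A, and its binary spelling 1010011010 all come out to the same value:

```python
def expand(digits, base):
    """Multiply-and-add: each digit times the base raised to its slot,
    rightmost slot being 0, then sum everything up."""
    total = 0
    for slot, digit in enumerate(reversed(digits)):
        total += digit * base ** slot
    return total

print(expand([6, 6, 6], 10))                      # 666 = 6*100 + 6*10 + 6*1
print(expand([2, 9, 10], 16))                     # 666 = 2*256 + 9*16 + 10*1 (hex 29A)
print(expand([1, 0, 1, 0, 0, 1, 1, 0, 1, 0], 2))  # 666 in binary
```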

Someone who'd only ever known a hexadecimal counting system—maybe from some other planet where intelligent life comes with 16 fingers instead of ten—would find converting into our everyday decimal system just as confusing. Computers don't have much use for our decimal numbers, however, and instead use hex and binary to store values: binary for obvious reasons, but hexadecimal because it turns out to be super-easy to convert between binary and hex. So, given our digital world, should we then expect a slow migration away from decimal?
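The reason binary-to-hex conversion is so painless is that each hex digit stands for exactly four bits, so it's a lookup rather than arithmetic. A rough sketch of my own:

```python
n = 0x29A                  # 666 written in hex
print(bin(n))              # 0b1010011010
print(hex(0b1010011010))   # 0x29a
# Digit by digit: 2 -> 0010, 9 -> 1001, A -> 1010, i.e. 0010 1001 1010
```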

Money seems like a reasonable place to start in our abandonment of decimal numbers.

Supposedly, decimal numbers became the preferred numbers because we have ten fingers and fingers are the human body's built-in calculator. There isn't a great reason to stick with decimal numbers otherwise, is there? Besides intuition, of course. I'm leery of the weird push for programming to be regarded as a basic or fundamental skill, like reading or algebra, but people should at least be able to sort out a hex or binary number. That seems reasonable for a world in which elementary school students are required to sport laptops and/or tablets to participate in basic learning.


What if we started with money? Money is decimal, mostly. Even as more and more of our everyday currency migrates to the digital, virtual world, we still tend to think in terms of 10s and 100s and 1,000s. Some University of Utah math students at the blog 3010tangents recently considered an alternative civilization where currency is instead binary, just 1s and 0s. Given that the vast majority of our numerical experience has to do with money, it would seem like a reasonable place to start in our abandonment of decimal numbers.

The Book of Mormon describes such a currency system, one based on binary. Start with a single dollar bill, and double the value as denominations increase. So, we'd have a one dollar bill, a two dollar bill, four dollars, eight dollars, sixteen dollars, and so forth. Similarly, for values less than a dollar, we'd have a half-dollar piece, a quarter-dollar piece, and an eighth-dollar piece. You have to admit that the eighth-dollar makes a bit more sense than our current dime. We halve our way down to quarters, but then suddenly switch it up and drop down from one-quarter of a dollar to one-tenth?
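Spelled out, those denominations are nothing but powers of two, from the eighth-dollar piece on up. A small sketch of my own, using Python's Fraction type so the eighths stay exact:

```python
from fractions import Fraction

# Halve below a dollar, double above it: every denomination is a power of two.
denominations = [Fraction(2) ** k for k in range(-3, 5)]
print([str(d) for d in denominations])
# ['1/8', '1/4', '1/2', '1', '2', '4', '8', '16']
```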

"Now based on this, we might say that the system as it stands is a bit inflexible, unable to go to very high numbers, as it maxes out at roughly sixteen dollars, and hits a minimum at about 12.5 cents," the post notes, "but with a few tweaks on our part, applying the same pattern, we can achieve a wide array of numbers and a very intriguing property."

That intriguing property is the ability to simply calculate amounts based just on denominations. The process is the same as converting a binary number to decimal, but without the actual conversion. Say you have a pile of coins in your hand and you need to count some amount out of it, to pay for something or make change. The first denomination you plop down is the highest-value coin (or bill) that doesn't exceed the total you're trying to give. If the total were, say, $70 in decimal terms, you'd first hand over a $64 piece. You now owe $6, so you again find the largest piece that fits under $6. That's $4, and so you owe $2. Toss a $2 coin on the pile. As a binary sequence, it's 1000110.
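In code, that plop-down-the-biggest-piece routine is the usual greedy decomposition into powers of two, which is exactly how you'd read off the binary digits. A hypothetical sketch of my own, not the students' code:

```python
def make_change(amount):
    """Pay out an amount using power-of-two denominations,
    largest piece first. Each denomination gets used at most once,
    so the result is just the amount's binary representation."""
    pieces = []
    denomination = 64                  # biggest piece in this example
    while denomination >= 1:
        if denomination <= amount:     # plop down the biggest piece that fits
            pieces.append(denomination)
            amount -= denomination
        denomination //= 2
    return pieces

print(make_change(70))   # [64, 4, 2]
print(bin(70))           # 0b1000110 -- same ones, same places
```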

This process seems rather ungainly mostly because we have to look at it, and think about it, in decimal terms. If binary counting were more intuitive, we'd just do it, and we might also find that the whole counting process works more naturally. If you look closely, you'll notice that there's no actual counting in our process. Assuming you have every denomination on hand, no more than one of any denomination is ever used.

That's the cleverness of binary: the whole system never needs anything other than a 1 (and the absence of a 1, or 0) to represent a value. The whole world of numbers is just right there as a series of statements, each saying whether something exists or doesn't. No counting, just is or isn't. That's perfect.