The Story of FMX, a Wannabe Radio Standard That Was Killed in a Very Public Way

Attempts to improve AM and FM radio technologies tend to land with a thud—a thud no harder felt than with the FMX standard, circa 1989.

A version of this post originally appeared on Tedium, a twice-weekly newsletter that hunts for the end of the long tail.

Recently, the Norwegian government—which has helped the country become a first mover on the electric car front—decided that it wanted to be ahead of another kind of curve.

It was a curve that not everyone was ready for, however. Norway recently decided to drop FM radio, in favor of a modern equivalent, digital audio broadcasting (DAB). This early move comes with a lot of risks—in this case, leaving portions of the Norwegian population without a working radio—as does every other attempt to mess with the technology that drives the airwaves.

Today I'd like to highlight one of those attempts, an effort to improve the FM signal that was taken down in a very public way.


Why attempts at building a better radio standard have failed with surprising consistency

New standards in the radio industry have come up repeatedly over the years, and few of them have stuck.

For example, did you know that there's an existing standard for broadcasting AM stations in stereo? According to Meduci, LLC, a manufacturer of AM stereo receivers, approximately 66 radio stations in the United States support this standard, known in the industry as C-QuAM. The standard has been around for 40 years, since 1977, when it was first published in an IEEE journal, but has only seen take-up more recently as a direct side effect of its inclusion as a feature in HD radios. It's niche, but if you're looking to hear Rush Limbaugh's voice in slightly better fidelity, you can certainly find a way to listen to it if you have the right device.

Perhaps the most fascinating of these attempts to improve the radio signal, however, is that of FMX. Formulated in the late 1980s as an update to the FM standard, it was intended to solve a major weakness with FM that had been lingering since stereo had been added in the early 1960s: When you move toward the edges of the coverage area, stereo reception gets noisy and the sound quality drops sharply.
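To make that stereo-fringe weakness concrete, here's a small Python sketch of FM stereo's sum/difference matrixing. This is a simplified illustration, not the FMX algorithm itself, and the signal and noise values are arbitrary; it just shows why subcarrier noise hits stereo listeners while leaving mono radios untouched.

```python
import random

# Simplified model of FM stereo matrixing (not FMX itself).
# FM stereo sends a sum channel (L+R) that every radio uses, plus a
# difference channel (L-R) on a 38 kHz subcarrier that only stereo
# receivers decode.
random.seed(0)
N = 1000
L = [random.gauss(0, 1) for _ in range(N)]
R = [random.gauss(0, 1) for _ in range(N)]

sum_ch = [l + r for l, r in zip(L, R)]    # baseband: mono and stereo
diff_ch = [l - r for l, r in zip(L, R)]   # subcarrier: stereo only

# At the fringe of coverage, the high-frequency subcarrier picks up far
# more noise than the baseband; model that as noise added to the
# difference channel only.
noise = [0.5 * random.gauss(0, 1) for _ in range(N)]
diff_rx = [d + n for d, n in zip(diff_ch, noise)]

# Stereo decode ("dematrixing") folds the subcarrier noise into both
# speakers; a mono radio ignores diff_ch entirely and never hears it.
L_rx = [(s + d) / 2 for s, d in zip(sum_ch, diff_rx)]
R_rx = [(s - d) / 2 for s, d in zip(sum_ch, diff_rx)]
mono_rx = [s / 2 for s in sum_ch]         # exactly (L+R)/2, noise-free
```

In this model each decoded stereo sample comes out as L plus half the subcarrier noise (and R minus half), while the mono output stays exactly (L+R)/2. That's why fringe-area static was largely a stereo-listener complaint, and it's the gap FMX set out to close.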

As you can probably tell by the fact that it's generally still an annoyance in many vehicles today, FMX failed to solve that problem.

But the reason why it failed to solve that problem is more complicated than saying it didn't work. There were both technical and political issues at play.

The technology, for what it's worth, did have the right folks supporting it: The brainchild of Tom Keller, an engineer with the National Association of Broadcasters, and Emil Torick, who worked in the same role for the CBS Technology Center, FMX was intended to fix stereo's weaknesses in weak-signal areas. The best part? It was backwards compatible. It would reduce noise and improve the fidelity of FM stations for stereos with upgraded equipment, but those with cheap beater radios would still have the same staticky-in-outlying-areas experience that they did before.

An October 1989 Radio-Electronics article that discussed the FMX controversy. Image: Internet Archive

Paul Riismandel, a radio industry observer who co-founded the industry news outlet Radio Survivor, notes that the FMX technology wasn't the first of its kind. For example, Dolby attempted to bring its noise-reduction technology, common in cassette tapes, to radio stations in the 1970s, but its offering was generally ignored due to the fact that proprietary equipment was needed. (Over at the Internet Archive is a sample of what Dolby FM sounded like during a 1978 Minnesota Public Radio broadcast.)

But FMX likely got further than most due to two factors that became apparent in the 1980s: The fuzziness of radio stations at the fringes of their broadcast areas, and the pristine sound of the compact disc, which was becoming popular at the time.

"FMX was a way for radio to compete with this new digital technology and adapt to listener expectations," Riismandel noted.

But the FMX technology proved controversial within the radio industry due to two separate incidents that cost the technology its momentum. The first came about during the 1986 summer edition of the Consumer Electronics Show, which took place in Chicago. In an attempt to show off the technology to tradeshow attendees, the Chicagoland FM station WFMT, which was known for pumping classical tunes on a very powerful FM signal, switched to FMX technology as part of a secret test to demonstrate the technology in the wild.

The station's own engineers weren't even told of the move. It was supposed to go completely unnoticed by anyone except the handful of Summer CES attendees who were checking out the FMX demo.

Problem is, it was very noticeable to regular listeners. Rich Warren, a longtime Chicagoland-area radio pro, reported in the Chicago Tribune that WFMT had received numerous complaints about its signal quality during the week of CES.

"We applaud CBS' attempt at such a system. But as it presently exists there is a compatibility problem," noted the station's chief engineer, Alfred C. Antlitz. "Thus we cannot broadcast with the system. It seems to cause increased multipath distortion for listeners. Perhaps if CBS reduced the system's noise reduction capabilities the problem might be reduced."

Warren, a prominent figure in Chicago radio who to this day hosts the long-running folk music program The Midnight Special on WFMT, appeared to be completely turned off by the hiccup, comparing the technology to Sony's Elcaset flop.

"With any luck, by next January it will be just a memory," Warren wrote.

It almost was, thanks to the November 1986 closure of the CBS Technology Center, the industrial design house that produced FMX, along with numerous other important broadcasting technologies.

But FMX survived the closure, with NAB agreeing to take on the technology and spinning it off, with the help of investors, as Broadcast Technology Partners (BTP). From there, the engineers involved in FMX took the lessons from the experiment's initial stumble and kept improving it, with the goal of ensuring that the WFMT incident was a mere hiccup rather than a long-term problem.

It was actually something else that did FMX in.


"These improvements fail to take hold because their effects are subtle, and require new equipment in order to benefit from them. On top of that, broadcasters bear the cost burden of upgrading their facilities."

Radio Survivor's Paul Riismandel, explaining why upgraded radio technologies such as FMX, AM stereo, or Dolby FM struggle in the radio market. He compares it to a chicken-and-egg problem: "Broadcasters are understandably reluctant to upgrade if listeners don't have compatible equipment, and listeners are reluctant to upgrade if there aren't many enhanced broadcasts to tune in," he noted. In this sense, Norway's decision to rip off the band-aid makes some sense; the closest parallel to this switchover, the US move to drop analog television in favor of digital, required an act of Congress to pull off.

How FMX was basically killed by a single MIT lecture

In early 1989, FMX had improved enough that more than 100 radio stations were genuinely interested in turning it on. But around that time, the technology faced its second major setback, this time closer to a body blow than a mere stumble.

It came thanks to a noted Massachusetts Institute of Technology professor named Amar Bose, who at the time was considering bringing a certain fancy home radio to the market through his namesake company. (Fun fact: I wrote about him recently. What a coincidence!)

Bose had taken an interest in the FMX effort because he was trying to figure out whether his company's forthcoming Wave radios would need to support it. But when he did the math, he concluded that the numbers simply did not check out. Then he ran some real-world tests and was even less impressed.

In January of 1989, Bose shared his findings with the world during an MIT lecture. BTP wasn't happy about Bose speaking out—something the engineering professor alluded to before presenting his research.

From the Feb. 28, 1989 issue of MIT's The Tech.

"I have been threatened with legal action, placing me under heavy personal liability, if I proceed with this lecture," Bose said, according to an April 1989 New York Times article recalling the incident.

The threat from BTP didn't stop Bose or his colleague, William Short. In the lecture, Bose made the case that the technology worked much better in the lab than in the field, and that, if used in the real world, FMX would severely degrade radio reception for all listeners, regardless of whether their radios had FMX decoding installed.

Bose had done his homework, asking BTP for information on the technology they used, getting ahold of said technology, then conducting tests of his own using the campus radio station. And he was ready with a whole lot of data. Radio-Electronics reported in October of 1989 that "those in attendance received a massive document detailing the mathematical modeling, computer simulation of the effects produced by FMX, and a summary of results obtained from actual broadcasting experiments that led to those startling conclusions."

What Bose found was deeply disappointing. But FMX's creators were disappointed too—in Bose's research, that is. BTP's Emil Torick, who was at Bose's presentation, strongly disagreed with the professor's findings and argued the whole situation was an attempt to manipulate the media and tear apart FMX as a technology.

"We feel there is grossly misleading information being presented here which is not a representation of the real world," Torick told UPI at the time.

It was a dramatic scene, with Bose and Short at one point in a showdown with Torick, who criticized the duo for what he called "a grossly misleading conclusion wrapped in what Mr. Bose calls beautiful mathematics."

Rich Warren, the Chicagoland radio personality who winced the first time FMX reared its head, was ready with another takedown piece after Bose's lecture.

"The effects of multipath on an FMX encoded signal are increased distortion, increased noise, stereo soundstage motion, volume level changes and even timbre changes, he concluded," Warren wrote in a Chicago Tribune piece. "I have heard the tapes of the worst-case FMX broadcasts and they sound horrendous. It even adds distortion when listening in mono."

In the months afterwards, BTP tried to defend itself, making the case that the FMX technology had been tested, successfully, in the wild elsewhere, with few complaints from the public. But Bose's argument came at a pivotal time for the technology, and the negative PR in the end killed BTP's chances of taking FMX mainstream.

"At that point we needed more money for additional R&D, but there were limited funds and licensing problems," FMX co-inventor Tom Keller recalled in an article honoring Torick after his 2010 passing. "Even though we had some very successful stations, the system died a slow death."

It's not every day that a single two-hour lecture kills a major industry initiative.


"Standards-setting in broadcast is as much a political process as it is a technical one," Riismandel tells me, and it makes a whole lot of sense.

There are a lot of stories like that of FMX, though not nearly as dramatic or played out on such a public stage.

(One recent example is the fight by broadcasters to convince smartphone-makers to add FM radio capabilities to their devices. Approximately 43 percent of phones sold in the US have built-in FM chips installed but not activated, according to the National Association of Broadcasters. NAB has been making the case in recent years—somewhat in vain in Apple's case—that smartphones should support the ability to listen to FM radio, something they could technically do if it was enabled by manufacturers.)

The politics of the matter usually limit uptake—or, in some cases, encourage it. In recent years, HD Radio gained prominence and uptake in the US after the Federal Communications Commission picked it as the prevailing standard over a number of other technologies, including the DAB standard that Norway is using.

HD Radio, which adds a data layer to traditional AM and FM broadcasts, started slow and has generally stayed that way, but it nonetheless found a niche, thanks in no small part to many modern cars including HD Radio functionality (22.5 million cars had it as of 2015). Even so, it's still failed to move the needle for some. In 2014, for example, one radio-industry researcher told the St. Cloud Times that the technology had seen "no meaningful uptake or adoption."

Nothing against the Norwegians or anyone else attempting to make radio fidelity better, but maybe these attempts to update and/or replace AM and FM radio just aren't worth the trouble.

Maybe radio was meant to be analog all along, in its fuzzy, imperfect form. We have podcasts for everything else.