

Do Health Apps Need Government Oversight?

Or, rather, why wouldn't they?

What we're supposed to think is that health apps are empowering. No longer do patients have to consult fancy "doctors" to get diagnoses; we can just click an app and it'll read our blood pressure and other vitals or, hell, even do a quick sonogram or blood panel.

We won't ever again be faced with the slight frown of a "professional" as they perform some subjective computation in their head about our well-being. An app has data; doctors have, what, only 15 years of education, internships, and residency training? Data is never hurried; data doesn't forget; data is in your pocket right now. It's not that easy, though, and a compelling new commentary in the New England Journal of Medicine calls for "more robust" FDA oversight of the health app (mHealth) market.


What we're supposed to think about a whole lot of internet-based technology is that, generally, the days of experts are over. From software development to electrical engineering to journalism, everybody can do everything, more or less immediately, given the right information delivered quickly enough (i.e. via the internet). It's a marketing pitch more than a technological consequence, of course, but you have to admit it's a rather appealing idea, particularly given the increasingly stretched-thin American health system.

But there's another perspective on health apps and anti-expert culture in general: the implied self-empowerment of such technologies is actually just the empowerment of some app maker. Matching a set of diagnostic readings against a database is not trading the ambiguity of that doctor's frown for some more perfect, quantitative criterion. It is instead pretending that the ambiguity isn't real, that medicine is as precise as computer code. It isn't, not really (which, as a computer science person, pains me to say), despite the exhortations of venture capitalist Vinod Khosla.

Throwing a bunch of information into the black box of a health app is riskier than it might seem, given the fictitious "certainties" of apps and the fact that those certainties are currently largely unregulated, at least in the United States. As it stands now, I could whip up some app that takes a heart rate reading (or really whatever) and returns just about any result I please. I'd be particularly inclined to deliver results that could draw you into whatever I might also be selling: pills, supplements, devices, actual snake oil.
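To make that concrete, here's a deliberately crude, purely hypothetical sketch (invented names and thresholds, not drawn from the NEJM paper or any real app) of how little logic it takes for an unregulated "diagnostic" app to turn a reading into confident-sounding advice that conveniently points back at whatever the developer is selling:

```python
# Hypothetical sketch only: arbitrary thresholds, no clinical validation.
def recommend(heart_rate_bpm: float) -> str:
    """Map a heart rate reading to whatever advice the developer prefers."""
    if heart_rate_bpm > 100:
        # Nothing stops the "diagnosis" from doubling as a sales funnel.
        return "Elevated heart rate detected. Try our CalmTonic supplement!"
    if heart_rate_bpm < 50:
        return "Low heart rate. Consider our premium vitality capsules."
    return "You're in great shape. No need to see a doctor."

if __name__ == "__main__":
    print(recommend(112))  # arbitrary output, zero oversight
```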


The NEJM report found that, of the roughly 100,000 mHealth apps on the market, only 100 had received any sort of FDA blessing. As such, the market is largely "a Wild West," according to lead author Nathan Cortez, a Southern Methodist University law professor.

A lot of things are a Wild West on the internet, though its recent libertarian honeymoon is waning as cities and government agencies look to crack down on things like Airbnb and Uber. Despite the expected right-wing legislative rallying cries against increased mHealth oversight, including proposed bills banning regulation of "clinical software" (legislation obviously made with only the health and safety of constituents in mind), it's plain that mHealth tools carry risk, arguably even more than their ride- and couch-sharing kin.

Cortez et al. point to the 2012 recall of a diabetes app that miscalculated insulin doses, but much of the potential harm from unregulated apps could well remain out of sight. I'm not quite sure how much motivation a patient delivered to the ER in diabetic shock is likely to have for telling their doctor that they were following the commands of something they just downloaded because it had four stars.

Part of the problem is that, among 100,000 health apps, it becomes quite easy to pick and choose whatever health reality you'd like. If one app or website says you should probably get that lump checked out, and that's scary, there's probably another one out there that will say the opposite: just chill and buy some vitamins, especially if they're "brand x" vitamins.


"A bewildering array of mHealth products can make it difficult for individual patients or doctors to evaluate their quality or utility," the NEJM paper notes. "Increasing reliance on mHealth raises questions about compromised patient privacy, the cross-jurisdictional practice of medicine, and legal liability for injuries. Serious mistakes with an mHealth product might affect thousands of patients at a time and often without ready mechanisms for detection and correction."

It should be noted that, overall, the commentary is very much in favor of mHealth technology. Quick, cheap access to diagnostic tools that would formerly have taken a day at a health center and cost thousands of dollars is a very real and amazing possibility. So too is being able to compare the results of those tools across vast amounts of new and dynamic data. That can't really be overstated.

But the lack of regulation puts those possibilities in question, leaving patients and doctors without good answers as to what's really safe and effective; patients end up either unable to take advantage of good technologies or subject to the sketchiness of health care in the Wild West.

The key proposal from Cortez et al. is to regulate apps that inform the "diagnostic-decision process," i.e. apps that tell patients and doctors what actual health care actions to take. If an app is telling a patient whether or not to do something potentially dangerous, it should have to be evaluated by the FDA first.

That seems reasonable, though at the same time it's easy enough to imagine a world of disclaimers parallel to those found on supplements: This app has not been evaluated by the FDA, so don't blame us if you die.