

    Image: Jawbone

    What Happens to the Data Collected On Us While We Sleep

    Written by

    Meghan Neal

    contributing editor

    It’s only a matter of time until sleep data from a fitness tracker is used to catch someone cheating on their spouse. Or to cause health insurance rates to climb because you’re not getting a healthy amount of rest. Or to reveal to your boss that you were up until 3 AM and are probably hungover at work.

    With the quantified-self trend in vogue and wearables on the rise, an alarming amount of users’ biometric data is being generated and collected, and there’s next to no oversight preventing it from winding up in the hands of the data brokers and advertisers getting rich off your personal information.

    We already know that the major data brokers like Acxiom and Experian collect thousands of pieces of information on nearly every US consumer to paint a detailed personality picture, by tracking the websites we visit and the things we search for and buy. These companies often know sensitive things like our sexual preference or what illnesses we have.

    Now with wearables proliferating (it’s estimated there will be 240 million devices sold by 2019) that profile’s just going to get more detailed: Get ready to add how much body fat you have, when you have sex, how much sleep you get, and all sorts of physiological data into the mix.

    “Whenever there’s information that you’re collecting about yourself and you’re quantifying, there’s a very good chance that it will end up in a profile of you,” Michelle De Mooy, a health privacy expert at the Center for Democracy & Technology, told me.

    This has many privacy and security experts, politicians, and the government wringing their hands, worried that if and when all that granular personal information gets into the hands of advertisers and data brokers, it could be used in ways we never intended or even suspected.

    “Biometric data is perhaps the last ‘missing link’ of personal information collected today,” said Jeffrey Chester, Executive Director of the Center for Digital Democracy.

    “The next great financial windfall for the digital data industry will be our health information, gathered through wearables, swallowable pills and an ever-present Internet of Things,” Chester told me. “Pharma companies, hospitals and advertisers see huge profits in our health information.”

    Medical data is super valuable—a stat tossed around a lot is that it’s worth 10 times more than your credit card number. If your health data leaks beyond Fitbit, Jawbone, Apple or whichever company makes your activity tracker through a security breach, privacy policy loophole, leaky third-party apps, or user sharing, it’s up for sale to the highest bidder.

    Who’s the highest bidder? Advertisers are an obvious choice; the more personal the data, the more it’s worth to sell you targeted ads. Sensors on fitness wearables, smartwatches, or smartphones can monitor your heart rate, breathing, movement during the night, and other physiological signals to tell whether your sleep is disturbed.

    It’s not hard to imagine sleep data leaked from your Fitbit eventually leading to ads for sleeping pills or a better mattress or your local Starbucks. Or more indirectly, “marketers could derive from raised stress levels, poor sleep, and a combination of other behavior that a romance is in trouble,” experts have speculated.

    This info is also of obvious interest to insurance companies, which are starting to partner with fitness tracker makers to monitor users and reward customers who make healthy choices. That’s nice and all, but it can easily be flipped on its head: Data showing poor health practices could lead to higher insurance premiums.

    “You could see that I stay up all hours of the night, and therefore am perhaps a ‘risk,’” said De Mooy.

    Image: Fitbit

    What’s more, workplaces are increasingly teaming up with insurance companies and distributing fitness trackers for employee wellness programs to incentivize preventive health. But in exchange, folks are basically handing all that sensitive data over to their employer, paving the way for potential discrimination if you decide not to participate. (CVS, for one, made employees report their weight, body fat, cholesterol, blood pressure, and blood sugar levels and charged a $50/month premium to those who refused.)

    Beyond that, experts warn that sleep data could be used by burglars (or stalkers) to know when you’re home in bed. It could be used against you in court: For instance, a car insurance company could subpoena your health data to show you didn’t sleep well, or long enough, the night before an accident, one analyst pointed out.


    Monitoring sleep—or lack thereof, rather—could even help prove infidelity. “Fitness bands that help measure your sleep patterns can also reveal other data that most people do not want to reveal,” Andrew Boyd, a health information sciences professor, wrote in an op-ed for Network World. “Who will be the first divorce lawyer to reveal infidelity in court proceedings?”

    “Some of these seem a little far-fetched now, but they’re really not,” said De Mooy. “If [health trackers] become more ubiquitous in our lives, and if these devices start talking to each other, the leakage is inevitable.”

    You may be thinking, but don’t gadgets like Fitbit and Apple Watch promise not to sell or share your personal data, or at least to keep it anonymous? Yes, but that’s not the end of the story, thanks to the vague and ever-changing nature of most privacy policies.

    There are various ways the data can leak out—accidentally or not—and it’s not very hard to tie the information back to you, even with anonymizing software, researchers have found.

    Look at the privacy policies of fitness trackers like Fitbit and Jawbone, and of smartwatches from major firms like Apple, Google, or Microsoft that double as trackers if you download health apps. They’ll usually say something like: We don’t sell your data to third parties unless it’s de-identified. In other words, your name, address, Social Security number, or other identifying info is stripped out. But it’s not very hard to link that anonymized data back to you, explains Timothy Libert, a doctoral student researching health data and privacy.

    “Usually when industry types say ‘anonymous’ they actually mean ‘pseudonymous,’ which means there is a random identifier tied to the data, (e.g. ‘u91u232u390’). However, if there is a second database that links the identifier to a specific person (e.g. ‘u91u232u390: Meghan Neal’) it can be easily re-identified,” Libert told me. “An identifier may be associated with a specific address which is where a person lives, or even geolocation data from your morning jog can be used to figure out who you are—even if it is ‘anonymous.’”
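    Libert’s point can be sketched in a few lines of Python. Everything here is invented for illustration—the identifiers, names, and field names are not from any real tracker’s database—but the mechanics are the same: “de-identified” records keyed by a random identifier become fully identified the moment a second database maps those identifiers to people.

    ```python
    # Hypothetical "de-identified" sleep records: no names, just a
    # pseudonymous identifier attached to each record.
    sleep_records = [
        {"user_id": "u91u232u390", "slept_hours": 4.2, "bedtime": "03:10"},
        {"user_id": "a77x401z118", "slept_hours": 7.9, "bedtime": "22:45"},
    ]

    # A second, separate database (say, an account or marketing list)
    # maps the same identifiers to real people.
    account_db = {
        "u91u232u390": "Meghan Neal",
        "a77x401z118": "Jane Doe",
    }

    # Joining the two on the shared identifier re-identifies every record.
    reidentified = [
        {**record, "name": account_db[record["user_id"]]}
        for record in sleep_records
        if record["user_id"] in account_db
    ]
    ```

    No cryptography is broken and nothing is “hacked”—the join alone undoes the pseudonymization, which is why experts insist pseudonymous is not the same as anonymous.
    
    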

    The more the data is shared among different gadgets and services and people, the easier it is to re-identify. Google almost certainly has enough “anonymous” data to identify users by name, said De Mooy. So does Facebook, which integrates with a lot of wearables and health apps if the user chooses to share.

    The Pew Research Center found that 34 percent of health trackers share their data with someone else. “Anytime you see the word ‘share’ you should be pretty cautious,” said Libert. “Once it goes out the door you have no idea where it eventually ends up.”

    “Another way data leakage happens is inadvertently by users,” said De Mooy. Default sharing settings may make data public unbeknownst to the user, or a user may unknowingly give permission to share their data by agreeing to terms and conditions that can be hard to wade through. Like that time Fitbit users’ sexual activity showed up online without their knowledge.

    Plus, if a company is acquired by another firm or goes bankrupt, your data is back up for grabs, and can be a goldmine. Just look at RadioShack: after going bankrupt, it auctioned off the personal information of millions of customers to the highest bidder. Not to mention biodata is an obvious target for hackers.

    Which is all to say, we’ve yet to see the ways our sleep habits and a mess of other health data might be used beyond our control. Especially given there’s very little oversight regulating health information collected by these gadgets and apps. Unlike the info you give your doctor, data from wearables isn’t protected by the federal Health Insurance Portability and Accountability Act (HIPAA), and these gadgets aren’t regulated by the FDA.

    "Our connected health data isn’t secure or privacy protected," said Chester. "The US is one of the only countries that doesn’t have a comprehensive privacy law, such as what protects people living in the EU. The explosion of health data coming from wearables and other devices that is used for marketing doesn’t have any consumer safeguards."

    There’s no doubt the quantified-self movement can lead to better health, which is all well and good. But where privacy is concerned, many experts are basically just waiting for the other shoe to drop.

    “I hate to be like a digital Debbie downer,” said De Mooy, “but I think it’s fair to say that doing anything like putting a small tracker on your body, you should assume that that information is going to go somewhere out of your control.”

    You’ll Sleep When You’re Dead is Motherboard’s exploration of the future of sleep. Read more stories.