Apple Health Data Is Being Used as Evidence in a Rape and Murder Investigation

German authorities cracked a man's iPhone and found out what he was up to.

Samantha Cole

One of the most important witnesses to the rape and homicide of a 19-year-old woman in Germany might be a stock app on the iPhone of her alleged murderer.

Hussein K., an Afghan refugee in Freiburg, has been on trial since September for allegedly raping and murdering a student there and disposing of her body in a river. But many of the details of the trial have been hazy—no one can agree on his real age, and most notably, there’s a mysterious chunk of time missing from the geodata and surveillance video analysis of his whereabouts at the time of the crime.

He refused to give authorities the passcode to his iPhone, but investigators hired a Munich company (which one is not publicly known) to gain access to his device, according to German news outlet Welt. They searched through Apple’s Health app, which was added to all iPhones with the release of iOS 8 in 2014, and were able to gain more data about what he was doing that day. The app records how many steps a user takes and what kind of activity they are doing throughout the day.

The app recorded a portion of his activity as “climbing stairs,” which authorities were able to correlate with the time he would have dragged his victim down the river embankment and then climbed back up. Freiburg police sent an investigator to the scene to replicate his movements, and sure enough, the investigator’s Health app activity matched what was recorded on the defendant’s phone.

I asked Michael Kwet and Sean O’Brien, both researchers at Yale Privacy Lab who have previously written on the topic of privacy and health apps for Motherboard, whether we should expect more of these kinds of cases—where someone’s own phone essentially testifies against them—in the US.

“Yes,” O’Brien said in an email. “Digital evidence is already more common in law enforcement, not only metrics from apps but also facial recognition, recordings from smart speakers, and, of course, smart devices with cameras.”

Kwet added that a study by the nonprofit think tank RAND Corporation found that data culled from fitness trackers, smartphones, and other personal devices is likely to be used in criminal investigations, and that the legal system is ill-equipped to handle these cases.

“I believe we will see more of this as time goes on,” Kwet said in an email. “Police forces are enthusiastic about intelligence-based policing. People fear crime, and police will claim they need to gather as much evidence as they can to solve criminal investigations, now that the data is recorded.” Just a few months ago, in October, a man was charged with murder partially based on evidence gathered from his wife’s Fitbit.

There aren’t any US laws specifically addressing passwords or key disclosure, O’Brien said, but there is recent case law affirming an individual's right not to hand over a password. In United States v. Doe, for example, a court found that a man imprisoned for refusing to decrypt data on several devices was protected by the Fifth Amendment.

Apple says Health app data is only stored on the device, but O’Brien says that this kind of data is even more easily accessed when it’s stored in the cloud.

“In my opinion, the creators and distributors of software should, first and foremost, have a responsibility to their users,” O’Brien said. “When and where they should hand over data to courts is a more complex question. It would be much better, in my view, not to collect such surveillance data at all. Such data is best kept locally on devices whenever possible. If it is collected, those who handle it have a deep responsibility to defend the privacy of their users.”

And even if the law catches up with these data-collecting apps, and the courts strike down the use of such data as evidence, Big Data surveillance is the real problem, Kwet said. Rules and regulations change, and what’s protected today may not be tomorrow. “It may be a tough sell, but companies should restrict data collection to the absolute minimum, refrain from Big Data personalization, and instead provide value to customers with products that truly protect their privacy.”