
What It Means If Apple Makes an iPhone It 'Can’t Hack'

Apple is said to be working on a clever way around being compelled to hack iPhones.
Janus Rose
New York, US
Image: Shutterstock

Even if the US government forces Apple to help break into the iPhone of San Bernardino shooter Syed Farook, the company reportedly has plans to design its new devices to prevent it from being technically capable of assisting the feds in future cases.

Unfortunately for Apple and its privacy-conscious customers, this change would only go so far: if the company loses its case in court, it would do little to shield companies from being compelled to write software for the government.

According to a report Wednesday night from the New York Times, the company's public spat with the FBI over encryption has accelerated efforts to improve its security even more. If you're just tuning in, Apple is currently resisting a court order that would force it to write new software that bypasses security features in iOS—specifically, mechanisms that wipe the device after 10 unsuccessful passcode entry attempts and introduce delays between each attempt. With those features removed, the FBI would be able to "brute force" the device by trying every possible passcode combination.
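
To see why those two features matter, consider how small the search space is once they are gone: at roughly 80 milliseconds per attempt, a figure Apple has cited for its passcode key derivation, all 10,000 four-digit codes fall in well under 15 minutes. The sketch below is purely illustrative; `tryPasscode` stands in for whatever interface would submit guesses and is not a real iOS API.

```swift
import Foundation

// Illustrative sketch only: with the auto-wipe and escalating delays
// removed, nothing stops an attacker from hammering every combination.
// `tryPasscode` is a hypothetical stand-in, not a real iOS API.
func bruteForceFourDigitPasscode(tryPasscode: (String) -> Bool) -> String? {
    for guess in 0...9999 {
        let candidate = String(format: "%04d", guess)  // "0000" through "9999"
        if tryPasscode(candidate) {
            return candidate
        }
    }
    return nil  // not a four-digit numeric passcode
}
```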

The court order in the San Bernardino case hinges on the government installing a special version of iOS through the iPhone's Device Firmware Update (DFU) mode, a recovery mode that can be reached by holding a combination of buttons while the phone powers on.

The Times report suggests that Apple is now working on a feature that would require a passcode before installing any updates in that mode. That means that a government, cybercriminal, or other entity would be unable to install any software that could disable the device's protections, even if that software was written and signed by Apple itself. For the FBI and its long-standing campaign against strong encryption, it would mean going back to the drawing board.
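
A minimal sketch of the policy the Times describes might look like the following; the types and function names here are invented for illustration. The key change is that even an update image bearing a valid Apple signature would be refused in DFU mode unless the user's passcode is presented first.

```swift
// Hypothetical sketch of a passcode gate on DFU-mode updates.
struct FirmwareImage {
    let payload: Data
    let hasValidAppleSignature: Bool  // assume the signature check happens upstream
}

enum UpdateError: Error {
    case badSignature
    case passcodeRequired
}

func installInDFUMode(_ image: FirmwareImage, enteredPasscode: String?,
                      verifyPasscode: (String) -> Bool) throws {
    guard image.hasValidAppleSignature else { throw UpdateError.badSignature }
    // The key change: Apple's signature alone is no longer sufficient.
    guard let passcode = enteredPasscode, verifyPasscode(passcode) else {
        throw UpdateError.passcodeRequired
    }
    // ... flash the firmware ...
}
```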

There are actually several ways Apple could add the extra passcode protection in firmware on its current devices, according to Jonathan Zdziarski, an iOS forensics expert. But the most comprehensive approach would be to build new hardware that changes how the system boots.

"For example if they encrypted the OS partition in the way I describe on my site then DFU wouldn't be able to mount the disk without the user passcode," Zdziarski told Motherboard in an email. "But if you're going to modify how DFU mode works at its lowest level then that's a boot ROM update."

In other words, Apple would lock itself out from modifying the phone's operating system and disabling security features by changing how the device boots at the hardware level. "Apple is said to be working on unhackable iPhone," is how CNET put it.

Putting aside that it's impossible to create anything that is truly "unhackable," this improvement sounds like a big win for Apple, encryption evangelists, and people who care about privacy. But security experts say that even with those new protections, the consequences of the company losing its case against the FBI would still be catastrophic.

For one thing, the San Bernardino shooter is dead, and these changes would specifically protect phones in similar situations, where the device has been powered off and its owner is dead or otherwise unavailable. If the owner of a device the government wanted to hack were alive, the FBI would have a number of other ways to crack into the phone, and those would become a lot easier if Apple could be forced to help.

If Apple built this new phone, whose software couldn't be modified without a passcode, the FBI couldn't force the company to load new software onto it. But it could orchestrate a highly targeted phishing attack: for example, it could have Apple create malicious software that looked like a normal, official software update, push it to the user's phone, and trick the user into installing it.

This highlights why experts say that Apple losing its court case would be very, very bad news for security—even with these technical improvements. If the government can conscript companies to make malicious updates and cryptographically sign them as legitimate, it would undermine the entire architecture of trust that underpins the software updates people regularly receive on their devices.
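
That architecture of trust reduces, on the device, to a single signature check: anything that verifies against the vendor's baked-in public key installs as genuine. A toy model in Swift's CryptoKit makes the problem concrete; note that a compelled signature verifies exactly like a legitimate one.

```swift
import CryptoKit
import Foundation

// Toy model of update trust. The device ships with the vendor's public
// key baked in; anything signed by the matching private key is "genuine".
let vendorKey = Curve25519.Signing.PrivateKey()  // held (and guarded) by the vendor
let devicePinnedKey = vendorKey.publicKey        // burned into every device

func deviceAcceptsUpdate(_ update: Data, signature: Data) -> Bool {
    // The device cannot tell a routine update from one the vendor was
    // legally compelled to sign: both verify identically.
    devicePinnedKey.isValidSignature(signature, for: update)
}

let maliciousUpdate = Data("disable passcode delays and auto-wipe".utf8)
let compelledSignature = try vendorKey.signature(for: maliciousUpdate)
print(deviceAcceptsUpdate(maliciousUpdate, signature: compelledSignature))  // true
```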

"The reason why people in the security community are freaking out isn't because the government is forcing Apple to write code, it's because they're forcing Apple to sign it," ACLU technologist Chris Soghoian told Motherboard. "It doesn't just affect Apple, it affects every single company that can deliver an update."