On the eve of the House of Representatives’ forwarding of articles of impeachment to the Senate, President Donald Trump took time to attack Apple. The president’s outburst on Twitter appears to be about the FBI’s inability to get access to the physical storage on two iPhones connected to last month’s killings at Naval Air Station Pensacola in Florida. And it is the latest ratcheting up of rhetoric from the Trump administration on device encryption.
The FBI believes the phones belonged to Mohammed Saeed Alshamrani, the Saudi Air Force officer suspected of shooting three members of the US Navy in December. Alshamrani was shot and killed by law enforcement, and the devices he left behind were locked.
But an Apple spokesperson said that the company had turned over the contents of the phones' cloud backups to investigators within hours of the shooting, and Apple executives believed the FBI was satisfied with that response until the bureau came back a week ago to ask for additional assistance. It is not clear that Apple has refused that request, but the company has resisted building a way for the government to break device encryption in the past, out of concern that doing so would weaken the protection its law-abiding customers rely on to keep personal data safe on stolen or otherwise targeted devices.
Trump claimed in a post to Twitter that Apple “refuse to unlock phones used by killers, drug dealers, and other violent criminal elements. They will have to step up to the plate and help our great Country, NOW!”
Trump's digital declaration comes on the heels of a similar claim from Attorney General William Barr that Apple had provided "no substantial assistance" in unlocking the phones in the Pensacola case. Barr had previously pilloried Facebook over its plans to make end-to-end encryption the default across the company's messaging products, invoking the threat of child pornography "going dark" to pressure the tech industry into providing encryption backdoors.
Last month, during a hearing on encryption, the leadership of the Senate Judiciary Committee made clear to representatives of Apple and Facebook that the committee was losing patience with the companies' failure to provide warranted "exceptional access" to encrypted data. Senator Lindsey Graham (R-S.C.) went so far as to threaten, "You're gonna find a way to do this or we're going to do it for you."
The Pensacola shooting is now being lumped in with the 2015 San Bernardino case, in which the Justice Department sued Apple for access to a locked iPhone, only to back off after investigators found the password by other means and a vendor provided the tools necessary to hack the phone. Both cases are being cited as reasons Apple should give the government on-demand access to locked devices.
The San Bernardino case was significantly different in that the FBI’s attempts to retrieve cloud data from the suspect’s county-issued iPhone resulted in the device disconnecting from the cloud. This made retrieving a recent backup of the phone’s data impossible, even for Apple.
How either Apple or Facebook could provide any further assistance without fundamentally breaking their respective products is unclear. For Apple, that means at-rest encryption of data on the device plus end-to-end encrypted messaging; Facebook is only involved in the messaging piece. What the government has requested in the past is a special version of iOS that could be forced onto a locked phone to bypass its passcode protections, but it's doubtful that approach would work against Apple's current security architecture.
There are tools available from several vendors that can be used to brute-force access to at least some iPhones. Grayshift, for example, claims its GrayKey device can unlock even newer iPhones, though recent iOS changes may have reduced its effectiveness. And Cellebrite has just released a tool based on the "checkm8" bug revealed in iPhones up to the iPhone 8, which can allegedly be used to perform a full filesystem extraction from a targeted device by "jailbreaking" the phone. But these tools rely on flaws in the iPhone's hardware and software, and Apple has aggressively moved to close such flaws when they are discovered.
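To see why these exploits matter, consider the arithmetic of guessing a six-digit passcode with and without the phone's escalating retry delays. The guess rate and lockout period below are illustrative assumptions for a back-of-the-envelope sketch, not measurements of any real device or tool:

```python
# Back-of-the-envelope comparison of passcode brute-forcing with and
# without device-enforced retry throttling. Both rates are assumptions
# chosen purely for illustration.

ATTEMPTS = 10 ** 6  # every possible 6-digit passcode

# Unthrottled guessing (what exploit-based tools aim to enable),
# assuming 10 guesses per second:
unthrottled_hours = ATTEMPTS / 10 / 3600

# Throttled guessing, assuming a flat 1-hour lockout per attempt:
throttled_years = ATTEMPTS / (24 * 365)

print(f"unthrottled: ~{unthrottled_hours:.0f} hours")  # roughly a day
print(f"throttled:   ~{throttled_years:.0f} years")    # longer than a lifetime
```

The gap between the two numbers is the entire value of tools like GrayKey: the cryptography is never attacked directly, only the rate limiter in front of it.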
Former Microsoft chief technical officer Ray Ozzie proposed a solution called "Clear": Apple and other device manufacturers would use public and private keys to encrypt users' passcodes, with the private key kept on a secure system at each company's headquarters. This form of "key escrow" would allow law enforcement, once a warrant was issued, to retrieve the encrypted passcode from the device and send it to the manufacturer for decryption. Matt Tait, a cybersecurity fellow at the University of Texas' Lyndon B. Johnson School of Public Affairs and a former GCHQ analyst, floated a similar scheme during his testimony before the Senate Judiciary Committee.
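The escrow idea can be sketched in miniature: the device encrypts the passcode under the manufacturer's public key, and only the escrowed private key can recover it. The sketch below uses textbook RSA with toy parameters purely for illustration; all names and numbers are hypothetical, and no real scheme would ship without hardened key storage and vetted cryptographic libraries.

```python
# Toy sketch of the key-escrow idea, using textbook RSA with tiny demo
# primes. Illustration only -- real deployments would use vetted crypto
# and hardware-protected keys.

# Manufacturer's key pair (the private exponent d would live only on a
# secure system at headquarters).
p, q = 61, 53                        # tiny demo primes
n = p * q                            # public modulus
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent, held in escrow

def escrow_passcode(passcode: int) -> int:
    """The device encrypts the user's passcode with the public key at setup."""
    return pow(passcode, e, n)

def recover_passcode(blob: int) -> int:
    """The manufacturer decrypts with the private key when a warrant issues."""
    return pow(blob, d, n)

blob = escrow_passcode(1234)  # stored on the device; useless without d
print(recover_passcode(blob))  # → 1234
```

Note that the device never holds the private key, only the opaque encrypted blob; that asymmetry is the whole point of the scheme, and also the source of the objection that follows.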
But cryptographers and others have resisted key escrow schemes as inherently too risky, because a single private key would unlock every device from a given manufacturer. Key escrow has also proven breakable and evadable in the past, most notably in the case of the Clipper Chip, the government's 1990s "solution" for exceptional access to voice communications.