Confused as to WTF is happening with Apple, the FBI and a killer's iPhone? Let's fix that
Here's a clear, technical Q&A
Water cooler: Everyone is losing their mind over Apple being forced to help the FBI unlock an iPhone. Just what is going on?
Relax, don't spill your almond milk latte. We'll make it crystal clear for you.
The FBI wants to unlock an iPhone 5C belonging to Syed Farook, who, with his wife Tashfeen Malik, shot and killed 14 coworkers in December in San Bernardino, California. The couple were killed by cops in a shootout soon after.
The Feds, fearing a link between the pair and Islamic terrorists, want to break into Farook's phone. But the federal investigators don't know the passcode and fear the device will wipe itself after 10 wrong guesses. The handset – running either iOS 8 or 9 – has also encrypted its messages, photos and other data, and a valid code is needed to decrypt the files.
So the agents went to court to force Apple to create a special build of iOS that, when loaded onto the phone, will allow the FBI to brute-force its way through all possible passcodes without the data being wiped. Eventually, they'll hit on the correct one and get into the phone.
Magistrate Sheri Pym granted the order [PDF] on Tuesday. Apple has five business days to appeal.
And Apple doesn't want to help?
Apple CEO Tim Cook has written an open letter – really, a blog post – explaining why his company will refuse to craft a special iOS for the Feds. The Cupertino giant thinks it will set a bad legal precedent. It would also be terrible PR for Apple, which has repeatedly stated in recent months that it cares about protecting people's privacy and security.
What is Apple being asked to do exactly?
According to the order, Apple must create a signed firmware update that will only work on Farook's phone: identified by its unique serial numbers.
The update will be loaded onto the phone, most likely during power-up via a USB cable, and will disable the auto-wiping feature in iOS plus remove any delays to the brute-forcing process.
The mobile operating system introduces delays between PIN entry attempts, ramping up to an hour-long wait after the ninth incorrect passcode. The Feds don't want to enter thousands upon thousands of possible PINs at a rate of one an hour, and so they want this timing feature disabled.
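That escalating delay schedule can be sketched in code. The intermediate steps below (one minute after the fifth wrong attempt, five minutes after the sixth, and so on) follow Apple's published iOS Security guide and are illustrative, not taken from the court order itself:

```python
# Escalating delays iOS imposes after wrong passcode attempts.
# Intermediate figures per Apple's iOS Security guide (illustrative).
DELAYS = {5: 60, 6: 5 * 60, 7: 15 * 60, 8: 15 * 60}  # seconds
LATE_DELAY = 60 * 60  # one hour from the ninth wrong attempt onward

def delay_after(attempt: int) -> int:
    """Seconds iOS waits after the given wrong attempt number."""
    if attempt < 5:
        return 0
    return DELAYS.get(attempt, LATE_DELAY)

def worst_case_seconds(pin_space: int) -> int:
    """Total enforced waiting to exhaust every PIN in the space."""
    return sum(delay_after(n) for n in range(1, pin_space + 1))

# A four-digit PIN has 10,000 possibilities. With the hour-long delay
# dominating, exhausting them means over a year of enforced waiting -
# which is why the Feds want the timer switched off.
days_waiting = worst_case_seconds(10_000) / 86_400
```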
Apple fears this custom/mutant iOS will be used in the future against other phones. But Apple is the only company in the world that can cryptographically sign the firmware, meaning if the FBI tried to modify the serial numbers in the code to break into another iPhone, the handset would reject the tampered firmware. Also, the judge has said the custom firmware can remain on Apple property, so it is entirely possible for Apple to keep this weakened software out of g-men's hands.
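The device-locking mechanism can be modeled in a few lines. Real iPhones verify an RSA signature chained to Apple's root certificate; the HMAC below is a simplified stand-in for that signing capability, and the key and serial number are made up for illustration. The point is the same: binding the serial into the signed blob means the image only boots on one handset.

```python
import hashlib
import hmac

# Toy model of device-locked firmware signing. The HMAC key stands in
# for Apple's private signing key; both values here are hypothetical.
APPLE_SIGNING_KEY = b"held-only-by-apple"

def sign_firmware(image: bytes, device_serial: str) -> bytes:
    # Mixing the target serial into the signed data ties the
    # signature to that one specific handset.
    return hmac.new(APPLE_SIGNING_KEY, image + device_serial.encode(),
                    hashlib.sha256).digest()

def device_accepts(image: bytes, sig: bytes, my_serial: str) -> bool:
    # The device recomputes the signature using ITS OWN serial; any
    # other phone (or any tampered image) produces a mismatch.
    expected = hmac.new(APPLE_SIGNING_KEY, image + my_serial.encode(),
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, sig)

fw = b"custom-ios-no-retry-limit"
sig = sign_firmware(fw, "TARGET-SERIAL-123")  # hypothetical serial
```

Swapping the serial in the code without re-signing breaks the signature, which is why the FBI cannot simply retarget the weakened build at other phones on its own.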
Wait, I thought iPhones are super secure – if Apple couldn't break into a locked iPhone, it would have said so, right?
Right. Apple can do what the FBI is asking it to do but doesn't want to admit it. iPhones and iPads since the iPhone 5S, which was launched in September 2013, have a component in their processor called the Secure Enclave [PDF, page 7]. This enclave – a small microprocessor running an L4 microkernel – controls access to the keys used to decrypt files in a device, among other functions.
When you enter your passcode, the digits are used to unlock the decryption and encryption keys held by the enclave. No valid passcode, no valid key. The enclave also rate-limits passcode guesses: it will wait an hour between attempts after the ninth wrong entry.
So how can Apple possibly help?
The iPhone 5C does not have a Secure Enclave. This is a crucial part of the story. The speed at which PINs can be guessed is controlled not by a Secure Enclave but by iOS – a piece of software that Apple can change through a firmware update via a USB cable. A modified iOS can reduce this delay to 80ms – the time needed by the hardware to check whether a passcode is correct – and allow the FBI to enter a stream of PINs at high speed from some external tool.
The iPhone 5C's A6 processor does hold a secret key needed to encrypt and decrypt its files, and this key cannot be read by the operating system; instead it can only be accessed by presenting a valid passcode to the processor. However, the A6's hardware is not as elegant as the Secure Enclave, and can be queried a dozen times a second with passcode guesses to make it cough up the key. This allows the 5C to be brute-forced within a reasonable amount of time.
So, in short, the lack of a Secure Enclave will allow the FBI to eventually guess Farook's passcode – the artificial delays and the auto-wiping after 10 wrong attempts just need to be disabled by Apple.
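The back-of-envelope math makes clear why the delays are the whole ballgame. Taking the article's 80ms-per-guess hardware figure, and assuming nothing else slows the process down:

```python
# With Apple's software delays removed, the only limit left is the
# ~80ms the 5C's hardware needs to test one passcode (figure from
# the article; assumes no other bottleneck).
GUESS_SECONDS = 0.08

def exhaust_time_hours(pin_digits: int) -> float:
    """Hours to try every possible PIN of the given length."""
    return (10 ** pin_digits) * GUESS_SECONDS / 3600

four_digit = exhaust_time_hours(4)  # roughly 13 minutes
six_digit = exhaust_time_hours(6)   # roughly 22 hours
```

A four-digit PIN falls in minutes and even a six-digit one in about a day – compare that to the year-plus of enforced waiting the stock rate limits impose.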
All the technical info is here, provided by security researcher Dan Guido. He writes:
I believe it is technically feasible for Apple to comply with all of the FBI’s requests in this case. On the iPhone 5C, the passcode delay and device erasure are implemented in software and Apple can add support for peripheral devices that facilitate PIN code entry. In order to limit the risk of abuse, Apple can lock the customized version of iOS to only work on the specific recovered iPhone and perform all recovery on their own, without sharing the firmware image with the FBI.
If the iPhone 5C had a Secure Enclave – like any newer iPhone with an A7 or better processor – the physical hardware of the device could slow down the unlocking process to the point where it would be impossible to brute-force your way through all possible combinations.
It is possible for Apple to update the Secure Enclave firmware to switch off this delay, but Cupertino is keeping that little fact close to its chest right now.
So Apple basically doesn't want to admit that it is possible to install a firmware update on a locked iPhone?
Yes. Apple has been told to get its customized iOS onto Farook's iPhone 5C during boot-up without having to unlock it – and the company hasn't denied it can do that. According to iOS security guru Jonathan Zdziarski, it is entirely possible for Apple to install a firmware update on a locked device – an ability that may surprise some people:
- Apple has firmware signing capabilities for all of their devices, and are the only ones in the world that can boot custom software without exploiting a device.
- Firmware updates run as a RAM disk on iOS devices, which is similar to booting off of a USB stick.
- Apple CAN write a custom RAM disk (as a “SIF”), sign it, and boot it on any iOS device from restore or DFU mode to run from memory.
Chris Eng, veep of research at infosec biz Veracode, added:
The issue here is not one of creating a backdoor; nor is the FBI asking for Apple to decrypt the data on the phone. They’re asking for a software update (which could be designed to work only on that one particular phone) which would then allow the FBI to attempt to crack the passcode and decrypt the data. Such a solution would be useless if applied to any other phone.
In the past Apple has complied with requests to, for example, bypass lock screens in aid of criminal investigations. It’s only in recent years that they’ve taken an ideological stance on consumer privacy. I believe Apple is taking this position less as a moral high ground and more as a competitive differentiator, betting that Google won’t do the same.
Essentially, Apple can help cops break into your iPhone 5C, if they're holding the handset in their hands, but it just doesn't want to admit it.
OK, so why is Apple going to war with the federal government over this?
It's hard to know for certain, but some or all of the following points are likely good reasons:
- As mentioned above, it doesn't want to admit that its phones can be updated even when locked, by simply connecting a USB cable to them. Sure, you're updating it with official Apple firmware – just in this case, the firmware is deliberately insecure.
- Apple doesn't feel it can back down now that it has publicly stood up to law enforcement and politicians on matters of privacy and security.
- It fears that agreeing to this request would set a dangerous precedent for future versions of iOS. You trust Apple with every update – and now Apple's being asked to demonstrate that it can quite easily create insecure versions of its software and release them.
- It sees a strong defense of customer data as a key differentiator in the market.
- It has been waiting for a test case and thinks it can win this one, possibly all the way up to the Supreme Court.
- It is still angry about the Snowden revelations and wants to force the US government into the open over its surveillance of citizens.
So that's that, then?
Wait, there's more. According to this affidavit [PDF], the FBI has access to Farook's device backups up to October 19. It appears the killer disabled his cloud backups after this date. So all this is over a few weeks of files on an iPhone. Farook also destroyed two other mobiles before going on his murderous shooting spree – two handsets that could have held crucial evidence.
Wow. Well, we've been standing by this water cooler for ages. It's lunchtime now!
Let's grab a Sushirrito. ®