An Editorial – Apple vs. FBI: What You Need to Know

February 19, 2016 by Rorot

The mainstream media was suddenly abuzz with the release of Apple CEO Tim Cook's open letter to Apple's customers, which calls the FBI's actions "unprecedented", "dangerous" and "chilling". While Google was quick to support Apple, Microsoft also expressed similar intentions (although it didn't name Apple). This post explains everything you need to know about the ongoing tussle between Apple and the FBI, which is likely to be a defining moment for user privacy.


It is a well-known fact that, with a court order, federal and investigative agencies can ask private companies such as Apple or Google to provide personal data of their customers to help an investigation. In December 2015, the San Bernardino attack took place in California: a married couple opened fire indiscriminately, killing 14 people and seriously injuring 22 others. Both attackers were killed in a police shootout. They were later identified as terrorists radicalized by Islamic extremists, and President Obama called it an act of terrorism. During the investigation, the FBI seized one of the killers' iPhones and wanted to gain access to all the data stored on it. However, Apple's encryption on the iPhone was apparently something the FBI could not break, and this week a judge in California ordered Apple to assist the FBI in accessing the contents of that iPhone. With the release of iOS 8, Apple had already stated that it would be impossible for it to decrypt any device, since it does not hold the encryption keys. The FBI then asked Apple to build a backdoor into the iPhone. Apple denied this request, and its CEO Tim Cook wrote an open letter saying this cannot be done, because building a backdoor would compromise its users' data privacy in the future. That's the story so far!

Can Apple really not break into its own iPhone?

Apparently no one can, including Apple! Let us try to understand why. Although encryption is not new to Apple phones, starting with iOS 8 Apple dramatically improved the way data is encrypted. This came in the backdrop of the revelations by Edward Snowden, who exposed the activities of the NSA; companies across the world have since been tightening the screws on the safety of users' privacy. To get started, this is all about encrypting data at rest. In other words, all your data (text messages, chats, pictures and other user data) is stored in encrypted form on your iPhone. So even if someone gets access to the storage, the data is encrypted and of no use to them. Every encryption scheme needs a key. The data is automatically decrypted when the user unlocks the phone, which is why, on an unlocked iPhone, forensic experts can try to access all the information. Apple uses a 256-bit device-unique secret key (UID) fused into the phone's hardware, from which it cannot be extracted. Apple also says it does not record these UIDs and cannot access them either, and that no software or firmware can read the key directly. The UID is then entangled with the user's passcode to generate a new key (the passcode key) that protects the data on the device. The iteration count of this derivation is calibrated to take about 80 milliseconds on the device. Apple's iOS Security Guide includes a diagram of the whole process.
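The derivation described above can be sketched in a few lines. This is a hypothetical illustration, not Apple's actual algorithm: the real derivation runs inside the hardware AES engine and its construction is not public as code, so PBKDF2 and the names below (`UID_KEY`, `derive_passcode_key`) are stand-ins. The idea is the same, though: tangle the passcode with a device-unique secret and make each derivation expensive, so the key can only ever be guessed on the device itself.

```python
import hashlib
import os

# Stand-in for the 256-bit UID fused into the hardware; in a real
# device this value can never leave the chip.
UID_KEY = os.urandom(32)

# In iOS the iteration count is calibrated so one derivation takes
# about 80 ms on the device's own hardware; this number is illustrative.
ITERATIONS = 100_000

def derive_passcode_key(passcode: str) -> bytes:
    # PBKDF2 used as a stand-in construction: the UID acts as a secret
    # salt, so the derived key is useless without the physical device.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), UID_KEY, ITERATIONS)

key = derive_passcode_key("123456")
print(len(key))  # 32 bytes, i.e. a 256-bit passcode key
```

Note that even a weak passcode yields a strong key here, because the 256-bit UID contributes most of the entropy; the weakness is only exposed to someone who can guess passcodes on the device.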

Since there is no way to extract the encryption key, the only other way in is to brute-force the passcode on the device itself. Even that is impractical because of the 80-millisecond rule: it dramatically slows the process, restricting brute force to about 12.5 attempts per second. At that rate it would take more than 5½ years to try all combinations of a six-character alphanumeric passcode using lowercase letters and numbers. To discourage brute-force attempts from the lock screen, an intentional delay is introduced between failed attempts. If the device is restarted during a timed delay, the delay is still enforced, with the timer starting over for the current period. Furthermore, there is an auto-erase function that will wipe the device after 10 failed attempts! All of this complicates getting access to the data on an iPhone. So yes, there are some things the FBI cannot break into!
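The "more than 5½ years" figure follows directly from the numbers above, and is easy to check: 80 ms per attempt caps brute force at 12.5 attempts per second, and a six-character passcode drawn from 26 lowercase letters plus 10 digits has 36⁶ possible combinations.

```python
# Back-of-the-envelope check of the article's figures.
attempts_per_second = 1000 / 80        # 80 ms per attempt -> 12.5/s
combinations = 36 ** 6                 # 6 chars, lowercase + digits
seconds = combinations / attempts_per_second
years = seconds / (365 * 24 * 3600)
print(f"{years:.1f} years")            # just over 5.5 years to exhaust
```

And that is the best case for the attacker, ignoring the escalating lock-screen delays and the 10-attempt auto-erase entirely.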

What exactly did the FBI ask Apple?

If you have followed the story over the last two days, you know that the FBI has asked Apple to develop a backdoor for the iPhone, one that would give it access to all the data. But what kind of backdoor is this? Here is precisely what the FBI asked of Apple:

  • Develop an image file that can be loaded onto the iPhone. It would run from RAM and would not change any user data present on the device.
  • The image file needs to be signed by Apple. (Why? Only Apple-signed software can run on Apple devices, unlike Android, which allows any app to run.)
  • This file can be loaded through recovery mode.
  • Once loaded, it should bypass or disable the iPhone's auto-erase function and also disable the additional delays Apple introduces between passcode attempts, so that the FBI can then brute-force the passcode.
  • This can all be conducted at Apple's headquarters! That way Apple retains the backdoor without sharing it with the FBI, and the FBI achieves what it wants. Great concession, isn't it?
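To see why the last two asks matter, here is a toy sketch of the exhaustive search that becomes feasible once the auto-erase limit and inter-attempt delays are gone. Everything here is hypothetical: `SECRET` and `unlock()` merely stand in for the device's real passcode check, and a short four-digit code is used so the demo finishes instantly.

```python
from itertools import product
import string

# Stand-in passcode; on a real device the check happens in hardware
# and each attempt would normally cost 80 ms plus escalating delays.
SECRET = "0042"

def unlock(guess):
    # Placeholder for the device's passcode verification.
    return guess == SECRET

def brute_force(length=4):
    # With no wipe-after-10-failures and no delays, nothing stops
    # simply trying every combination in order.
    for digits in product(string.digits, repeat=length):
        guess = "".join(digits)
        if unlock(guess):
            return guess
    return None

print(brute_force())  # recovers "0042" almost immediately
```

A four-digit numeric code has only 10,000 combinations, which is exactly why the software rate limits and the erase threshold, not the key length, are what make short passcodes survivable.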

Apple’s Response:

Apple outright rejected this proposal, saying that while it wants to cooperate with the FBI in every possible way, it cannot develop a backdoor, because that backdoor would eventually fall into the wrong hands. Apple also called the request "a dangerous precedent". The Department of Justice was quick to respond that this is requested only for the particular device in question (i.e. the San Bernardino attacker's device). However, Apple argues: "But that's simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable."

A battle for data privacy:

Apple surely does not buy the argument that this backdoor is for one-time use. Once created, there is no doubt that more requests would follow (not just from the US but from countries across the globe), and Apple clearly does not want that. The FBI and other federal agencies know that if there is ever a case where they can push this hard, it is the San Bernardino case. On one hand, the demand sounds reasonable: we would want everyone who influenced or cooperated with those terrorists to be brought to justice, and if Apple's cooperation would help, then we are definitely for it. However, after the PRISM episode, we can hardly trust the federal agencies when it comes to respect for data privacy. A multinational company cannot simply be forced to build a backdoor into its own products. It is important to look at this from a larger perspective rather than reducing it to a battle of Apple's privacy vs. national security. As someone put it, if the USA can demand this from a private company, just imagine what China or Russia could ask for!


Rorot (@rorot333) is an Information Security Professional with 5.5 years of experience in penetration testing and vulnerability assessments of web and mobile applications. He is currently a security researcher at Infosec Institute. Twitter: @rorot333