Wrong Superhero – Why Apple Doesn’t Deserve the Praise It’s Receiving

The “FBI–Apple encryption dispute” has come to a preliminary end, at least, with the FBI announcing yesterday that it is no longer interested in Apple’s assistance in recovering user data from a seized iPhone that was once used by a terrorist. The FBI says it has been able to recover the data without Apple’s help. Apple has received much public support for refusing to provide software to recover the data from the phone. I believe that Apple doesn’t deserve this sympathy and is the wrong superhero to adore. Contrary to what they say in their press releases, Apple is not protecting their users’ freedom. Even though they might have put security measures in place that distinguish them from their competitors, Apple’s products mistreat their users just like any other product based on proprietary software. Sadly, there is no smart phone available today that runs exclusively on free software and gives control to the user instead of the vendor, which is why I don’t use or even own a smart phone.

The whole talk about the “FBI–Apple encryption dispute” is highly disturbing. Whatever it is about, it is not cryptography. Cryptography is about securing information against unauthorized access by means of math. Math is not controlled by anybody. (Although some people might withhold interesting findings from the public.) If the FBI wanted to break the cryptography of the phone in question, it wouldn’t need Apple’s help at all. The FBI can run the math as well as anybody else. The reason the FBI “had to” (more on that later) ask Apple for assistance is that the iPhone does not rely on math to keep its users’ data protected but rather on the authority of Apple. This is wrong. Nobody’s freedom should depend on the mercy of any authority, be it Apple or anybody else. As a matter of fact, freedom means having to obey no authority in the first place.

Technical Background

Regardless of the many bad things I have to say about Apple, they have at least published a decent document about the iOS security model, and I refer to the information disclosed therein for my further discussion. It might be inadvertently or deliberately incorrect, and given that the software is proprietary and cannot be inspected publicly, we would never know, but I have no reason to assume so at this point. Even if it is accurate, Apple is not to be praised.

The persistent storage of user data on a recent iPhone is encrypted with a strong symmetric block cipher (AES-256) that today’s public cryptographers believe to be infeasible to break even with the most powerful computers available. Unlike the block-level encryption commonly used with Linux, Apple’s encryption seems to work on the file-system level. Each file is encrypted individually with a so-called file key which, alongside the file’s meta-data, is itself encrypted with a key derived from the so-called file-system key and a so-called class key. If I understand correctly, the purpose of the class key is to create different classes of files, maybe for use by different kinds of applications. The file-system key is stored in plain text in a special region of the flash device that can be wiped at the transistor level. Once this key is effectively wiped, no data on the flash device can be accessed any more. The class key is derived from the user’s “passcode” and a hardware key. The passcode is, ideally, known only to the user and entered when unlocking the phone. The hardware key is a random key generated when the phone is fabricated and permanently stored in a dedicated cryptography co-processor chip. This chip is designed never to output the key directly and to make the key difficult to extract even given physical access to it. Apple claims that it doesn’t know or record the hardware keys of the phones it sells.
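
To make the hierarchy easier to follow, here is a minimal Python sketch of how such a scheme could fit together. It reflects only my reading of Apple’s document; the toy KDF, the AES-GCM mode and all names are illustrative assumptions, not Apple’s actual implementation.

    # Conceptual sketch of the iOS key hierarchy as I read Apple's document.
    # The KDF, cipher mode and key sizes are illustrative assumptions.
    import hashlib
    import os

    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def derive(*parts: bytes) -> bytes:
        """Toy KDF: hash the concatenated inputs down to one 256-bit key."""
        return hashlib.sha256(b"".join(parts)).digest()

    # Burned into the crypto co-processor at fabrication; never leaves the chip.
    hardware_key = os.urandom(32)
    # Ideally known only to the user; entered when unlocking the phone.
    passcode = b"1234"

    # Stored in plain text in a wipeable region of the flash device.
    file_system_key = os.urandom(32)
    # Derived from the passcode and the hardware key.
    class_key = derive(passcode, hardware_key)

    # Each file gets its own random file key ...
    file_key = os.urandom(32)
    # ... which is stored encrypted under a key derived from the
    # file-system key and the class key.
    wrap_key = derive(file_system_key, class_key)
    wrap_nonce = os.urandom(12)
    wrapped_file_key = AESGCM(wrap_key).encrypt(wrap_nonce, file_key, None)

    # The file contents themselves are encrypted under the file key.
    data_nonce = os.urandom(12)
    ciphertext = AESGCM(file_key).encrypt(data_nonce, b"user data", None)

Note how wiping file_system_key from flash makes wrap_key, and with it every file key, underivable; that is precisely the effect of the wipe feature discussed next.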

As can be seen, all data can be recovered if the hardware key and the user’s passcode are known. The passcode is a four-digit PIN that is extremely easy to guess. Even the stupidest attacker could sit down with the phone for a weekend and try entering all 10 000 possible combinations. To thwart this attack, Apple has programmed the operating system of the iPhone to do two things. First, it introduces artificial delays after failed passcode attempts, which grow exponentially with the number of failures, up to an hour. Second, it triggers a wipe of the stored data after 10 failed attempts. As I understand it, this means that only the file-system key is wiped from the flash storage, and I believe that this is a security flaw. I shall return to this point shortly.
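
A quick back-of-the-envelope calculation shows what the delays alone buy. The exact schedule below is an assumption modelled on Apple’s description (no delay at first, then exponential growth capped at one hour), not Apple’s published numbers.

    def delay_after(failed_attempts: int) -> float:
        """Artificial delay in seconds imposed after the n-th failure (assumed schedule)."""
        if failed_attempts < 5:
            return 0.0
        return min(60.0 * 2 ** (failed_attempts - 5), 3600.0)

    total_seconds = sum(delay_after(n) for n in range(10_000))
    print(f"worst case: about {total_seconds / 86_400:.0f} days")  # roughly 416 days

Even ignoring the wipe, the delays alone stretch an exhaustive search to over a year; without them, 10 000 attempts at one guess per second would take under three hours.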

Since trying to brute-force the passcode through regular user input is not an option and since the hardware key is said to be difficult to extract from the chip, what the FBI wanted was an “update” to iOS that would introduce no artificial delays, not trigger a wipe after a fixed number of failed attempts and presumably also allow them to enter passcodes programmatically (e.g. via the USB connection) instead of having to pay an official to type their fingers sore. Patching iOS to behave like this should be a trivial task. One would simply comment out a few lines of code, re-build iOS and deploy the insecure patch to the seized phone. As I understand it, the iPhone will automatically install updates without asking for user confirmation, which one might or might not consider a flaw. However, it will only do so if the patch bears a valid cryptographic signature by Apple. Apple’s public key is burned onto the iPhone’s chip during fabrication. Even if the phone could be tricked into accepting the malicious patch, unless it bears a valid signature, the firmware will refuse to boot it.
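
For illustration, here is roughly what such a check amounts to, sketched in Python with an Ed25519 key from the cryptography library. Apple’s actual signature scheme and image format are not public, so everything here is an assumption about the shape of the mechanism, not its substance.

    import sys

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Stand-in for the vendor's signing key pair; only the public half
    # would be burned into the phone's ROM at fabrication.
    vendor_key = Ed25519PrivateKey.generate()
    rom_public_key = vendor_key.public_key()

    def firmware_boot(kernel_image: bytes, signature: bytes) -> None:
        """Boot the image only if its signature checks out against the ROM key."""
        try:
            rom_public_key.verify(signature, kernel_image)
        except InvalidSignature:
            sys.exit("refusing to boot: invalid signature")
        print("booting", kernel_image.decode())

    official_image = b"official iOS kernel"
    firmware_boot(official_image, vendor_key.sign(official_image))     # boots
    firmware_boot(b"patched kernel", vendor_key.sign(official_image))  # refuses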

This mechanism is marketed as “secure boot” and could indeed be a useful feature. Unfortunately, if the owner of the device cannot change the list of trusted public keys, it does more harm than good. It then prevents not only attackers but also the legitimate owner of the device from installing software that the device vendor has not approved of. This is why we prefer to call it restricted boot. It doesn’t have to be like this, but Apple prefers it that way.

So, while the FBI could easily prepare the malicious patch (assuming they have access to the source code of iOS), they would need Apple to sign it with their private key. Apple claims, presumably rightly, that doing so would unsettle the faith of its users and that the very existence of the signed malicious patch would pose a threat to all of its users, as it could be deployed to any iPhone.

Speculation

I have the suspicion that the entire “FBI–Apple encryption dispute” could have been made up. It seems to me that there are numerous attack vectors that, with the help of their friends at the NSA, would be well within reach for the FBI and wouldn’t require any collaboration on Apple’s part.

  • Physically recover the hardware key from the iPhone’s chip by removing and opening the chip and inspecting it under a microscope. This possibility has been publicly discussed and seems feasible, if risky, as they would have only a single chance to either recover the key or destroy all data permanently.
  • Copy the data from the flash device to a backup location, try entering passcodes until the phone wipes itself, copy the data back, and continue (see the sketch after this list). Without circumventing the delay mechanism, this would be extremely time-consuming, though. (In a more secure design, the hardware key, which, unlike the flash storage, is not readily accessible, would be wiped, too.)
  • Use a hardware-level backdoor in the cryptography chip to unveil the hardware key, if one exists.
  • Replace Apple’s public key in the phone’s ROM with their own so the phone will boot a compromised kernel. Depending on how well the ROM is separated from the cryptography co-processor, this seems to be a fairly low-risk operation.
  • Deploy any of the publicly known iOS “jailbreaks” to safely boot their corrupted kernel. Isn’t Apple’s restricted boot known to be broken?
  • Steal the private key Apple is using to sign official releases.
  • Use a ready-made backdoor in iOS, if one exists.
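
To make the second attack concrete, here is a toy simulation of the copy-back loop in Python. The Phone class is a stand-in I made up for illustration; a real attack (sometimes called “NAND mirroring”) would image and restore the physical flash chip with forensic hardware, and the artificial delays, ignored here, are what make it so time-consuming in practice.

    import random

    class Phone:
        """Toy model of the device: a secret PIN and flash-resident state."""
        def __init__(self) -> None:
            self.secret = f"{random.randrange(10_000):04d}"
            self.flash = {"failed_attempts": 0, "data": b"user data"}

        def try_passcode(self, pin: str) -> bool:
            if pin == self.secret:
                return True
            self.flash["failed_attempts"] += 1
            if self.flash["failed_attempts"] >= 10:
                self.flash["data"] = b""  # the 10-failure wipe
            return False

    phone = Phone()
    backup = dict(phone.flash)  # image the flash once, up front

    for start in range(0, 10_000, 9):  # batches of 9 stay under the wipe limit
        phone.flash = dict(backup)     # copy the image back: counter reset
        for n in range(start, min(start + 9, 10_000)):
            if phone.try_passcode(f"{n:04d}"):
                print("recovered passcode:", f"{n:04d}")
                raise SystemExit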

It seems unlikely to me that none of these attacks would be within their reach. On the other hand, publicly fighting with Apple had desirable implications for both parties. Apple has created the impression that it deeply cares about the privacy of its users and that its products are secure even against the most advanced attackers, which is a growing concern of many customers and will be beneficial for its sales. The FBI, in turn, will be interested in making the general public think that…

  • …they cannot break the cryptography used in the iPhone for encrypting the user data and signing the software.
  • …iOS has no backdoor accessible to security officials.
  • …there is no backdoor in the iPhone’s hardware that can be used by the state.
  • …the state does not collaborate with vendors of proprietary operating systems.

While security agencies generally like people to believe they are powerful, they try at the same time to keep them as ill-informed as possible so they will continue using weak technology. Maybe they decided that, after the show had gone quite well for a while, the United States would look too silly if they had to keep pretending that they were unable to recover the data, so they let a friendly forensics company “help” them recover the bits from the chip, which invalidates none of the above impressions.

I hate speculation and I am in no way claiming that any of these possibilities is true. I do think, however, that they are too plausible to ignore completely. The fact that Edward Snowden has claimed that the “FBI saying it can’t unlock [the] iPhone is bullshit” might be relevant, too, although I would have preferred a technical explanation over a vulgar expression.

Discussion

Apple’s security model is technically and politically broken and ethically wrong.

Despite all the effort Apple has apparently put into making the iPhone seem secure, it is not. At least not in a cryptographic sense. As far as math is concerned, given all the information that is permanently stored on the phone, only the four digits of the passcode are unknown, and these are trivial to brute-force. All security comes from the assumption that parts of the storage on the phone are extremely difficult to access even when one has physical access to the device and advanced forensic equipment. Security in a cryptographic sense can only be achieved by using a strong pass-phrase that is not persistently stored on the device itself. And even then, security only holds while the device is powered off.

For example, on my laptop running GNU/Linux, I have encrypted the hard disk with LUKS, using a strong pass-phrase that I enter from memory each time I boot the machine. While the machine is running, I ensure that nobody gets unauthorized physical access to it. When I have to leave it alone, I make sure to power it down, which wipes the decryption key from the computer’s memory. Unless there is a flaw in the implementation of LUKS or a major breakthrough in cryptanalysis, the data on my disk can only be recovered if I tell the pass-phrase. If I am dead, nobody will ever get to know what is on the disk. It doesn’t matter whether they take the LUKS maintainers to court, threaten them or torture them. Even if they wanted to, they couldn’t possibly help access the data. That’s the kind of security that supports freedom!
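
The principle at work here fits in a few lines of Python. LUKS itself is more involved (key slots, a master key, configurable key-derivation functions), so treat this strictly as a sketch of the idea, not of LUKS.

    import hashlib

    def unlock(passphrase: str, salt: bytes) -> bytes:
        """Re-derive the disk key from the pass-phrase; nothing to steal at rest."""
        # A deliberately expensive KDF makes every guess cost real CPU time.
        return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 1_000_000)

    # At boot: the user types the phrase from memory; the key exists in RAM only.
    key = unlock("correct horse battery staple", b"salt stored on disk")

    # At shutdown: drop the key. The salt on disk is useless without the
    # phrase, so powered-off security reduces to the strength of the phrase.
    key = None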

Of course, while I am still alive and installing updates to my operating system – which, by the way, allows me to decide whose public keys I would like to trust – there is a risk that the maintainers of LUKS turn evil and release an insecure version of the software that will leak the key. As no group of people is perfect, this could happen to any software product. However, it is very unlikely to happen to LUKS or any other free software project. The reason is that – and this brings us to the core of my argument – unlike Apple’s proprietary operating system, free software grants everybody who obtains a copy of it four essential freedoms.

  1. The freedom to run the program as you wish, for any purpose.
  2. The freedom to study how the program works, and change it so it does your computing as you wish.
  3. The freedom to redistribute copies.
  4. The freedom to distribute copies of your modified versions to others.

These freedoms guarantee that the source code of free software is, or can be made, available to the public. This means that anybody can inspect it and audit it for inadvertent flaws or intentionally added malicious “features”. If anybody detects such defects, they can immediately publish them and ask the maintainers of the software to fix them. If the maintainers refuse to do so (maybe they have turned evil, or were forced to), anybody with the skill can publish a corrected version of the software, and word will spread that this is the version to keep using.

When I tell people this, they usually respond that they are not knowledgeable enough to read the source code of computer programs. Even if they were, they wouldn’t have the time to study the code of all the software they are using. That’s obvious. Nobody can read the source code of all the software they’re using. But fortunately, you’re not alone. I didn’t read the source code of all the software I am using. But I did read enough of it to gain confidence that it is generally trustworthy. More importantly, I have seen many smart people who are passionate about maintaining an infrastructure of high-quality free software. I don’t overly trust any of them individually, but I do trust the free software community as a whole to have enough people of integrity to keep our software clean. This concept is not new. It is, in fact, the same idea that forms the foundation of democracy. No single instance is trustworthy enough to depend on it exclusively. This is true for kings as much as it is for Apple. But by acknowledging that the majority of people, while not perfect, generally prefer good over evil, and given a sufficient number of observant citizens, our society can go pretty far and ensure the utmost freedom for everybody. People in a free society don’t need the mercy of a wrong superhero to grant them their freedom.

If watching the “FBI–Apple encryption dispute” has made you confident that your privacy is well protected by using Apple products, please reconsider your conclusion and think about whether the money you would have spent on your next iPhone wouldn’t serve your freedom better if donated to the Free Software Foundation, the Electronic Frontier Foundation, Wikipedia or any free software project that builds something you care about.

2 Comments

  1. Franz Klammler
    Posted 2016-04-17 at 17:44 | Permalink | Reply

    Dear Moritz,

    thank you for your well-grounded and engagingly written article. I agree with it as far as it concerns your requirements and mine.

    Apple’s basic principle – as I understand it – is to make digital devices as user-friendly as possible, and they follow this principle consistently. This user optimization is not primarily intended to find the best procedures for IT experts, but for people who are unable or unwilling to gain deeper insight into those procedures.

    As a matter of fact, there are better methods, ensuring higher security than a four-digit key combined with technical barriers. But I’m sure you know the world’s most popular passwords. The common user’s standard in security is something like 123456. You will agree: compared to a “high-quality pass-phrase” like that, Apple’s approach might be the better choice.

    Nobody needs to buy Apple products if they do not agree with this paternalism. So neither you nor I own an Apple phone. But sometimes it seems useful to help unaware users attain a reasonable standard of security even if they do not care about it themselves. This might run contrary to an independent lifestyle on one’s own authority – but such is the plurality of beliefs.

    Kind regards
    Pfiffer

    • Posted 2016-04-18 at 19:04 | Permalink | Reply

      I disagree. First, I don’t think that either Apple or the majority of its customers would confirm that the iPhone is a product optimized for the needs of dim-witted people.

      Second, and more importantly, human rights are not a privilege of the intelligent. Granted, some human rights are more valuable to those who have certain abilities. For example, freedom of speech will mean more to you if you are capable of forming and articulating a genuine opinion. But assuming that somebody else will not be able to put a human right to good use anyway, so it cannot be that important to them, is a slippery slope, and any public discussion should steer clear of such arguments.

      Third, and most importantly, if Apple stopped mistreating its users, nobody would lose and many would win. Those who don’t want to read the source code of a free software operating system don’t have to do so. But they can still benefit from the flaws spotted and fixed by others who are willing and able to do so. If you don’t want to replace Apple’s firmware, that’s fine. It wouldn’t cause you any inconvenience if the phone still had an option to install your own trusted signing keys for those users who want to do so. And they are many, as the massive interest in “jail-breaking” tools demonstrates. As a matter of fact, if it were simple and secure to replace the signing keys, those who today fear the bad consequences of “jail-breaking” could use their phone in a way that suits them better, while those who are already happy wouldn’t be inconvenienced in the slightest. How absurd is it in the first place that using hardware you have legitimately bought in the way you wish is termed “jail-breaking”, as if it were something criminal or immoral to do, and as if “putting their customers into jail” were something a company would want to do?

      Replacing the signing keys should trigger a full memory wipe at the hardware level. If it did, the resulting system would be no less secure than the iPhone is today. On the contrary, it would become possible to install a more secure authentication mechanism for those who demand it.

      I also don’t believe it is true that most humans are generally incapable of using strong pass-phrases. It’s more that so-called security experts have talked people into believing that a password is more secure the harder it is to remember, which is a grave oversimplification at best. For example, I don’t think that remembering a four-digit number is any easier than remembering four random words drawn from the 10 000 most common words of your native language. Why shouldn’t a smart phone have an easy-to-use interface that guides you step by step through the generation of a very secure Diceware (https://en.wikipedia.org/wiki/Diceware) pass-phrase when you first power it on? In such a setup, it would actually be difficult to choose a poor pass-phrase. The operating system could randomly wipe one of those words from its memory for each hour you don’t use the phone (and everything when powered off) and ask you to enter all wiped words in order to “unlock” it, so the more frequently you use it, the less effort it takes to unlock. Auto-completion features for natural-language words, already shipped with every smart phone, would make entering the pass-phrase very easy. Yet if an attacker got physical access to a phone secured this way, they would have only a few hours, and would need special equipment, to attempt an extremely risky recovery operation from volatile memory before all traces of the pass-phrase were lost forever.

      And by the way, if law enforcement agencies worldwide did have legitimate reasons to access data on a given device, they could either do it themselves or it would be outright impossible, regardless of how an ever so stylish company currently feels about it.
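
      To see how simple the setup step imagined above would be, here is a minimal Python sketch that draws four words uniformly at random; the word-list file name is a placeholder for any common-words list of that size. Four words from 10 000 give 10 000^4 = 10^16 combinations, against the mere 10^4 of a four-digit PIN.

          import secrets

          # Load a list of at least 10 000 common words; the file name
          # is a placeholder, any such word list would do.
          with open("wordlist-10000.txt") as f:
              words = [line.strip() for line in f if line.strip()]
          assert len(words) >= 10_000

          # secrets.choice draws with cryptographic-quality randomness.
          passphrase = " ".join(secrets.choice(words) for _ in range(4))
          print("your new pass-phrase:", passphrase)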

      It is true that “nobody needs to buy Apple products”, although the available choices for smart phones are limited and a truly freedom-respecting alternative is currently not available. Yet not having a smart phone at all is more and more becoming a handicap that can hinder your social and career opportunities. I don’t want to prescribe to any company what to sell, nor to any customer what to buy. What I do want, and feel an obligation to do, is inform people about maybe-not-so-obvious interrelations that are usually not mentioned on marketing slides, so that they can make a well-informed decision about how to spend their money for their own maximum benefit, regardless of how tech-savvy they might or might not be.
