Apple vs the Feds

wkleem
wkleem
Community Member
edited February 2016 in Lounge

There is a fight going on between Apple and US law enforcement. Between security and privacy, where does one begin and the other end? And where does 1Password fit into all of this?


Comments

  • hawkmoth
    hawkmoth
    Community Member
    edited February 2016

    I have no particular expertise, but with respect to 1Password, I don't see what this could have to do with AgileBits. They never have the encryption keys necessary to unlock your data, so they couldn't help even if they wanted to. Whether the fact that it's a Canadian company might insulate them from the US government, I don't know; I don't know the international law on that. The main thing is that because the user is always the only one in control of their data, there wouldn't be anything AgileBits could do, even if the demand were made and even if they were inclined to comply.

    There is an argument out there that Apple could create the software needed to gain access to that one iPhone at issue. Whether you think they should do so, if that is true, seems to me to have little to do with 1Password. My $0.02 worth of musing.

  • wkleem
    wkleem
    Community Member
    edited February 2016

    Thanks @hawkmoth. There is an Ars Technica article if anyone wants to read it:

    arstechnica.com/apple/2016/02/encryption-isnt-at-stake-the-fbi-knows-apple-already-has-the-desired-key/

    Cc: @jpgoldberg

    "In practice, encryption isn't usually defeated by cryptographic attacks anyway. Instead, it's defeated by attacking something around the encryption: taking advantage of humans' preference for picking bad passwords, tricking people into entering their passwords and then stealing them, that kind of thing. Accordingly, the FBI is asking for Apple's assistance with the scheme's weak spot—not the encryption itself but Apple-coded limits to the PIN input system."

    "As long as the phone uses a PIN, this would ultimately let the FBI unlock it. If it's locked with a secure password, unlocking the phone may well prove intractable even with the special firmware."

  • Pilar
    Pilar
    1Password Alumni

    Hi @wkleem,

    I think this blog post will help answer your questions about where AgileBits stand on all this:
    https://blog.agilebits.com/2015/04/29/back-doors-are-bad-for-security-architecture/

    :chuffed:

  • hawkmoth
    hawkmoth
    Community Member

    @wkleem - I had already seen the Ars Technica article, which I found very interesting.

  • jpgoldberg
    jpgoldberg
    1Password Alumni
    edited February 2016

    AgileBits supports Apple in this case.

    From a tweet yesterday

    Back doors are bad news for security and privacy. AgileBits supports @tim_cook and Apple in the need for encryption. https://t.co/GNtjKOb6rZ

    Now I should point out that as individuals we cover a wide range of the political and social spectrum. And I certainly haven't spoken with each and every individual about this specific case, but despite our political differences I think that if there isn't a complete consensus on this, there is a near one. That is because this is a technical issue about our customers' security.

    Some clarifications of misunderstandings

    The technical and legal issues in this case are not quite what many people perceive them to be. So I'd like to clarify a few things.

    It's not a search warrant

    Apple is not being asked to turn over information in its possession. It appears that Apple has long since complied with any warrants of that nature.

    Instead, Apple is being asked to build a weakened iOS update for this specific device and sign that update so that it can be installed. Once that update is installed, it would allow the government to run a passcode-guessing attack at a rate of about 4 guesses per second (because of how PBKDF2 is tuned on the iPhone 5C).

    There is a big difference between the 5C and the 5S

    The 5S was a huge turning point in iOS device security. What Apple is being asked to do for the 5C would be far less effective for the 5S and later devices. The device in the FBI's possession is an iPhone 5C.

    5C and before

    In the 5C and before there are two layers of defense against guessing at the device passcode.

    1. The user interface in the operating system won't let you make repeated guesses, and if configured will even destroy the encryption keys after too many failed guesses.

    2. The passcode is processed with PBKDF2 blending in a hardware key so that guesses must take place on the device itself. On the 5C PBKDF2 is used so that each guess takes about 1/4 of a second.

    If a weakened version of iOS is installed on the device, then the FBI would be able to start guessing passcodes at a rate of 4 per second. From what I understand, there is a 6 digit passcode, so it would take roughly three days to test all such passcodes.
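
    To put rough numbers on that, here is a back-of-the-envelope sketch (my own arithmetic, using the 4-guesses-per-second figure above):

    ```python
    # Worst-case time to exhaust all six-digit passcodes on a 5C-class device,
    # assuming the OS guess limits are removed and only PBKDF2 (~250 ms/guess) remains.
    GUESSES_PER_SECOND = 4          # ~250 ms of PBKDF2 work per guess
    PASSCODE_SPACE = 10 ** 6        # every six-digit passcode

    worst_case_seconds = PASSCODE_SPACE / GUESSES_PER_SECOND
    print(f"{worst_case_seconds:,.0f} s ≈ {worst_case_seconds / 86_400:.1f} days")
    # -> 250,000 s ≈ 2.9 days to try them all (about half that on average)
    ```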

    5S and beyond (the Secure Enclave)

    On iOS devices with the Secure Enclave, there are three layers

    1. The user interface in the operating system won't let you make repeated guesses, and if configured will even destroy the encryption keys after too many failed guesses.

    2. The Secure Enclave is a burned-in subsystem that manages all requests from the operating system for the important keys. In this case it duplicates much of what the operating system does, and it limits repeated failed passcode attempts to one every 5 seconds.

    3. The passcode is processed with PBKDF2 blending in a hardware key so that guesses must take place on the device itself. On devices with a Secure Enclave, PBKDF2 is tuned so that each guess takes about 80 milliseconds, allowing roughly 12.5 guesses per second.

    For such devices, even if Apple created and signed a weakened operating system, the FBI would still be left finding a way to defeat the Secure Enclave (Apple can't help there beyond providing technical specifications, which they presumably have provided in other cases, if not this one).

    If the FBI can't defeat the Secure Enclave on such a device (not the device in question), then it would take 5 million seconds (about two months) to go through all of the possible six-digit passcodes.
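
    The same arithmetic for a Secure Enclave device, assuming the one-guess-per-5-seconds limit holds (again, just my own sketch):

    ```python
    # Worst-case time to exhaust all six-digit passcodes when the Secure Enclave's
    # rate limit of one guess every 5 seconds cannot be bypassed.
    SECONDS_PER_GUESS = 5
    PASSCODE_SPACE = 10 ** 6

    total_seconds = PASSCODE_SPACE * SECONDS_PER_GUESS
    print(f"{total_seconds:,} s ≈ {total_seconds / 86_400:.0f} days")
    # -> 5,000,000 s ≈ 58 days, roughly two months
    ```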

    The Secure Enclave is meant to provide security even if a bad or broken version of the operating system is installed.

    Not about the attackers' privacy

    I'm not a lawyer, but I don't think that anyone would dispute that the government has the right to the data that they seek. The question isn't about whether the FBI has the right to that data; the question is about what the FBI can ask someone who doesn't hold the data to do. After all, Apple doesn't hold the data.

    Let me give an extreme example to help illustrate this distinction. Suppose that the government has the legal right to a particular paper document, and nobody disputes that the government has the right to that document. But now suppose that the document is flying around in the middle of a shark-infested tornado. Would the government have the right to compel me to retrieve it for them? Obviously not.

    Sharknado

    So from a legal point of view, the question is whether the government has the right to compel Apple to help them in this particular way. Again, Apple is not being asked to hand over something in their possession. Nor are they being asked to provide technical consultation.

    For that device only

    Apple is not being asked to create and sign a bogus version of iOS that could be installed on any device. The weakened version would be tied to that specific device.

    Why object?

    Now that we have a better understanding of what Apple is being asked to do and what they are capable of doing, the question arises as to why we feel they are correct to appeal the court order.

    After all, it is in Apple's power to do what is being demanded of them (create a weakened version of iOS and sign it as an update), and nobody disputes that the government has the right to the data on that phone. So why do we, and very much I, believe that it is vital for Apple to fight this?

    But I am going to have to leave that for later. I've got some errands to run and have not yet had morning coffee. Watch here for part two some time later today (or perhaps tomorrow).

    Update: OK. Part Two appeared more than just a day or two later.

  • wkleem
    wkleem
    Community Member

    A new development in the case of Apple vs DoJ. Someone in the San Bernardino office (the employer) attempted to reset the Apple ID password for the phone and succeeded, but iCloud stopped working properly after that and they could not get back in. Scary!

    http://9to5mac.com/2016/02/19/apple-doj-response-fbi-backdoor/

  • AGAlumB
    AGAlumB
    1Password Alumni

    This whole thing is just weird, in that actual information is only coming out now that the lawyers have started talking. But it's creepy, because a broken iOS build signed by Apple and meant "only for one device" could easily fall into the wrong hands.

  • wkleem
    wkleem
    Community Member

    Hi,

    Have you read John Gruber's take on this issue?

    daringfireball.net/2016/02/san_bernardino_password_reset

    daringfireball.net

  • jpgoldberg
    jpgoldberg
    1Password Alumni
    edited February 2016

    This is part II of my comments. The first part was laying out some important facts. This second part is mostly in answer to why the case is relevant to us and the security of our customers.

    I meant to write a detailed explanation about why this is relevant to us and why we are supporting Apple, but I just haven't had the time.

    As @Pilar mentioned, we've already stated (last April, in the post linked above, for example) that even creating the possibility of a back door reduces everyone's security:

    1. There is no such thing as a security hole that only the good guys can use.

      This point is illustrated by the more recent Juniper ScreenOS case. There was this really cool back door created by the NSA for which only the NSA had the key. Even knowing the source code doesn't give you the back door key. So this looks like an example of a back door that only one party can use. The problem is that it is possible for someone to replace the lock on that back door so it uses a different secret key. Someone did exactly that, and we have no idea who.

      I wrote a bit about that (and more on some of the math) in When back doors go bad: Mind your Ps and Qs. So even something that looked like the ideal back door turned out to create a weakness that has since been used by parties unknown.

    2. A meta back-door

      To comply with this order (and others to follow), Apple would have to set up a unit within the company to provide such compliance. This unit would have the power to install a back-doored or weakened OS on individual devices. That unit then becomes a point of attack and not just by lawful orders.

    How is this relevant to us?

    When we design a system to be secure, we have to look at the weakest points and figure out ways to strengthen them. Note that any system that has more than one part is going to have a "weakest point". That doesn't mean that the point is weak. It only means that the other parts are stronger.

    Let me start with an example before I get into the scary bits. If we've done our job right, the weakest point of your 1Password security is your Master Password. This is why we spend so much time encouraging people to use strong Master Passwords, why we use PBKDF2, and why, in 1Password Families, we use two-secret key derivation (Master Password plus Account Key). If you have a decent Master Password, this "weakest point" is still strong, but as the least strong point it is the thing it's my job to worry about.
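
    To illustrate the idea, here is a minimal sketch of two-secret key derivation. This is my own illustration, not AgileBits' actual code; the real scheme differs in its KDF parameters and in how the two secrets are combined, and the Account Key value shown is a hypothetical placeholder:

    ```python
    # Sketch of two-secret key derivation: the unlock key depends on BOTH the
    # human-chosen Master Password and a machine-generated Account Key, so a
    # cracking attack against the Master Password alone cannot recover the key.
    import hashlib
    import secrets

    def derive_unlock_key(master_password: str, account_key: str, salt: bytes) -> bytes:
        # Slow, salted stretching of the (possibly weak) Master Password.
        pw_part = hashlib.pbkdf2_hmac("sha256", master_password.encode(), salt, 100_000)
        # The Account Key is already high-entropy; one round just normalizes its length.
        ak_part = hashlib.pbkdf2_hmac("sha256", account_key.encode(), salt, 1)
        # Combine: an attacker needs both secrets to reproduce the result.
        return bytes(a ^ b for a, b in zip(pw_part, ak_part))

    salt = secrets.token_bytes(16)
    key = derive_unlock_key("correct horse battery staple",
                            "A3-EXAMPLE-ACCOUNT-KEY",   # hypothetical placeholder value
                            salt)
    print(key.hex())
    ```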

    Private by design, but ...

    Now we are good, trustworthy people, but we are another "least strong" point. This is why we have gone to so much effort to design 1Password so that we don't have any of your secrets and we don't have the capability of acquiring your secrets. We want your secrets to remain safe even if we are compromised or turn evil. We've been calling this principle private by design.

    For example, we've set up the authentication process for Families so that absolutely no secret is transmitted during the process. Thus we cannot acquire any of your secrets during authentication. Similarly, we've set things up with two-secret key derivation so that we couldn't even attempt a password cracking attack against your Master Password.
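
    For the curious, here is a heavily simplified sketch of how authentication can work without any secret crossing the wire. It is an SRP-style exchange of my own construction with toy parameters, not AgileBits' actual protocol or code; a real deployment would use a large safe prime and the full SRP specification:

    ```python
    # SRP-style sketch: the server stores only a verifier; during login both sides
    # agree on a shared session key, yet the password is never transmitted.
    import hashlib
    import secrets

    N = 0xE95E4A5F737059DC60DFC7AD95B3D8139515620F  # toy modulus, for illustration only
    g = 2

    def H(*parts) -> int:
        h = hashlib.sha256()
        for p in parts:
            h.update(str(p).encode())
        return int.from_bytes(h.digest(), "big")

    # Enrollment: client derives x from the password; server stores only v.
    password = b"correct horse battery staple"
    salt = secrets.token_hex(16)
    x = H(salt, password)
    v = pow(g, x, N)                     # verifier kept server-side

    # Login: only the public values A and B are exchanged.
    a = secrets.randbelow(N); A = pow(g, a, N)                       # client -> server
    b = secrets.randbelow(N); B = (H(N, g) * v + pow(g, b, N)) % N   # server -> client
    u, k = H(A, B), H(N, g)

    S_client = pow((B - k * pow(g, x, N)) % N, a + u * x, N)
    S_server = pow((A * pow(v, u, N)) % N, b, N)
    assert S_client == S_server          # same session key, no secret ever sent
    ```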

    ... but

    The least strong part of this is if we delivered a back-doored client to you. Now, we've gone to great lengths to build 1Password so that our claims about its security can be independently verified. What this means is that if we were to do something nasty to 1Password in general, there are enough people poking around at it that it would be detected. We would then be drawn, quartered, fed to the lions, and made to listen to Vogon poetry. You really are safe from such nefarious behavior.

    Attack of an individual

    Note, however, that the protection you have in that scenario depended on the back-doored version being delivered to lots of people. But the concern is what if a special version is crafted to be delivered to just one person. Either that one target detects it or they don't. There is no longer that safety in numbers. (Furthermore, if we are legally compelled to do that, then getting caught doing what we are legally compelled to do is going to leave the lions hungry and the Vogon poets without an audience.)

    Currently what protects you from such a targeted attack is the fact that (a) we really aren't evil and have no plans to become evil, and (b) it would be technically hard to keep such activity secret within our company, as there are lots of people who would blow the whistle if we ever tried to build the mechanism for doing something like that. We couldn't build and deploy a system for doing an attack of the nature of "back-doored software for an individual target" without lots of us knowing that it was there. And plenty of us would scream our heads off if we ever moved to do anything like that.

    Now this isn't a perfect defense. And I'm sure that lots of people are thinking of ways that such a thing could happen. And so I will remind you that our concern is trying to find ways to shore up the least strong elements of your security. This means recognizing where those are and not being shy about acknowledging them.1 Again, the fact that this and your Master Password are the least strong parts of your security is a consequence of us doing our job right.

    OK. So the idea is that the only way you could be threatened by such an attack is if we managed to have a mechanism for delivering back-doored versions to particular individuals and we managed to keep that secret from most of the company.

    Punchline

    Now suppose that we were (or could be) legally compelled to set up such a mechanism. This would mean that such a mechanism would exist and you would have every reason to believe that it exists. Furthermore, everyone who wanted to break into your data would know that such a mechanism exists.

    Obviously we would do what we could to protect such a mechanism, but for your safety and ours we should not have such a mechanism.

    We aren't evil!

    As I've said, we designed 1Password so that your data remains secure even if we turn evil. We aren't evil, we have no plans to turn evil, but the question has to be part of security design and decisions. Like Apple, we are trying to build products that are secure against us. Therefore we don't want to have a mechanism in place for delivering weakened systems to customers.

    Legal speculation

    Now, the whole scenario I described above requires several jumps from the present case. I'm hesitant to talk about this because not only am I not a lawyer, I really try not to play one on the Internet. Obviously if we were ever confronted with something like what Apple is facing, we would lawyer up.

    Live targets?

    The sort of attack I described wouldn't work against someone who is dead. So if the precedent set by the current case is narrowly limited to devices in the possession of law enforcement, then it isn't bad for us (although it is still bad for Apple, as they have to create a mechanism for subverting devices not held by the people who set the passcodes).

    Still, the whole "having a procedure to install weakened systems for particular targets" business is a scary prospect. Even if that mechanism could lawfully be used only against clients in the possession of law enforcement, it is not a mechanism that we really want in place. As I've said before and will say again, there is no such thing as a security hole that can only be used by the good guys.

    Oh Canada!

    A United States legal decision has limited force in Canada. But it is also the case that law enforcement the world over doesn't like being locked out of data that it is legally entitled to. (Let me say again that nobody disputes that the FBI has the legal right to the data on that phone.) I believe that the implications of the current case will be global, even if not in some formal legalistic way.

    OK. That is enough for the short note I wanted to write.


    1. I'd prefer not to get side-tracked into discussions of open source, deterministic compilers, posting hashes, etc. Sure, there are things that we can do to strengthen this part of the system, but we have to measure them against their costs and the probability of the actual threat.

  • prime
    prime
    Community Member

    I've been reading so much about this; very interesting.

  • wkleem
    wkleem
    Community Member

    @prime, It is interesting and frightening at the same time.

  • AGAlumB
    AGAlumB
    1Password Alumni

    Indeed. Keeps getting weirder. I think this case will be playing out for a very, very long time.

  • Magne
    Magne
    Community Member

    @jpgoldberg

    If a weakened version of iOS is installed on the device, then the FBI would be able to start guessing passcodes at a rate of 4 per second.

    On devices with a secure enclave PBKDF2 is used so that there can be about 12.5 guesses per second.

    I'm not sure I understood this correctly. On a device with secure enclave and PBKDF2, there can be more guesses per second? Don't you mean fewer?

  • jpgoldberg
    jpgoldberg
    1Password Alumni

    Excellent question, @Magne.

    Because devices with a Secure Enclave provide rate limiting in hardware (one guess per 5 seconds after the second failed guess), Apple scaled back the protection offered by PBKDF2 (PBKDF2 eats power). Take a look at what I wrote in my earlier comment about the differences between the 5S and the 5C.

    So whether we want to say 0.2 guesses per second (one guess every five seconds) or 12.5 guesses per second depends on your view of the ability to bypass the Secure Enclave. But speaking solely in terms of PBKDF2, it is about 80 milliseconds per guess for devices with the Secure Enclave and 250 milliseconds for devices without it.
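
    In other words (my own quick arithmetic, using the figures above), the two views lead to very different worst-case times for a six-digit passcode:

    ```python
    # How the two rates translate into worst-case times for all 10^6 six-digit passcodes.
    space = 10 ** 6
    with_enclave_s = space * 5.0        # Enclave rate limit: 1 guess / 5 s
    enclave_bypassed_s = space * 0.080  # PBKDF2 alone: ~80 ms per guess

    print(f"Enclave intact:   {with_enclave_s / 86_400:.0f} days")     # ~58 days
    print(f"Enclave bypassed: {enclave_bypassed_s / 3_600:.0f} hours") # ~22 hours
    ```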

    I hope you didn't come here expecting simple answers!

  • wkleem
    wkleem
    Community Member

    Thanks @jpgoldberg for your summaries. It's still a little (or a lot!) over the TOP for my liking.

  • AGAlumB
    AGAlumB
    1Password Alumni

    Sometimes when Goldberg talks, my brain just says "NOPE"...but I find myself learning a lot anyway, because he understands this stuff well enough to answer even my silly questions. :lol:

  • jpgoldberg
    jpgoldberg
    1Password Alumni

    Years ago @Nik pushed me onto Twitter. I think he wanted to watch me squirm with a 140 character limit.

    So yeah @wkleem, my "summaries" have to be published in multi-volume sets.

  • Magne
    Magne
    Community Member

    @jpgoldberg Thanks again for a clear and lucid answer. :-)

  • AGAlumB
    AGAlumB
    1Password Alumni

    :chuffed:

  • wkleem
    wkleem
    Community Member

    Couldn't the FBI force someone to use their fingerprints on a TouchID device?

    Thanks

  • AGAlumB
    AGAlumB
    1Password Alumni

    @wkleem: It really isn't out of their way, since they're taking your fingerprints anyway. Or anyone else using physical coercion, or grabbing your hand while you're passed out drunk on the train (don't ask).

    That's why it's so important that 1Password requires the Master Password (or iOS, with the device passcode) after a restart: it's easy to just shut off your iPhone. Probably a good idea going through airport security, or before doing anything stupid. :pirate:

  • wkleem
    wkleem
    Community Member

    Right you are, Brent. I'll try to remember that, though the hard part is the REMEMBERING. :(

  • AGAlumB
    AGAlumB
    1Password Alumni

    Forgetting your Master Password is bad news...but it's still better news than finding out someone else can get into your data. I think that brings us full circle in this discussion: security has its drawbacks (inconvenience, inaccessibility), but they pale in comparison to living in a world without it.

  • prime
    prime
    Community Member

    There is another password manager that doesn't require the master password, and I didn't like that.

    I stopped looking at this case as Apple vs. the Feds and started seeing it as Encryption vs. the Feds, because this affects all companies who want to protect privacy.

  • jpgoldberg
    jpgoldberg
    1Password Alumni

    @wkleem asked

    Couldn't the FBI force someone to use their fingerprints on a TouchID device?

    In the case of the San Bernardino killer's phone this doesn't apply. It is an iPhone 5C, so it doesn't have Touch ID.

    There are apparently sensors in the Touch ID scanner that look for signs of life. These had to be simulated with some special hardware when the Chaos Computer Club demonstrated both that it is possible to lift a fingerprint and use it with Touch ID, and just how hard that is to do.

    As I frequently like to point out, every Hollywood script writer knows what's wrong with biometrics. But so do the designers of TouchID, so they were very very careful in what they did.

    First of all, we need to remember that the fingerprint is not a replacement for either a device passcode or your 1Password Master Password. It is a convenience mechanism so that you don't have to enter your Master Password (or device passcode) as often as you otherwise would. This is why it was designed so that there are many conditions under which you can't use Touch ID. For example, you must use the passcode if the phone hasn't been unlocked in the past 48 hours or if it has just been powered on.

  • wkleem
    wkleem
    Community Member
    edited March 2016

    @jpgoldberg, Do you know the differences between Touch ID sensors? I have an iPhone 5s. The iPhone 6s/6s Plus are said to have an improved version of Touch ID. The rumour is that Apple will launch another 4" iPhone this month (March 2016).

    Thanks for this informative discussion.

  • AGAlumB
    AGAlumB
    1Password Alumni
    edited March 2016

    @wkleem: I believe that the newer sensors themselves have improved hardware, which is why they read TOO DANG FAST (lol), but I don't believe that this has any impact on security: in either case, new/fast or old/slow, the Touch ID data is stored in the SoC's Secure Enclave.

  • wkleem
    wkleem
    Community Member

    It's been reported that the FBI has cracked the iPhone without Apple's help. Is there more to the issue, then? An as-yet-unknown vulnerability?

  • AGAlumB
    AGAlumB
    1Password Alumni
    edited March 2016

    @wkleem: While it's impossible to say for certain without knowing the details, I still suspect that the silly 4 digit passcode will be the key. I also suspect that the FBI knew this all along, and was simply hoping to use this case as precedent. It didn't really work out that way though. ;)

  • jpgoldberg
    jpgoldberg
    1Password Alumni

    We don't know how the FBI cracked the 5C. It might be via one of the issues that was fixed in the latest update to iOS 9 or it may be that it was something that still applies to updated 5Cs today.

    We also don't know what is behind their timing. They've had the device since the attack. My speculation is that what changed is their judgement about what sort of ruling and precedent would be set. They started out confident that they would win this case and thus set the precedent that they wanted, but the combination of the legal filings and related recent rulings suggested that they might lose the case big time. They are trying to pick a case they can win to help establish precedent for what they want. They chose this case, and then they withdrew from it when they saw it might not go their way.

    So Apple hasn't actually won a victory, they just avoided suffering a defeat. The government gets to pick the case that will be tested, not Apple or any of us.

This discussion has been closed.