Apple must help Feds unlock San Bernardino killer's iPhone – judge

Apple must assist the FBI in unlocking the passcode-protected encrypted iPhone belonging to one of the San Bernardino shooters in California. US magistrate Sheri Pym says Cupertino has to find a way to supply software that prevents the phone from automatically annihilating its user data when too many password attempts have …

  1. Anonymous Coward
    Anonymous Coward

    It's appalling that law enforcement types are having to get a court order to compel Apple to do this. A severe crime has been committed, and it should be a privilege for a company like Apple to help the investigation. Instead Apple seems to want to obstruct it.

    I wonder how obstructive they'd be if someone that Tim Cook was very fond of had been killed in the attack?

    Sooner or later companies like Apple, Facebook, etc are going to have to realise that their public image and reputation will ultimately depend on cooperation with law enforcement. For example, Facebook are now known as having hosted child porn without having done anything about it. That can't be something a certain new father called Zuckerberg is happy about.

    How big does a content hosting organisation like Facebook have to be for it to be able to avoid criminal responsibility for hosting such material? It would be crazy if no one at Facebook was prosecuted.

    Presumably Apple with their encrypted iCloud are probably hosting such material too, but have no way of knowing. That may change now. If they are forced to develop this tool for law enforcement it may be that they're going to be regularly exposed as having given offenders the means of committing their crimes. Now that's not going to look good in the papers.

    1. DryBones

      Really? I seem to recall a certain number of companies that were offering all sorts of goodies for exploiting phones; surely one of them included device imaging. Then they could brute force it all they wanted. No?

      No latex finger trick? Surely they have the prints of the owner. If this is reality for the FBI / CIA / NSA, they're really not very good, are they?

      1. Mark 85

        Fingerprint won't work in this case. Apparently they tried it according to some spot reports on other sites. It needs the PIN.

        According to CNN, they destroyed their personal phones and the hard drive from their computer hasn't been found. This phone was his work phone, issued by the county.

        I'm a tad surprised that the county didn't put a back-door code into it, since where I work we put in a way to get into the company's encrypted PCs, phones, tablets, etc. if someone dies or is terminated (voluntarily or involuntarily).

        1. Anonymous Coward
          Anonymous Coward

          I'm not surprised at all. I'm also not surprised at the court order being approved. Where it gets tricky is when this gets in front of the 9th Circuit Court of Appeals. Riverside is notoriously conservative while the 9th CCA is extremely liberal.

        2. werdsmith Silver badge

          Fingerprint won't work in this case. Apparently they tried it according to some spot reports on other sites. It needs the PIN.

          Fingerprint won't work simply because it's a 5C and doesn't have a fingerprint reader.

          1. Tom 13

            @werdsmith

            If it had a fingerprint reader would it necessarily work? Or to frame the question even more clearly: Do YOU trust fingerprint readers on mobes to work?

            I sure don't and I don't really keep anything on my phone that NEEDS protecting. Instead I use a PIN I can easily remember.

        3. Tom 13

          Re: his work phone issued by the county.

          I hadn't heard that. You're quite right, that makes this an even more interesting case.

          I work for a government agency and part of securing the phone is installing software that does allow the user to reset the PIN in case they forget it. Most of the time we in IT can't use that route because the reset is tied to the email of the person owning the phone, but in a case like this we could be authorized to change that password so we could perform the PIN reset. It's not entirely foolproof since the agency does allow people to use personal Apple ID accounts, but even then you should be able to find the appropriate account, get the court order to reset that password, then proceed with the PIN reset.

      2. NotArghGeeCee

        [Tin_Foil] Or possibly they are good, they have already cracked it, have the information they want, and are launching this lawsuit to make it look like use of Apple gear is safe-for-terrorists-so-go-right-ahead-and-use-it-all-you-want-with-impunity(tm) [/Tin_Foil]

        1. Anonymous Coward
          Holmes

          Or possibly they are adequate, they cracked the sham crapto easily and quickly, have the information they want, have completed the resultant investigations and are launching this lawsuit to make the data they've been using admissible as evidence against those it implicates - without broadcasting the reality of their decraption methods to the world.

          Like the little brother of their parallel construction policy - only here the methods are faked and they'll actually present genuine data. Would be something of an improvement, wouldn't it?

          1. tom dial Silver badge

            The police may need to have a warrant in order to use any search products as evidence in a prosecution, although there may be uncertainty about whom it should be directed to. But Apple does not own the phone, so assistance they provide in searching the phone probably is not relevant to admissibility of anything obtained as evidence. However, they may want to be seen as properly guarding customer data and have insisted on being compelled to assist. This may benefit the government as well by leaving it somewhat uncertain whether they actually have the ability to break encryption on phone data, and if they can, how quickly.

            This has nothing to do with parallel construction as sometimes has been used to obscure use of foreign intelligence information to obtain warrants.

        2. Spleenmeister

          Quite the opposite, I suspect. They are using a false-flag attack to finally get Crapple to release the keys to the (encryption) kingdom on iPhone. Mainly so they can rifle through everyone's phones, at will, for anything they damn well please. No tin foil necessary. It's happening along with plenty of other shenanigans commonly cited as "tin foil" worthy.

      3. CapnBob

        If the 5c had a fingerprint reader that might even work.

    2. Eddy Ito

      Why shouldn't law enforcement have to follow the rule of law and get a court order? Is there some magical level of crime that automatically eliminates proper procedures we require law enforcement to undergo? Who gets to draw the line that says if the crime is "X" bad LEOs can do whatever they damn well please to get information? How long do you think it will be before that line moves far enough to make the Stasi look like reasonable folk?

      1. LucreLout

        @Eddy Ito

        Why shouldn't law enforcement have to follow the rule of law and get a court order?

        That's exactly what they have done. You may disagree with their being awarded access to the data, or not, but you can't argue they didn't follow the legal process via the courts.

        1. Eddy Ito

          @LucreLout

          I was not making any comment about what has transpired but responding to the OP's statement; which was

          It's appalling that law enforcement types are having to get a court order to compel Apple to do this.

    3. jonathanb Silver badge

      Other parts of the FBI want Apple to make it impossible to do this so that phones are useless when stolen, and therefore people don't steal them. If Apple can overcome their own security, anyone else can.

      1. David 138

        If someone stole your phone they would probably just wipe it, encrypted data and all. Encryption gives most people little or no useful protection in this case. However it does benefit criminals. Apple are just protecting their marketing, not their customers.

    4. Spleenmeister

      You are so missing the point it's funny. Never mind the myriad of reports from the scene while this tragedy was unfolding that point the finger of suspicion diametrically away from these two losers. Some people will believe anything, which is what US law enforcement and the US Govt depend on, I suppose.

      1. LucreLout

        @Spleenmeister

        You are so missing the point it's funny. Never mind the myriad of reports from the scene while this tragedy was unfolding that point the finger of suspicion diametrically away from these two losers.

        *yawn* another tin foil hatter; just what the web needs.

        Ok, I'll bite. One, just one, credible news agency reporting "from the scene while this tragedy was unfolding that point the finger of suspicion diametrically away" from the two dead losers. No? But then, there never is.

    5. Triggerfish

      @OP

      Because rather than wanting just this information, they seem to be using it as an excuse to force password cracking on Apple phones so they can play around with all your information, possibly without warrants.

      As far as I am concerned Apple are doing the correct thing.

    6. Graham Cobb Silver badge

      ...it may be that they're going to be regularly exposed as having given offenders the means of committing their crimes. Now that's not going to look good in the papers.

      Why? It doesn't seem to cause car companies, electricity companies, or grocery stores any problem that they are used by criminals as well as non-criminals. What makes you think it would be a problem for Apple?

  2. Anonymous Coward
    Anonymous Coward

    Squeal like a pig

    Don't the ruling classes squeal long and hard when they can't get something THEY want?

  3. Phil Kingston

    " This code must only work on Farook's phone, identified by its serial numbers, and no other handset."

    Gonna be hard to test then.

    1. Anonymous Coward
      Anonymous Coward

      This implies it is possible to reflash the phone without unlocking it first.

      Presumably this means you just power cycle, enter into the boot loader, and the boot loader will happily reflash firmware without any confirmation that you are the owner of the phone.

      I can understand why this is done - the main firmware may be non-functional and you need a reflash to fix it.

      However the fact that the boot loader is unaware of the locking/unlocking mechanism sounds like a weakness to me. The only protection you have is that the boot loader will only flash signed firmware. But what if you took someone's phone, and loaded an old version of firmware with known vulnerabilities?

      1. LucreLout
        Gimp

        @AC

        However the fact that the boot loader is unaware of the locking/unlocking mechanism sounds like a weakness to me. The only protection you have is that the boot loader will only flash signed firmware. But what if you took someone's phone, and loaded an old version of firmware with known vulnerabilities?

        But what if instead of all that you just hit them with a $5 wrench until they give you the code?

        https://xkcd.com/538/

        I'm not sure requiring unfettered physical access to something you can then break into counts as much of a weakness. It works for all physical world applications of security, from locks to safes etc.

        I'm no Apple fanboi, but I think if the only way to get in was with a court order and Apple's help, I'd consider my phone's security good enough.

        1. cantankerous swineherd

          the threat model here is a democracy under the rule of law. the xkcd threat model is of a fascist dictatorship or a bunch of gangsters, in which case all bets are off. hth. hand.

  4. Phil Kingston

    Correct me if I'm wrong, but isn't it a bad thing for the court to have published serial and IMEI numbers?

    Good to see some common-sense at the end of the court order though - which I read as, if they genuinely try but somehow stuff it up and the data goes bye-bye that's OK.

  5. ShadowDragon8685

    This is an absolutely chilling, appalling thing for a court to order.

    If this works, I hope and expect that Apple's next step in securing people's devices is "your device doesn't trust anyone, not even us, unless you tell it to." I.e. no bypass, and some kind of tamper sensors that fry the phone completely if someone tries.

    I really, really hope that they either appeal it successfully, or (and this is unlikely, I admit,) take a principled stand, say "okay," and whatever they give the FBI is an uber-bricking nuke that completely melts the phone, and then tell the FBI to see them in court, before a jury of twelve.

    Even if they lose the court case, any fine the FBI can levy will be annihilated by the good press (and sales) resulting from making it clear that they will not, under any circumstances, comply with any outrageous demand to bypass the security on someone's phone.

    1. tom dial Silver badge

      It would be interesting to see an explanation of the rather extreme claim that "this is an absolutely chilling, appalling thing for a court to order." It is not materially different from a case in which a locksmith might be asked to assist in opening a safe to assist police in executing a search warrant.

      1. Robin

        "It is not materially different from a case in which a locksmith might be asked to assist in opening a safe to assist police in executing a search warrant."

        Isn't the analogy more like the feds asking a locksmith to provide a skeleton key that will open any safe? It's not like they're going to only use it this one time, once they have the facility.

        p.s.

        [After walking in unannounced to Dutch's office]

        Dutch Gunderson: Who are you and how did you get in here?

        Frank: I'm a locksmith. And, I'm a locksmith.

        1. D@v3

          @ Robin

          The way I understand it from the article is that they want a tool that will only work with a device whose serial numbers match the one in their possession. Which, if they then wanted to use it on another phone, would require them going back to Apple, with a new serial number, for them to create a new 'key'.

          With the locksmith analogy, to me, it is more like the police going into a locksmith's with a safe, and the locksmith somehow making a new key from the inside of the lock. They can then use that key to open that lock, but it won't do them any good with any other safes.

          Of course, what they say they want, and what they _actually_ want could well be very different things.

        2. tom dial Silver badge

          No, it is not asking for the logical equivalent of a skeleton or master key. It is asking for assistance to unlock exactly one phone. The appropriate analogy in the case of a locked safe is skilled assistance to circumvent a combination lock. If other posters who appear to know more than I about Apple's implementation are correct, the key depends in part on physical characteristics of the security module that are unique to each phone; there is no master key.

          It is true that the procedures, once developed, will be applicable in other cases, but law enforcement access, with a proper court order or warrant, to material in their physical possession is well within the scope of what we are used to and what was built into the US Constitution from its beginning.

      2. Anonymous Coward
        Anonymous Coward

        Explanation? Good luck with that

        "It would be interesting to see an explanation"

        The OP I'm afraid comes across as just another hysterical juvenile anti-establishment SJW. Coherent arguments are not their strong point. In their deluded paranoid minds, The Man cracking a phone of a terrorist is morally equivalent to the actions carried out by that terrorist.

        1. cantankerous swineherd

          Re: Explanation? Good luck with that

          so the man's welcome to crack your phone?

          1. tom dial Silver badge

            Re: Explanation? Good luck with that

            With a warrant based on probable cause, oath or affirmation, and particularly describing what is to be obtained: yes.

            That is what we consent to under the Constitution and Bill of Rights, and is exactly what is being attempted in the case at hand.

      3. cantankerous swineherd

        so could the Chinese police then compel a US company to break into a US device (e.g. that of a US ambassador they found in a bar) on Chinese soil?

        1. Tom 13

          Re: US ambassador that they found in a bar

          Not very smart, are you? The whole point of diplomatic immunity is that you can't legally do anything to an ambassador. So it would actually be against international law for them to seize the phone in the first place.

          Try again.

      4. Captain Queeg

        I might be reading it differently but to me the claim stands on accessibility.

        Opening safes is a slow process - needing a trained locksmith each time, and physical access, at a physical location, to a (comparatively) large, static, bulky object.

        Done the right way, a backdoor could be blindingly fast, remotely deployed and more or less automated. Also, done correctly, it would be invisible.

        Whatever the rights and wrongs of this one case, it's the thin end of the wedge that is chilling and appalling.

        1. ShadowDragon8685

          Exactly. Thank you. Each and every safe has to be opened, one by one. What they're asking Apple to do is either make, or go 95% of the way to making, a digital lightsaber that can open every safe with just a flick of the wrist.

      5. ShadowDragon8685

        Except it is. They are attempting to compel a locksmith who's staked his commercial claim in the unbreakability of his lock's security to provide them with a skeleton key which can open any of his locks. This will cause faith in his locks to take a nosedive, as there is no guarantee whatsoever that such a key, once known to be extant, will not fall into the hands of criminals, corporate competition, private malefactors and foreign intelligence services, all of whom will have vested interest in obtaining that key.

        And there is absolutely no way in hell to ensure that it will "run only on this guy's phone" and "only on Apple or FBI computers." Once a software weapon like this is created, it can be stolen, or recreated, by others.

    2. Anonymous Coward
      Anonymous Coward

      Apple has ALREADY taken that next step

      There's no bypass as of iOS 8 in Sept. 2014 - that's what has some in the government so whiny about Apple and encryption. They changed it so they don't hold the encryption key for a user's device; the only place it is stored is in the secure enclave in the phone. The only way that key can be accessed is via fingerprint (if you have that enabled) or via the password/passcode (depending on whether you use 4/6 digits or the full keyboard for an unlimited-length password). Thus it is impossible for Apple to unlock it, even with a special version of firmware installed on the phone.

      However, since OS updates control how many attempts you get to unlock it, they've found a loophole - compel Apple to provide them a one-off OS update that allows unlimited attempts - and I assume no delay between attempts, so they can make some poor first year agent get carpal tunnel trying 1,010,000 possible passcodes. They're lucky a password wasn't used instead, there's no brute forcing that.
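      For reference, here's the arithmetic behind that 1,010,000 figure, plus a rough worst-case time - a sketch only; the ~80 ms per attempt is the commonly cited hardware-bound key-derivation cost from Apple's security documentation, an assumption rather than a number from this thread.

      # Where "1,010,000 possible passcodes" comes from, plus a rough worst case
      # assuming ~80 ms of key derivation per attempt (assumed figure, see above).
      four_digit_pins = 10 ** 4
      six_digit_pins = 10 ** 6
      total = four_digit_pins + six_digit_pins
      print(total)                                   # 1010000

      seconds_per_attempt = 0.08                     # assumption
      print(total * seconds_per_attempt / 3600)      # ~22.4 hours worst case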

      1. werdsmith Silver badge

        Re: Apple has ALREADY taken that next step

        so they can make some poor first year agent get carpal tunnel trying 1,010,000 possible passcodes.

        The FBI also want Apple to help implement a way to rapidly try different passcode combinations, to save tapping in each one manually

    3. rh587

      "This is an absolutely chilling, apalling thing for a court to order."

      No it isn't. It's the equivalent of the Feds identifying that the suspect has a safe deposit box, and getting a court order requiring the deposit company to hand it over, and render assistance in opening it (in the absence of the suspect's private key).

      For the Feds to go to a court and request this and for the courts to say "Yes, this is reasonable" is exactly how due process is supposed to work! Whether they're getting a court-signed warrant to search your house, a subpoena to compel a witness to appear in court, or a court-order for a telco to disclose your call history.

  6. Anonymous Coward
    Anonymous Coward

    Damned if you do?

    So Apple provide a back door, which I think they've previously denied is possible, and the TLAs demand it all the time - indeed, they demand it built in.

    Or.

    Apple cannot provide a back door and the TLAs howl all the way up to the highest levels about how unfair and downright dangerous it is to allow the population to have secure crypto and demand/get a ban on secure crypto.

    Don't ever forget, this is happening in a seriously fucked up country that has a past history of attempting to force backdoored crypto and of restricting the sale of encryption technologies as 'munitions'.

    1. Anonymous Coward
      Anonymous Coward

      Re: Damned if you do?

      Clearly the feds are going to get all the mileage they can out of this. Either "this is terrible because even in a major case like this we were unable to break Apple's encryption and access the phone" or "this is terrible because we had to go through a big hassle of getting a court order and the delay cost precious time".

      Hopefully Russ Feingold can take back the seat he lost in 2010 this fall; we need more guys like him and fewer CIA dupes from both parties like Dianne Feinstein and Richard Burr who support laws requiring encryption backdoors.

  7. codebeard

    I wonder if the flash memory can simply be copied into an emulator that can rewind after every PIN attempt? No need to write anything onto the firmware itself.

    Personally I think this is a reasonable request. If you "encrypt" your files with a 6-or-so-digit PIN, that's more of an inconvenience than an actual attempt at security. For comparison, my password storage is encrypted with a random 25-character alpha + numeric + symbol password.

    6 digits = 20 bits (approximately)

    25 random characters = 155 bits (approximately)

    The latter would take 4 x 10⁴⁰ times longer to brute force.
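    A quick sanity check of those two figures - a sketch only; the 72-symbol alphabet below is an assumption chosen to roughly match the ~155-bit estimate above:

    import math

    pin_space = 10 ** 6            # 6 decimal digits
    pwd_space = 72 ** 25           # 25 chars from an assumed ~72-symbol alphabet

    print(math.log2(pin_space))    # ~19.9 bits
    print(math.log2(pwd_space))    # ~154 bits
    print(pwd_space / pin_space)   # ~2.7e40, i.e. tens of orders of magnitude longer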

    1. Anonymous Coward
      Anonymous Coward

      No

      The device key is actually stored in the secure enclave, which is essentially a tiny isolated computer on the SoC. If they removed the flash and copied its contents it wouldn't help them, because it is encrypted with a 128-bit AES key that is unlocked via the 6-digit PIN, not generated from it.

      I think if the terrorists went to the lengths of destroying their personal phones and hard drives, the fact a PIN instead of a real password was used on the work phone (and it wasn't destroyed along with everything else) means there is probably absolutely nothing useful on it.

      But that won't stop the FBI from using this as propaganda in their war against encryption.
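      To illustrate why an offline copy of the flash doesn't get you anywhere, here's a purely illustrative sketch of entangling a passcode with a device-unique secret - this is not Apple's actual construction, just the general idea:

      import hashlib, os

      DEVICE_UID = os.urandom(32)   # stands in for the fused, unreadable hardware key

      def derive_key(passcode: str) -> bytes:
          # Deliberately slow KDF keyed with the device secret: without the UID,
          # which never leaves the silicon, the PIN alone gets you nothing.
          return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000)

      print(derive_key("1234").hex())   # only this "device" can reproduce this key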

      1. Anonymous Coward
        Anonymous Coward

        Re: No

        > If they removed the flash and copied its contents it doesn't help them, because it is encrypted with a 128 bit AES key...

        I wonder how much computing power you could crowdsource for the purpose of uncovering a group of mass murderers?

        Like the SETI thing was, but more of a search for murdering bastards rather than extra terrestrials.

        1. g e

          Re: No

          Search for

          Murderous

          Unwanted

          Gunmen

          ?

        2. Anonymous Coward
          Anonymous Coward

          Re: No

          > I wonder how much computing power you could crowdsource for the purpose of uncovering a group of mass murderers?

          Certainly not enough to try 2^127 key combinations in a feasible timescale.

          With 10 billion devices, *each* trying 10 billion keys per second, it would take about 54 billion years.
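          The arithmetic behind that, for anyone who wants to check it (a sketch; the device count and per-device rate are the hypothetical numbers above):

          SECONDS_PER_YEAR = 365.25 * 24 * 3600

          expected_trials = 2 ** 127          # on average, half of the 2^128 keyspace
          devices = 10 ** 10
          keys_per_second_each = 10 ** 10

          seconds = expected_trials / (devices * keys_per_second_each)
          print(seconds / SECONDS_PER_YEAR)   # ~5.4e10, i.e. about 54 billion years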

          1. Anonymous Coward
            Anonymous Coward

            Re: No

            The FBI have probably already tried this, but depending on how old the iPhone 5C is (and I'm assuming the now-deceased owner used it from new for months or years), it might be possible to analyze the glass of the phone screen and detect minute variations of wear, scratches etc. caused by particles of grit on the user's skin. The same PIN code, typed repeatedly over the course of days, weeks and months, might be detectable under certain kinds of microscopic inspection available to the FBI. This data could then be used to narrow down the search space for the PIN code.

            1. NB

              Re: No

              All reports seem to indicate that the device in question is locked with a 4-digit PIN code. Let's say you can identify which 4 digits those are: assuming all four are distinct and each is used once, 4! == 24, so there are only 24 possible orderings - but you only get 10 tries before the device either locks or wipes itself on the 10th failed attempt.

              Tricky...
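              A tiny sketch of that enumeration (the four digits are made up for illustration; repeated or unused digits would push the count back up):

              from itertools import permutations

              digits = ('1', '7', '3', '9')                     # hypothetical known digits
              candidates = [''.join(p) for p in permutations(digits)]
              print(len(candidates))                            # 24 == 4!
              print(candidates[:3])                             # ['1739', '1793', '1379']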

          2. Anonymous Coward
            Anonymous Coward

            Re: No

            >With 10 billion devices, *each* trying 10 billion keys per second, it would take about 54 billion years.

            There's always some mathematician to come along and ruin the fun.. :D

      2. codebeard

        Re: No

        The device key is actually stored in the secure enclave, which is essentially a tiny isolated computer on the SoC.

        Yes, you're right. And no doubt the counter for the number of failed attempts is stored there also. I wonder if there is some way to trick the SoC into reading the counter as 0 every time, or induce a failure to write to the value.

  8. Anonymous Coward
    Anonymous Coward

    So many potential issues:

    If Apple refuse the order saying that it is impossible for them to do, will they be ruled in contempt of court?

    What if they accidentally brick the phone? Will they be ruled in contempt of court or at least charged with impeding justice?

    If the buggers are dead, then what's the point anyway?

    1. tom dial Silver badge

      What's the point?

      The direct perpetrators of the crimes are dead, but there are others, such as his friend who bought some of the weapons, who has been charged in the case, and family members with possible advance knowledge of their intent who may be under consideration for prosecution. There may also be evidence (possibly carrier business records) that suggests the involvement of others.

      This incident was a major crime by any reasonable standards, and there is no more reason to be concerned about searching the phone (or computer, if they had the storage) than there was about searching the Farook residence.

      1. Anonymous Coward
        Anonymous Coward

        Re: What's the point?

        > This incident was a major crime by any reasonable standards, and there is no more reason to be concerned about searching the phone (or computer, if they had the storage) than there was about searching the Farook residence.

        My issue is not with searching their phone, which by any standard is reasonable and they do have a court order so they have followed the proper procedure.

        I was more interested in Apple's role in this since they are an independent party. An independent party being compelled to assist in an investigation is an interesting development in my view.

        If I'm walking by a police station and the cops need some guys for a line up, as a private citizen can they compel me to assist them? They wouldn't have a court order but I'm sure that you understand where my thoughts are going.

        1. tom dial Silver badge

          Re: What's the point?

          I believe the laws support requiring citizens to assist police officers within reason and punishing those who refuse. It would be illuminating to see a lawyer's comment on that and other issues raised here.

    2. P. Lee

      >If the buggers are dead, then what's the point anyway?

      Because they are dead there is no one to complain. Once the precedent has been set, the process can be automated, allowing non-NSA agencies to trawl for data. Yes, the process only works for one phone. Hello Process, meet Mr Batch Job.

      It isn't like employing a locksmith to get access, in the same way that torrents are not like lending a DVD to a friend.

      1. Anonymous Coward
        Anonymous Coward

        "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized."

        *against unreasonable searches and seizures*. IMHO mass surveillance is an "unreasonable search". Searching a phone - and only that - of two people who killed many people is not unreasonable, under a warrant issued for such cause, describing the *specific* place to be searched. Let's not become Talibans ourselves when it comes to investigating explicitly violent crimes.

        Otherwise you should extend the right not to be searched to any place or property, and to any form of assistance - phones are not something so special.

        Laws to break encryption by default are bad. Denying by default any attempt to investigate a specific violent crime is bad as well.

  9. allthecoolshortnamesweretaken

    A bit of everything, I suppose. Test case for the legal side / setting a precedent. Test case for the technical side. Test case for 'can we make Apple play ball?' Unwillingness to disclose full abilities (and yes, this works both ways) of the TLAs involved. Maybe getting data that is admissible in court (but that's rather a front than a reason - the perps are dead and so are their private phones. Sure, it's nice to have all the data to wrap things up and have a file without any loose ends, but it's not like a conviction hangs on this).

  10. Anonymous Coward
    Anonymous Coward

    Another reason why you should use a password not a PIN

    I assume it is coming up with the number screen where you type in your PIN and that's how they know it is a 4 or 6 digit PIN. Of course iPhones have supported a password using the full text keyboard for ages, if the terrorist had used that the FBI would have little chance of brute forcing it unless she used 'Allah' as her password or something else easy to guess.

    OK, maybe that's a pretty flimsy reason as most of us don't have to worry about the FBI getting a court to compel Apple to create and apply a special one-off OS update that allows them unlimited guesses to break in to your phone.

    Despite what the first post AC who got downvoted into oblivion thinks, I say good on Apple for making them work for this. I'm sure the feds are planning to use this as a showcase example of why we need a law to require backdoors, but the government has no one to blame but themselves. They're the ones who secretly violated the Constitution in multiple ways and forced Apple's hand into making it so they no longer held the encryption key for people's phones!

    1. tom dial Silver badge

      Re: Another reason why you should use a password not a PIN

      The government forced Apple's hand? Who forced Google's hand then, when they made encryption available in Android version 3, around 2011, that they did not hold the key for? While Apple eventually realized that their encryption, with its back door to help their forgetful customers (and bearers of search warrants), was a liability, they do not deserve much credit for closing that back door only after it became an obvious one.

      1. Anonymous Coward
        Anonymous Coward

        Re: Another reason why you should use a password not a PIN

        back in your box fandroid...

      2. Velv
        Boffin

        Re: Another reason why you should use a password not a PIN

        @Tom Dial

        Apple has implemented hardware encryption of the iPhone since the 3GS in 2009, their biggest fault being the four-digit PIN protecting the OS (which is a separate debate). Android's drawback in this instance is the diverse hardware it operates on, with not all vendors or models including the necessary hardware. It can still perform software encryption, with a similar attack surface in the OS.

        Suffice to say both iOS and Android have been attempting to do the right thing for 5+ years (pre-Snowden), and despite there still being weaknesses, properly configured they are sufficiently secure for most personal and corporate users.

        1. tom dial Silver badge

          Re: Another reason why you should use a password not a PIN

          But by widely published reports, until iOS8, Apple retained the ability to unlock phones. My comment had less to do with the details of the implementation than with Apple maintaining a back door that enabled them to bypass it. Google, irrespective of whether you like or hate them or the robustness of the underlying encryption, did not do that. The assertion that Apple, before iOS8, was trying to do the right thing may be true, but their understanding of "right thing" changed significantly in 2014.

  11. corestore

    In next week's news...

    iOS and Android updates released which will allow boot-time passphrases of unlimited length for added security of encrypted devices...

    1. corestore

      Re: In next week's news...

      (point being it's useful to be able to distinguish between a boot-time password to decrypt device and a simple everyday screenlock password... the former should have a very high level of security; the latter should be convenient; we trade some vulnerability for that convenience if an adversary gets hold of a phone that is powered on)

    2. John Robson Silver badge

      Re: In next week's news...

      Along with a requirement to unlock the phone to start it charging when it's powered on...

      The clock is a ticking...

  12. Tessier-Ashpool

    This will be fun to watch. iPhone security is pretty darned good. Since any firmware upgrade requires a passcode entry, there is a catch-22 situation here. Now, if only there were a known flaw in the boot loader...

    1. Anonymous Coward
      Anonymous Coward

      IMHO an attack to obtain access would require some sophisticated technique that involves working directly on the hardware to put it into a state where some checks are in some way bypassed.

      It's a bit funny when I think that many commentards here probably hailed the guy(s) who broke the PlayStation encryption (so they could play pirated games for free, or install Linux), but are horrified if Apple breaks its own to help in a multiple-murder investigation.

      And just like the PS, don't believe nobody is experimenting with techniques to attack phones and other devices directly to obtain information from them. Apple surely has the advantage of knowing exactly how the system was designed - the only issue is, if any attack is feasible (and one probably is), how to keep it limited to specific cases. Of course, if it requires physical access to the phone and some complex equipment (and skills), it's not the easy backdoor for mass surveillance some agencies would like.

  13. Steve Davies 3 Silver badge

    Apple are in a no win situation.

    I hope that Google (sorry Alphabet) will support them here because if Apple are forced to allow the FBI in then it won't be long before Google will have to do the same for Android.

    Perhaps we will see two iPhone models? One for use in the USA (with a backdoor) and one 'Not for sale or use in the USA'.

    This will run and run methinks.

    1. SundogUK Silver badge

      Re: Apple are in a no win situation.

      "Perhaps we will see two iPhone models? One for use in the USA (with a backdoor) and one 'Not for sale or use in the USA'."

      Damn well hope so.

      1. Anonymous Coward
        Anonymous Coward

        Re: Apple are in a no win situation.

        Theresa May (UK Home Secretary) may suggest a hidden feature where her own fingerprint is programmed into every iPhone, which after a five-second delay is tested for, so she, and she alone, has the power to unlock any UK iPhone.

        She'd love that, the power-crazed bitch.

    2. tom dial Silver badge

      Re: Apple are in a no win situation.

      There will be no need for more than one model, as authorities in every country will take exactly the same position as the US government: that law enforcement authorities may conduct searches as prescribed by national law and may seek, and sometimes require, assistance of private parties to do so.

  14. Mr Eve

    Missing the point

    Surely the problem here is that IF Apple say yes and provide the tech to allow law enforcement to break into the phone, what's stopping law enforcement then abusing that technology and using it in future cases without a court order?

    If Apple can't find a way to say 'no', then Apple's best ploy here is to make it look like they've spent lots of time thinking about it, then make out the challenge to be incredibly hard (no matter how hard it is in reality), and then charge a large consulting fee to administer this and explain that it's a one-off and the same work would be involved if it was required to do so again in the future...

    1. Anonymous Coward
      Anonymous Coward

      Re: Missing the point

      What stops law enforcement from abusing the law for other kinds of searches and seizures, without a court order? Actually it's the law itself, those overseeing it, and you, as a citizen who elects his or her representatives.

      And do you believe nobody else is trying to find a way to break into a phone? Encrypted devices are not new at all; spy agencies have been working for years to break into each other's devices, and now there are the commercial ones as well.

      Maybe the NSA is already capable and just doesn't want to let you know, so they ask Apple to obtain the evidence.

  15. 45RPM Silver badge

    Lack of joined up thinking?

    It seems to me that weakening a device's encryption, or mandating the provision of a back door, does little to assist law enforcement - and nothing to protect the personal details of everyday users.

    The reason that a mandated back door wouldn't actually help law enforcement should be obvious. If a back door is mandated then any reasonably intelligent crim or terr'ist will roll their own, backdoorless, encryption system - and the forces of law and order will continue to be stymied and will have to work a little harder at gathering evidence.

    If the crim concerned is rather dimmer, and hasn't taken precautions, then, like a blundering elephant, they'll also have left a trail of other evidence, and the evidence contained on the device will be unnecessary - there being more than enough to convict without it.

    If we allow our lawmakers to mandate a backdoor into our devices then we usher in a new age of fraud and theft from our bank accounts.

    We should laud the device makers for making their devices so secure - not criticise them for it.

  16. John Savard

    Security

    The cryptanalysis capabilities of the NSA and the GCHQ are a closely guarded secret. Therefore, this secret will not be unveiled simply to assist in a criminal investigation. Therefore, it cannot be inferred that these capabilities are unequal to extracting the data from an iPhone simply because of this criminal case.

    Clearly, it should be possible to bypass a restriction on how many times one can try passwords on a smartphone. Copying all the encrypted data out, and having in hand the full source code to the phone's software, it should indeed be trivial. Given this, there is no reason to endanger any capabilities the NSA may have to avoid it.

    However, one might note that, given the sophistication of today's secret-key algorithms, at least, barring some flaw in Apple's security, it would be reasonable for other reasons to feel that the NSA could not read data on an iPhone by means of cryptanalysis. But this court case is not evidence one way or the other.

  17. Locky

    Asking the wrong questions

    Why demand access to the device at all? Apple are quite right to say the phone is someone else's property, and so they can't unlock it without permission (shame they forget this when the user repairs the screen....) BUT

    All the data the FBI want will probably be uploaded to iCloud. So the FBI should demand access to Apple's servers. Sorted.

    1. tom dial Silver badge

      Re: Asking the wrong questions

      According to other news reports, the phone in question belongs to Farook's former employer, a public agency that almost certainly consents to FBI access to the data.

  18. Anonymous Coward
    Anonymous Coward

    Cupertino has to find a way

    or?

    1. BurnT'offering

      Re: Or?

      Not

      1. BurnT'offering

        Re: Not

        Downvoted? Too wordy

    2. This post has been deleted by its author

  19. PassiveSmoking

    I don't have a problem with this.

    The authorities have gone through all the proper channels and gotten the warrants necessary, and it's hardly as if the owner of the device was some innocent little angel. I don't see what's wrong with having Apple dd the machine's flash drive so that the data can be brute-forced out, or do whatever else is required to preserve the data so it won't get destroyed after x attempts to unlock with random PINs. That's perfectly fine as far as I'm concerned.

    What I have a problem with is the war on encryption, the whole "Bad people use encryption, therefore encryption is bad and if you use it you must be a bad person" rhetoric, and the attempts to kill or neuter encryption to the point where it's useless in the name of "security" so we can all be spied on by the powers that be without any oversight or legal protection, innocent and guilty alike. That's a police state. It's also a recipe for disaster because if you make it easy for the security forces to access private data you also make it easier for criminals, terrorists, blackmailers, etc to do the same.

  20. TJ1

    Apple immediately contests the order

    February 16, 2016 A Message to Our Customers

    The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.

    This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.

    http://www.apple.com/customer-letter/

    1. AndyS

      Re: Apple immediately contests the order

      Very clear response, and straight to the point. Thanks for the link.

      Essentially they are saying that if they do this once, and it becomes public how it is done, it can be replicated by anyone, on any phone.

      If that's true, does it not imply that the security of the encryption keys is "security through obscurity?"

      1. TJ1

        Re: Apple immediately contests the order

        "If that's true, does it not imply that the security of the encryption keys is "security through obscurity?" "

        As I understand it, far from it. The whole point is that Apple's encryption scheme design is very good, to the point that the only way for the FBI to attempt to attack it is via a brute-force "enter lots of possible pass-codes" process.

        Apple is being ordered to assist that process by creating a custom iOS firmware update that allows programmable, non-human pass-code entry attempts.

        It is unclear from what I've read whether it is 100% certain that the iPhone 5C has hardware pass-code entry protection. Some sources claim the time-delay and quantity limits are implemented in the silicon (the Secure Enclave). Others seem to suggest those limits are imposed by iOS.

        If the latter, then in theory, a custom firmware upgrade could be used to automate a brute-force attack.

        That still doesn't answer the question of whether the pass-code is required in order to use the firmware upgrade.

        The court order suggests that it can be done without needing the pass-phrase, and without writing to the Flash memory (it uses the phrase "in-RAM"), which suggests there may be a way using a dedicated hardware harness (think JTAG debugging) to run a modified firmware without installing it, and thus avoiding pass-code entry.

        Rather like on Android devices being able to hook it up to a PC and do:

        fastboot boot local/kernel-image-file-name local/ram-disk-image-file-name

        http://android-dls.com/wiki/index.php?title=Fastboot#Fastboot_Commands

        1. Anonymous Coward
          Anonymous Coward

          Re: Apple immediately contests the order

          > Apple is being ordered to assist that process by creating a custom iOS firmware update that allows programmable, non-human pass-code entry attempts.

          And the only thing which prevents the FBI from doing that themselves, apart from sufficient programming clue and source code, is the firmware signing key which will be buried somewhere in Cupertino.

  21. kmac499

    Apple Standards...

    A lot of this flows from Apple's view of who the 'owner' of the phone is. Apple has consistently operated on the basis that the hardware may be the property of the owner, but the software is licensed and the accounts belong exclusively to the licensee.

    Hence Apple allegedly bricks phones which have had third-party repairs, and it refuses to grant access to the legal executors of deceased iThing owners.

    Apple is claiming for itself a level of confidentiality and a duty of care on a par with Doctors, Lawyers and Priests. They are a phone manufacturer, not a privileged profession or religion.

    At best I would grant Apple the same level of confidentiality as a bank. If the police turn up with a warrant, you give them what they ask for. The law-abiding can rely on privacy; the crims cannot rely on secrecy.

    As an afterthought: the next time Apple's legal minions go into court demanding information from their competitors to defend their spurious patents, I do hope the judge tells them to fuck right off.

    1. TJ1

      Re: Apple Standards...

      "Apple is claiming for itself a level of confidentiality and a duty of care on a par with Doctors, Lawyers and Priests. They are a phone manufacturer not a priviliged profession or religion."

      I disagree. Apple says it has complied with legal and other requests for data it holds (I assume that mainly means the iCloud back-ups - which were a month old) and has its engineers advising the FBI technicians.

      All that data is encrypted and the only place the key is stored is in the target iPhone 5C. The key's component parts cannot be extracted from the device's silicon which can only be activated by the device pass-code.

      This issue is about the government compelling a company to attempt to crack its own product security, using its own resources, by creating a customised operating system image and finding a way to install it onto the target device so as to avoid the pass-code entry limitations.

      1. Anonymous Coward
        Anonymous Coward

        Re: Apple Standards...

        It would not be the first time something like this has happened. What about a courier, for example, that re-routes a parcel to help in an investigation? Whether we want to assert that an investigation should work only on publicly available evidence, and should not access anything else (even under a proper warrant), is another matter - as stupid as the enforcement of a backdoor.

      2. kmac499

        Re: Apple Standards...

        -> TJ1

        "This issue is about the government compelling a company to attempt to crack its own product security, using its own resources, by creating a customised operating system image and finding a way to install it onto the target device so as to avoid the pass-code entry limitations."

        I agree totally with you on that. My point is that Apple has a track record of defining the common good for its customers in ways which, the cynic in me says, could be seen as solely for Apple's corporate good.

        Is Tim Cook defending the right of citizens in more authoritarian regimes to have privacy from their state, or is he defending Apple's marketing strategy of 'buy our kit and be protected'? If he were more forthcoming in protecting his customers from some of his own company's policies, I'd be more generous.

  22. GarethWright.com

    Sod the phone - reset the account password and get the data from iCloud

    ...simple really.

  23. Mattjimf

    Can they not just request that access to the Apple account(s) be unlocked for them? From there they can unlock the phone via the Find My iPhone feature on the website.

    Full access to the phone without bricking it, Apple don't need to circumvent their own security, everyone wins.

  24. Nattrash

    "terrorists-to-do" list

    I understand what the feds are trying to do... And if I were them, I'd probably do the same. But, still one thing keeps nagging me...

    Law enforcement starts using finger print evidence - even kids now know to wear gloves.

    Law enforcement starts using DNA - baddies go beyond themselves to keep "their" DNA to themselves. Extensive cleaning, using acid, torching the place.

    Now, if I plan to do an act of terrorism, and I'm a Michael Palin, "Fish Called Wanda" terrorist (which I probably am), disregarding a strict communication SOP, (let's get Hollywood here) using "burn phones" or going "low tech", "old school"...

    ...I still will have a "terrorists-to-do" list: get gun, get ammunition, get fancy latex gloves, get pantyhose or ski-mask...

    ...and on that list there isn't...

    reset/ wipe phone?

    Maybe only G-men are so smart...

    1. Anonymous Coward
      Anonymous Coward

      Re: "terrorists-to-do" list

      Yet not all of them are able to clean away every trace; many of them leave plenty. Should we ban the use of fingerprints, DNA and so on (after all, DNA is a sort of encryption mechanism...)? Let's ban every kind of lawful search and seizure. Let's close down the law enforcement agencies - and get rid of the Law itself. Then I'd see an even bigger, heavier phone as useful, as a club to enforce my own law....

      1. Nattrash

        Re: "terrorists-to-do" list

        No, no,no... That's not what I'm saying...

        But what I do dispute is the fact that this is now (and frequently in other cases) presented as "critical". In a "if this isn't possible, the case is never going to be solved, and we all are going to die" kind of way.

        Maybe I'm trying to shine a light on the importance of old fashioned investigative leg-work, and on the side also point the finger at something that looks like "let's use this as an argument to get what we want, because we can print the word terrorist again in bold capitals"...

  25. Robert E A Harvey

    can they?

    I see no reason to suppose that Apple /can/ do this. Who knows how they wrote their encryption?

  26. mcartoscelli

    Apple has responded

    http://www.apple.com/customer-letter/

    The fact that the FBI are trying to leverage a very old statute is what makes this scary. It opens the door for them to get access to any device they want. It's not a case of "this will be used on only this guy, trust us!" It's "build a master key to unlock any phone so we can use it when we like" - and there's no mention of needing an actual search warrant to do it, either.

  27. Anonymous Coward
    Anonymous Coward

    Technically I'd say it could be done.

    There are virtual machines for operating systems on PCs and servers - are there none for phones?

    Copy the entire phone into a virtual machine, which should be possible without even powering it up, using diagnostic hardware.

    Then brute-force the VM phone; when it locks, make a new VM from the backup and continue with the brute force; repeat until the VM is unlocked.

    Or does Apple not have a virtual environment for testing new phones?

    Whether it should be done is another question
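    If such an emulator image existed, the rewind-after-every-attempt loop might look something like the sketch below. Big caveats: the image name and the try_pin step are hypothetical, qemu-img snapshots are only used to illustrate the rollback, and, as other posters point out, the Secure Enclave keeps the key material out of the flash image, so this wouldn't actually work against an iPhone.

    import subprocess

    IMAGE = "phone-flash.qcow2"        # hypothetical disk image of the device

    # Take a pristine snapshot once, then roll back to it after every failure
    subprocess.run(["qemu-img", "snapshot", "-c", "pristine", IMAGE], check=True)

    def try_pin(pin: str) -> bool:
        """Boot the emulated device, attempt one PIN, report success (hypothetical)."""
        ...

    for candidate in (f"{n:04d}" for n in range(10_000)):
        if try_pin(candidate):
            print("PIN found:", candidate)
            break
        # Revert the image so the failed-attempt counter never advances
        subprocess.run(["qemu-img", "snapshot", "-a", "pristine", IMAGE], check=True)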

  28. Anonymous Coward
    Anonymous Coward

    Equivalent of Wallace and Gromit penguin train track scene / Test Rig.

    This really relies on where the code for those 10 password attempts is stored - importantly, where the compare-attempts-left-with-zero and branch (CMP #0 / ARM branch) instructions are executed, causing the wipe routine to run.

    Trouble is, often this code is decoded 'in-line': the code before acts as the decryption values for the code ahead, linked to interrupt timing routines based on the number of machine cycles needed to execute a particular instruction. Modifying anything acts as a tamper switch, altering the processing timings of the code and rendering the code ahead useless. The code ahead also often deletes/scrambles the code behind.

    Simply, its the equivalent of the Wallace and Gromit penguin scene, where Gromit lifts a piece of the rail track from behind to lay out in front of the train (he's sitting on), so the train (code in this case) continues to run.

    It sounds like a hardware test rig might be possible to image the data off the device 'in-situ' then copy this frozen data back each time a password is attempted to the hardware rig assembly, to allow multiple tries, but I don't see how Apple is under any obligation to offer a hardware test-rig to help decode its own product, or likely to co-operate.

    I'd probably think laterally on this one and 'ask' Samsung to do it. I'm sure Samsung has a test-rig somewhere where they have reverse engineered every aspect of the iPhone and its code.

    1. Nattrash

      Re: Equivalent of Wallace and Gromit penguin train track scene / Test Rig.

      Cheese...

  29. Bruce Ordway

    Our tax dollars at work

    Why should Apple need to get involved in this one?

    Why not let the FBI figure it out themselves?

    I know I had an initial, emotional reaction to their citing of the San Bernardino case. My cynical side tells me this was a ploy to gain sympathy. My brain kicked back in & I remembered reading about the suspects. They sounded deliberate, careful and secretive. Not the types to have done something incriminating on a work phone.

    Does the FBI really need this capability now? How do they apply it going forward?

  30. Anonymous Coward
    Anonymous Coward

    Am I missing something?

    Why not just clone the phone and brute-force it all day long inside an emulator like QEMU?

  31. Tom 13

    "It’s technically possible for Apple to hack a device’s PIN, wipe, and other functions. Question is can they be legally forced to hack," said iOS security expert Jonathan Ździarski.

    "Theory: either NSA/CIA dragnet and cryptanalysis capabilities are severely limited, or this is a test case to see how the courts respond."

    For a security expert this guy is really stupid.

    By and large the opinion of the security community has been that unless the device is truly secure - as in, Apple can't hack the device - it isn't secure. If Apple can figure it out, so can somebody else, in particular state-sponsored groups, but possibly including large, or at least wealthy, criminal enterprises. So Apple set out to meet those requirements, and thus far their defense has been precisely that even THEY can't hack the phones.

    As to the second part, there is no need to test the courts. Apple is not charged in the crime, nor are they married to any of its perpetrators. Therefore once the Judge signs the warrant from the FBI, Apple MUST supply the evidence demanded if they are able to. In fact, what the FBI has done negates the usual criticism of privacy advocates that the police are attempting to circumvent established legal procedures.

    That being said, I have to wonder why the FBI are so focused on the phone. If the perps had an Apple account and were backing up the phone using that account, it certainly is within Apple's ability to change the password on the account which would enable the FBI to download the data to an unencrypted device.
