Apple Deluged By Police Demands To Decrypt iPhones

Posted by Soulskill
from the atf-struggles-with-slide-to-unlock dept.
New submitter ukemike points out an article at CNET reporting on how there's a "waiting list" for Apple to decrypt iPhones seized by various law enforcement agencies. This suggests two important issues: first, that Apple is apparently both capable of and willing to help with these requests, and second, that there are too many of them for the company to process as they come in. From the article: "Court documents show that federal agents were so stymied by the encrypted iPhone 4S of a Kentucky man accused of distributing crack cocaine that they turned to Apple for decryption help last year. An agent at the ATF, the federal Bureau of Alcohol, Tobacco, Firearms and Explosives, 'contacted Apple to obtain assistance in unlocking the device,' U.S. District Judge Karen Caldwell wrote in a recent opinion. But, she wrote, the ATF was 'placed on a waiting list by the company.' A search warrant affidavit prepared by ATF agent Rob Maynard says that, for nearly three months last summer, he 'attempted to locate a local, state, or federal law enforcement agency with the forensic capabilities to unlock' an iPhone 4S. But after each police agency responded by saying they 'did not have the forensic capability,' Maynard resorted to asking Cupertino. Because the waiting list had grown so long, there would be at least a 7-week delay, Maynard says he was told by Joann Chang, a legal specialist in Apple's litigation group. It's unclear how long the process took, but it appears to have been at least four months."
This discussion has been archived. No new comments can be posted.

  • Re:iPhones Encrypted (Score:3, Informative)

    by Anonymous Coward on Saturday May 11, 2013 @10:22PM (#43699701)

    Since the 4. The flash is encrypted with a device key. Remote wipe simply cycles the key.

    Previously parts were encrypted, but not all.

  • by jtownatpunk.net (245670) on Saturday May 11, 2013 @10:25PM (#43699723)

    The summary talks about decrypting the data on the phones. The articles talk about getting past the lock screen on the phones. Those are two entirely different things. On my phone, I have to first enter the decryption code before I'm presented with the lock screen.

  • by Sycraft-fu (314770) on Saturday May 11, 2013 @10:30PM (#43699769)

Most phones aren't encrypted, and usually the company can bypass the lock. For example, with Android phones tied to a Gmail account, Google can bypass the lock screen; if you forget your password, that is a recovery mechanism. Data can also be accessed if you physically remove the flash chip from the phone and put it in another reader. Lock screens are protection against most kinds of attacks, not high-level security. Most people don't need high-level security, though, so they work well.

You can also encrypt your phone. Well, I presume you can encrypt iPhones; not having owned one, I don't know. You can encrypt Blackberries and Androids. There you set a key and it does basically a full-disk encryption type of thing. You have to enter the key to access the device at all (whereas lock-screen lockouts will still allow some things to happen) and there is no recovery. If you forget the password, you're boned: flash the device and start over. Few people do that because it is not pushed and is inconvenient.

It is also more security than is generally useful. Most people are worried about someone running up a phone bill, or getting at their account information, if their phone is stolen. A lock screen stops that. Device encryption is needed only against more serious threats, hence most don't use it.

  • by Anonymous Coward on Saturday May 11, 2013 @10:33PM (#43699785)

    Now you know and knowing is half the battle. Don't buy iPhone.

  • by Verteiron (224042) on Sunday May 12, 2013 @12:21AM (#43700175) Homepage

Brute-forcing an iPhone's lock code is relatively trivial with freely available tools [google.com]. This puts the device in DFU mode, so "Erase device on X unlock attempts" doesn't take effect. That version of the tools only brute-forces lock codes, but there's no theoretical reason you couldn't try at least a dictionary attack on a password, too. Since it's also possible to dump the hardware key and a complete (encrypted) image, I imagine an offline attack on the image is possible as well. You wouldn't have to rely on the relatively slow hardware in the iPhone.

    Using those tools I have successfully brute-forced the 4-digit lock code of an iDevice running 6.0.2, and that's with no prior experience with or knowledge of iOS. I even used an emulated Mac to compile the necessary firmware patch. And that's just what I was able to do with a few hours of fiddling. There are people who do this for a living, and tools dedicated specifically to extracting data from mobile devices. Are these PDs really saying they can't get into devices with simple lock codes?
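    The offline attack described above can be sketched in a few lines. This is a toy illustration with made-up KDF parameters (it is not the actual iOS key-derivation scheme, which is entangled with a hardware key): once the derived value can be checked off-device, all 10,000 four-digit codes fall quickly, and the "erase after 10 attempts" policy never triggers.

    ```python
    import hashlib
    from itertools import product

    def derive(pin: str, salt: bytes) -> bytes:
        # Stand-in for the device's passcode-derivation step.
        # Hypothetical parameters; real schemes are slower and
        # tied to hardware so this can't run off-device.
        return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 1_000)

    def brute_force_pin(target: bytes, salt: bytes) -> str | None:
        # Try all 10,000 four-digit codes. An offline attacker is
        # limited only by KDF cost, not by the phone's wipe policy.
        for digits in product("0123456789", repeat=4):
            pin = "".join(digits)
            if derive(pin, salt) == target:
                return pin
        return None

    salt = b"\x00" * 16
    assert brute_force_pin(derive("4821", salt), salt) == "4821"
    ```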

  • by SeaFox (739806) on Sunday May 12, 2013 @01:50AM (#43700493)

    i see this story as being a GOOD thing, generally speaking. the feds are stumped by my iphone. now the only people we need to cockblock are in cupertino...

    No, I'd say this is a bad thing. A back log of getting these requests fulfilled will only be used as justification for there to be a regular law-enforcement back door built into a later version of iOS. "This process is taking too long and Apple is being burdened with fulfilling these requests, if only we had a way of accessing an iPhone ourselves without needing their assistance it would make things easier for all parties when investigating terrorism and child pornography..."

  • by gd2shoe (747932) on Sunday May 12, 2013 @03:22AM (#43700753) Journal
    The summary implies that it did only take a couple of minutes... after months of sitting on a shelf while Apple dealt with the backlog of other phones needing to be unlocked by law enforcement.
  • by AmiMoJo (196126) * <[ten.3dlrow] [ta] [ojom]> on Sunday May 12, 2013 @03:31AM (#43700797) Homepage

    The iPhone is FIPS 140-2 certified.

  • by jimicus (737525) on Sunday May 12, 2013 @04:39AM (#43700985)

    Doesn't need to be a back door - forensics products to crack phones already exist:

    http://www.msab.com/app-data/downloads/Release_Notes_(English)/XRY_release_notes_6.5_EN.pdf [msab.com]

  • by kasperd (592156) on Sunday May 12, 2013 @04:58AM (#43701055) Homepage Journal

    Apple claims that it uses AES with a 128 bit key, so if they can unlock it that quickly they MUST have a backdoor to the encryption key.

    The input provided by the legitimate user for decrypting the content has far less than 128 bits of entropy, so they just need to brute force that input. What Apple can do, which the forensics people might not know how to do, is extract the encrypted data and put it on a computer, where brute forcing can happen without each input having to be entered through a touch screen. Any security one might think this adds is nothing but security through obscurity. Real security for the encryption could only be achieved by the user entering some sort of password with sufficient entropy. A 39-digit PIN would be sufficient to make AES the weakest point. But would anybody use a 39-digit PIN on their phone? Anything less would make the PIN easier to brute force than AES.

    You can shift the balance a bit by iterating the calculation that produces a key from the PIN. A million iterations would probably be acceptable from a user-experience perspective, but that would only reduce the required number of digits from 39 to 33. A billion iterations would not be good for the user experience, since the user now has to wait quite some time after entering the PIN. And with the PIN still needing to be 30 digits long, they'll often need to re-enter it multiple times before they get it right.
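    The digit counts in that argument are easy to check. Each random decimal digit contributes log2(10) ≈ 3.32 bits of entropy, and each factor-of-N increase in KDF iterations buys the defender log2(N) effective bits. A sketch, assuming a uniformly random PIN:

    ```python
    import math

    AES_BITS = 128
    BITS_PER_DIGIT = math.log2(10)  # ~3.32 bits per random decimal digit

    def digits_needed(target_bits: float, kdf_iterations: int = 1) -> int:
        # Iterating the KDF multiplies the attacker's per-guess work,
        # which is equivalent to log2(iterations) extra bits of entropy.
        effective = target_bits - math.log2(kdf_iterations)
        return math.ceil(effective / BITS_PER_DIGIT)

    print(digits_needed(AES_BITS))         # 39 digits, no key stretching
    print(digits_needed(AES_BITS, 10**6))  # 33 digits, a million iterations
    print(digits_needed(AES_BITS, 10**9))  # 30 digits, a billion iterations
    ```

    The three results (39, 33, 30) match the figures in the comment above.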

  • by sribe (304414) on Sunday May 12, 2013 @07:55AM (#43701531)

    No, this is overall a bad thing: Apple is able and willing to break the encryption on an iPhone, presumably through a backdoor or brute force.

    Brute force. 10 failed attempts at the lock screen results in the phone being wiped. But Apple can copy out the encrypted contents, and then keep guessing until they find the code, no matter how many tries.

    Then again, we could all be mistakenly conflating "encryption" with "lock screen", which really speaks to the level of (in)competence on the part of law enforcement.

    On the iPhone, same thing--when you set up the lock screen, it sets up a random key which is used to encrypt/decrypt data in-flight to the flash, so that nothing is stored decrypted. The passcode is used to de-scramble the key, which is stored in a special location...

  • by sribe (304414) on Sunday May 12, 2013 @08:01AM (#43701563)

    Now you know and knowing is half the battle. Don't buy iPhone.

    Right, because, as the article points out:

    Google takes a more privacy-protective approach: it "resets the password and further provides the reset password to law enforcement," the materials say, which has the side effect of notifying the user that his or her cell phone has been compromised.

    Oh, good for Google! Wait, why doesn't Apple just reset the password and provide the new password to law enforcement? Oh, yeah, right, better security: they can't just reset the password. And boy, how much better it is for the suspect's privacy that Google notifies him. Let's see, he's been arrested, his phone seized, a warrant obtained to examine its contents. I'm sure he'd be so much more relieved if he were to get email from Apple when his passcode is cracked, because by god that is so important to his privacy!

  • by Cwix (1671282) on Sunday May 12, 2013 @10:44AM (#43702215)

    https://code.google.com/p/cryptonite/ [google.com]

    This looks like it could help.

  • by GoogleShill (2732413) on Sunday May 12, 2013 @11:34AM (#43702481)

    There is no copying of data. The data is /always/ encrypted on the device, it's the encryption key that is password protected.

    It's actually very simple. When the device is initially set up, a symmetric key is generated and all the user data is encrypted using that key. When you set a lock screen password, the encryption key is then encrypted using the password and stored in flash. Unlocking the device with the valid password decrypts the key into RAM so that the user data can be decrypted. Locking the device removes the decrypted key from memory, thus leaving all of the data in flash in a secure state.

    If the device is configured to self-erase after too many failed password attempts, the device simply deletes the encryption key from flash and the device is effectively wiped.
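    The key-wrapping design described above can be sketched as follows. This is a toy model using PBKDF2 and XOR in place of the hardware-entangled AES key wrap a real device uses, but the structure is the same: the data key never changes, only the passcode-encrypted copy of it stored in flash, and "wiping" the device just means destroying that copy.

    ```python
    import hashlib
    import os

    def wrap_key(data_key: bytes, passcode: str, salt: bytes) -> bytes:
        # Derive a key-encryption key from the passcode and use it to
        # "encrypt" the data key. XOR with a one-shot derived key is a
        # toy stand-in for a real key-wrap construction.
        kek = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)
        return bytes(a ^ b for a, b in zip(data_key, kek))

    def unwrap_key(wrapped: bytes, passcode: str, salt: bytes) -> bytes:
        kek = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)
        return bytes(a ^ b for a, b in zip(wrapped, kek))

    salt = os.urandom(16)
    data_key = os.urandom(32)  # random key that encrypts all user data
    wrapped = wrap_key(data_key, "1234", salt)  # only this copy hits flash

    assert unwrap_key(wrapped, "1234", salt) == data_key  # correct passcode
    assert unwrap_key(wrapped, "0000", salt) != data_key  # wrong passcode
    # "Self-erase" = delete `wrapped`; the data key, and with it all the
    # user data, is then unrecoverable without brute-forcing the KDF.
    ```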
