Apple Can Extract Texts, Photos, Contacts From Locked iPhones 202

Posted by timothy
from the as-a-public-service dept.
Trailrunner7 (1100399) writes "If law enforcement gets hold of your locked iPhone and has some interest in its contents, Apple can pull all kinds of content from the device, including texts, contacts, photos and videos, call history, and audio recordings. In a new document providing guidance to law enforcement agencies on the kinds of information Apple can provide and the methods for obtaining it, the company said that if served with a search warrant, it will help law enforcement agents extract specific application data from a locked iOS device. However, that data appears to be limited to information related to Apple apps, such as iMessage, Contacts, and the Camera. Email contents and calendar data can't be extracted, the company said in the guidelines."
  • by Number42 (3443229) on Thursday May 08, 2014 @11:20AM (#46950261)
    TFA says that the data can only be accessed at the company HQ, so no, it seems that they are referring to local data that is unencrypted. It also states that they can access some data in iCloud, too.
  • by Sockatume (732728) on Thursday May 08, 2014 @11:22AM (#46950299)

    Apparently not. It sounds like they're limited to whatever applications are currently running, though:

    Upon receipt of a valid search warrant, Apple can extract certain categories of active data from passcode locked iOS devices. Specifically, the user generated active files on an iOS device that are contained in Apple’s native apps and for which the data is not encrypted using the passcode (“user generated active files”), can be extracted and provided to law enforcement on external media. Apple can perform this data extraction process on iOS devices running iOS 4 or more recent versions of iOS. Please note the only categories of user generated active files that can be provided to law enforcement, pursuant to a valid search warrant, are: SMS, photos, videos, contacts, audio recording, and call history. Apple cannot provide: email, calendar entries, or any third-party App data.

  • iMessage? (Score:4, Informative)

    by kurowski (11243) on Thursday May 08, 2014 @11:33AM (#46950427) Homepage

    "iMessage" is a message transport. The app is "Messages". The document from Apple specifically says "SMS": it does not mention either Messages or iMessage. While it's possible that Apple leaves iMessages unencrypted on the device, it would be surprising given the trouble they go to protecting them in transit. So while this document doesn't explicitly say iMessages are safe, it also doesn't say they're vulnerable.

  • The actual article (Score:5, Informative)

    by rabtech (223758) on Thursday May 08, 2014 @11:40AM (#46950497) Homepage

    Hey, let's link to the actual document in question! What a novel concept!

    http://www.apple.com/legal/mor... [apple.com]

    Good news:

    - Apple cannot track a phone via GPS, nor forcibly enable Find My Friends/Find my iPhone

    - Apple cannot monitor FaceTime or iMessage conversations since they are end-to-end encrypted

    - Apple cannot provide third-party app data that is encrypted since the files are encrypted with the user's passcode.

    - It appears if the user does a remote wipe before law enforcement can get a warrant and ship the phone to Apple (or fly it there), then there is nothing that can be done. I wonder if they power up the device in an anechoic chamber so it can't receive the remote wipe signal? I would guess no because most people aren't smart enough to do an immediate wipe.

    - We already knew the only trick they have as far as encrypted files goes is a custom firmware that bypasses the max attempt auto-erase and rate limit feature, so it can attempt to brute-force passcodes quickly. However it requires the attempt be made on-device, since the keys are stored in the secure storage with no facility to get them off-device. So even a moderately complex passcode is effectively unbreakable, let alone a good strong password.
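    To put rough numbers on why on-device rate limits make even a modest passcode painful to brute-force: Apple's iOS Security whitepaper says the passcode key derivation is calibrated to take roughly 80 ms per attempt on the device. The sketch below uses that published figure (not a measured value) to estimate worst-case search times.

```python
# Rough estimate of on-device passcode brute-force time.
# PER_ATTEMPT_S is the ~80 ms per-guess figure from Apple's iOS Security
# whitepaper (an assumption taken from that document, not measured here).

PER_ATTEMPT_S = 0.080

def worst_case_hours(alphabet_size: int, length: int) -> float:
    """Hours to try every passcode drawn from the given alphabet/length."""
    attempts = alphabet_size ** length
    return attempts * PER_ATTEMPT_S / 3600

four_digit = worst_case_hours(10, 4)   # simple 4-digit PIN
six_alnum = worst_case_hours(36, 6)    # 6 chars, lowercase letters + digits

print(f"4-digit PIN, worst case: {four_digit:.2f} hours")
print(f"6-char alphanumeric, worst case: {six_alnum / 24:.0f} days")
```

    A 4-digit PIN falls in well under an hour, but six mixed characters already pushes the worst case into years of device time, which matches the comment's point that a moderately complex passcode is effectively out of reach.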

    Questionable:

    - user generated active files (this is what SMS/call logs/photos/etc are listed under). I was under the impression that after a device is powered off and rebooted, these things were not available, because the files are encrypted. It seems that iMessage at least is encrypted here, but I'd be curious what the actual situation is. Everything except photos, videos, and recordings is a moot point, because you can get things like SMS history and call logs from the carrier anyway; those are the only ones I'd be concerned about.

    There are some definite good points here - Apple has chosen not to build themselves backdoors or workarounds, presumably because they can't be ordered to disclose information they don't have access to... same reason they built iMessage the way they did. A court would have to order them to refactor their software before it could order them to intercept messages, and at least in the US there is no precedent or law that can compel them to do so.

    However I would expect the "user generated active files" to be encrypted after a device reboot until the passcode is entered. If that is not the case, Apple should fix it pronto.

    I would also expect Apple to refactor the storage of those things to be segmented, given the NSA revelations and increasingly authoritarian behavior of law enforcement; for example, photos pending background upload could be kept unencrypted, but once uploaded they should be rewritten as encrypted so they require the passcode to access. They already have the ephemeral key tech and per-file key support so you can generate a key for the unencrypted file while the device is unlocked, then toss the passcode key when the device locks and only hold onto the file key until the upload is finished, then toss it. Thus no risk to the main key but you can still encrypt the file in the background.
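    The key lifecycle the comment proposes can be sketched in a few lines. This is a toy illustration of the idea, not Apple's actual implementation: a random per-file key encrypts the photo, that file key is wrapped under a passcode-derived class key, and each key is discarded at the point the comment describes. An HMAC-SHA256 counter keystream stands in for real AES (Python's stdlib has none), so treat the crypto as a placeholder.

```python
# Toy sketch of the proposed scheme: file key encrypts the photo; the
# passcode-derived class key wraps the file key; on lock we forget the
# class key but keep the bare file key until the upload finishes.
import os, hmac, hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """XOR data with an HMAC-SHA256 counter keystream (AES stand-in)."""
    out = bytearray()
    for block in range((len(data) + 31) // 32):
        pad = hmac.new(key, block.to_bytes(8, "big"), hashlib.sha256).digest()
        chunk = data[block * 32:(block + 1) * 32]
        out.extend(b ^ p for b, p in zip(chunk, pad))
    return bytes(out)

# Device unlocked: derive the class key from the passcode, create and
# wrap a fresh per-file key, and encrypt the pending photo under it.
class_key = hashlib.pbkdf2_hmac("sha256", b"passcode", b"salt", 10_000)
file_key = os.urandom(32)
wrapped_file_key = keystream_xor(class_key, file_key)
ciphertext = keystream_xor(file_key, b"photo bytes pending upload")

# Device locks mid-upload: forget the passcode-derived key, but keep the
# bare file key in memory just long enough to finish the background upload.
class_key = None
upload_buffer = keystream_xor(file_key, ciphertext)  # decrypt for upload

# Upload done: forget the file key too. From here on, recovering the
# plaintext requires unwrapping the file key, which requires the passcode.
file_key = None
```

    The point of the design is exactly what the comment says: the long-lived passcode key is never held while locked, and the short-lived file key exposes only the one in-flight file.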

    I won't bother discussing Android phones - they are almost all trivial to break and access all the user's data, when people like Samsung aren't coding back doors directly into the firmware.

  • Mod parent up (Score:4, Informative)

    by OneAhead (1495535) on Thursday May 08, 2014 @11:51AM (#46950609)
    The AC nailed it; this is an utter non-story. Last time I checked, locking an iPhone does not enable full-disk encryption. Raise your hand if you thought the iPhone contains some magical Steve Jobs fart that would prevent someone with hardware access (let alone Apple with hardware access!) from ripping the unencrypted data (which, in a default setup, is essentially everything except [luxsci.com] your e-mail [zdziarski.com]) from the flash chips. And yes, hardware access is necessary even if it isn't explicitly stated in the summary. Anyhow, those who did raise their hands earlier, please hand in your geek card and don't let the door hit you in the ass on the way out.
  • Maybe not anyone (Score:3, Informative)

    by Anonymous Coward on Thursday May 08, 2014 @12:09PM (#46950837)

    At least it's not a trivial task. Per the iOS Security white paper:

    "The device’s unique ID (UID) and a device group ID (GID) are AES 256-bit keys fused into the application processor during manufacturing. No software or firmware can read them directly; they can see only the results of encryption or decryption operations performed using them. The UID is unique to each device and is not recorded by Apple or any of its suppliers. The GID is common to all processors in a class of devices (for example, all devices using the Apple A5 chip), and is used as an additional level of protection when delivering system software during installation and restore. Burning these keys into the silicon prevents them from being tampered with or bypassed, and guarantees that they can be accessed only by the AES engine."

    Hence, needing some specialized equipment, ergo, ship to 1 Infinite Loop to get the data.
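    The quoted design is also why passcode guessing can't be moved off-device: the passcode key is derived by tangling the passcode with the UID fused into the silicon, and the UID never leaves the AES engine. The sketch below illustrates that dependency; PBKDF2 is a stand-in for Apple's unpublished, AES-engine-based tangling function, and DEVICE_UID is a made-up constant standing in for the unreadable fused key.

```python
# Why brute-forcing must happen on-device: the derivation takes the
# fused UID as an input. PBKDF2 and DEVICE_UID are illustrative
# stand-ins, not Apple's actual derivation or a real UID value.
import hashlib

DEVICE_UID = bytes.fromhex("00" * 32)  # hypothetical; real UID is unreadable

def passcode_key(passcode: str) -> bytes:
    # Without the UID, an attacker who dumped the flash chips still
    # cannot run this derivation off-device to test passcode guesses.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000)
```

    An image of the flash alone is therefore useless for passcode-protected files; every guess has to round-trip through the one chip that holds the UID.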

  • by swb (14022) on Thursday May 08, 2014 @12:11PM (#46950861)

    Look at the source code and see.

    Even if I had the source code, it wouldn't do me personally any good as I couldn't grok what it did just from reading it. It would do me as much good as it did 99.99% of OpenSSL users.

    Gag letters limit what they can say; they don't require them to make false statements of fact. You might argue that they could be strong-armed through some extralegal method into making false statements of fact to engender false confidence in potential targets of spying, but that's getting a little into tinfoil hat territory.

    In fact, I think an Apple statement of what little they can extract is pretty good and serves as a kind of interesting statement on what they believe is recoverable. It doesn't include third-party techniques or equipment that you might find in an NSA laboratory, but I don't know that Apple makes that kind of penetration test of their own devices.

  • by kthreadd (1558445) on Thursday May 08, 2014 @12:23PM (#46950979)

    https://support.apple.com/kb/h... [apple.com]

    If passcode-protected whole-phone encryption is enabled, no one should be able to access that without the key. I guess they know more about how it works than I do. They've even redefined encryption. It's "encrypted" just like everything else these days. I guess it's still technically encrypted even if everyone has a key.

    Not everything is encrypted. According to the guidelines:

    Specifically, the user generated active files on an iOS device that are contained in Apple’s native apps and for which the data is not encrypted using the passcode (“user generated active files”), can be extracted and provided to law enforcement on external media.

    So, data can only be extracted if it is not encrypted. Sounds reasonable. Of course it would be better if everything was encrypted.

  • by Anonymous Coward on Thursday May 08, 2014 @12:29PM (#46951039)

    Blackberry... wasn't that the company that sends all your mail and everything you ever communicate through their servers?

    You don't understand how blackberries work.

    Yes, they send your data through their servers, in the same way that your data goes through your cell phone company.

    BUT, with a blackberry enterprise server, Blackberry does NOT have the decryption keys. That is the relevant point - even if Blackberry wants to hand over information to law enforcement, Blackberry isn't able to decrypt the data.

    Blackberries were designed by intelligent people who understand security.

  • by MachineShedFred (621896) on Thursday May 08, 2014 @02:22PM (#46952425) Journal

    They don't supply shit to law enforcement - their policy [apple.com] says that the device has to be shipped to Cupertino in good working order, where they will do the data extraction only with a proper search warrant or court order. The data is then provided on external media:

    Specifically, the user generated active files on an iOS device that are contained in Apple’s native apps and for which the data is not encrypted using the passcode (“user generated active files”), can be extracted and provided to law enforcement on external media. Apple can perform this data extraction process on iOS devices running iOS 4 or more recent versions of iOS. Please note the only categories of user generated active files that can be provided to law enforcement, pursuant to a valid search warrant, are: SMS, photos, videos, contacts, audio recording, and call history. Apple cannot provide: email, calendar entries, or any third-party App data.

    See section I of the linked document, entitled "Extracting Data from Passcode Locked iOS Devices".
