Details of iOS and Android Device Encryption 146
swillden writes: There's been a lot of discussion of what, exactly, is meant by the Apple announcement about iOS8 device encryption, and the subsequent announcement by Google that Android L will enable encryption by default. Two security researchers tackled these questions in blog posts:
Matthew Green tackled iOS encryption, concluding that the change really boils down to applying the existing iOS encryption methods to more data. He also reviews the iOS approach, which uses Apple's "Secure Enclave" chip as the basis for the encryption and guesses at how it is that Apple can say it's unable to decrypt the devices. He concludes, with some clarification from a commenter, that Apple really can't (unless you use a weak password which can be brute-forced, and even then it's hard).
Nikolay Elenkov looks into the preview release of Android "L." He finds that not only has Google turned encryption on by default, but appears to have incorporated hardware-based security as well, to make it impossible (or at least much more difficult) to perform brute force password searches off-device.
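Green's conclusion that even a weak passcode is "hard" to brute-force rests on key stretching: the passcode is fed through a deliberately slow derivation function and, in Apple's design, tangled with a device-unique secret inside the Secure Enclave. A rough sketch of the stretching step only, with illustrative parameters rather than Apple's actual ones:

```python
import hashlib
import time

def derive_key(passcode: str, salt: bytes, iterations: int = 200_000) -> bytes:
    """Stretch a short passcode into a 256-bit key with PBKDF2.

    Apple additionally mixes in a device-unique key that never leaves
    the Secure Enclave, which forces guessing to run on the device;
    this sketch shows only the per-guess time cost of stretching.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)

salt = b"per-device-salt"  # illustrative value, not Apple's scheme
start = time.perf_counter()
key = derive_key("1234", salt)
per_guess = time.perf_counter() - start

# A 4-digit PIN has only 10,000 possibilities, so exhausting the
# whole space costs roughly 10,000 * per_guess seconds.
full_space_minutes = per_guess * 10_000 / 60
```

Combined with on-device rate limiting, even that small search space becomes slow to traverse; a long random passphrase makes it infeasible.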
So what you're telling me (Score:3)
Is that the NSA still has their backdoor.
Re:So what you're telling me (Score:5, Insightful)
What did you believe? Who's fool enough to believe three-letter agencies will let Apple, Google or Microsoft decide on their own? Not at this scale. Agencies won't mind if a tech entrepreneur tries to get his share of the pie, but he will forever be insignificant. Once he becomes big enough, he'll join the group like everybody else, just to stay in business, because the feds will certainly have found some non-compliance in a dusty law book.
This whole thing is just a marketing show to regain the average customer's not-so-lost trust...
Re: (Score:3, Interesting)
Devil's advocate (and I could be wrong on this count): part of the TLA groups' function is not just spying, but the other element, preventing the country's interests from getting snooped. This is why NIST has guidelines [1] as well as info on the latest exploits that are around.
With the possibility that the other guys may have just the same backdoors as the NSA does, it makes upping the ante logical. Better nobody be able to read info than just your enemies. Again, this is pure conjecture, but the NSA/NIST
Re: (Score:3)
Actually, the chips are designed in the US (Apple has acquired numerous ASIC designers and companies) and fabbed in either
Re: (Score:2)
Who's fool enough to believe three-letter agencies will let Apple, Google or Microsoft decide on their own?
As one of the guys who builds this stuff at Google... I am. You can choose what you believe, of course, but keep in mind that excessive cynicism can be just as effective as rose-colored glasses at misleading.
Re: (Score:2)
What I find amazing is that in the many years that mobile devices have been commonplace, no one has yet actually produced evidence that they're being used for nefarious purposes. Just lots of claims and bullshit.
Yes, yes, there's PRISM and god knows what else. But show me a case where the NSA or the CIA or the FBI used a built-in backdoor in an off-the-shelf product.
Re: (Score:2)
Oh, it's a Samsung. Well, that concern is negated then.
Re: (Score:2)
Possibly, or it may be that Google and Apple are trying to mitigate the blow-back. I remember reading a lot of the Yahoo stuff that got declassified showed that they tried hard to oppose the directive they were given; not that it mattered because we found out later that the NSA tapped their fibre backbones anyway.
I have a feeling that Google/Apple want to go down this route because it will mean that they technologically can't comply with certain NSA letters. Of course, government agencies may already have t
Re: (Score:3)
Yeah, really. I cannot believe ANYONE would think that law enforcement is up in arms that devices are now "encrypted" and the kiddie porn will run rampant.
If you're familiar with public relations tactics, these articles are so damn blatantly obvious.
It's a good deal: Google and Apple, who already work with the government anyway, now get to act like their devices are safe, and the govt. can pretend to be stymied and that they'll never break a case again.
Seriously. Is there anyone stupid enough to believ
Re: (Score:3)
Where in the summary did you get that?
Disingenuous. Try reading TFA. Oh, wait, that's not allowed around here. I'll be off mucking through 20 lines of legacy Perl code as punishment.
Re:So what you're telling me (Score:4, Informative)
So again, where does this article say that the NSA still has a backdoor?
Re: (Score:2)
The NSA/GCHQ won't bother trying to decrypt the phone. They will simply sift through all your emails and text messages that they captured as they were sent unencrypted over the networks. I'm sure they have plenty of custom iOS and Android malware ready to infect you with too. Maybe they even intercept the phone before it gets to you if you order online, like they do with network gear or hard drives.
Re: (Score:2)
Most schemes that encipher data with multiple keys make it obvious upon examining the output ciphertext or encrypted session key blocks that the data has been enciphered with more than one key.
With symmetric key algorithms like AES, it’s not possible to encrypt the data with two keys and have it decryptable by only one of them. The exact same key must be used on both ends, and leaking that key would be obvious.
Symmetric ciphers are often used with a session key that’s wrapped by an asymmetric c
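The symmetric property the parent describes, that the exact same key must be used on both ends, can be shown with a toy stream cipher (illustrative only; real systems use AES, and this construction is not a secure cipher):

```python
import hashlib

def xor_stream(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR data against a SHA-256-derived
    keystream in counter mode. Encryption and decryption are the
    same operation, keyed by the same secret."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

msg = b"attack at dawn"
ct = xor_stream(b"key-A", msg)

# The defining symmetric property: only the identical key decrypts.
assert xor_stream(b"key-A", ct) == msg
assert xor_stream(b"key-B", ct) != msg
```

There is no way to construct a second key that also decrypts the ciphertext, which is the parent's point about why a hidden "second key" would be evident in the scheme's structure rather than the cipher itself.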
Re:So what you're telling me (Score:5, Interesting)
for me, and I think for a lot of /. readers, the biggest issue is comparing a solution that currently exists to vaporware that may exist in the future. The iOS solution is already implemented in the new OS which is on 50%+ of all iDevices after just one week. In contrast, Google's solution is promised for the next version of Android which will be released on TBD. This version will be used by new devices but likely trickle back to just a small percentage of old devices.
As a community we've always been skeptical of vaporware, especially when a lagging company announces vaporware in response to an innovator releasing a tangible product. Can we hold android to this standard?
Re:So what you're telling me (Score:5, Insightful)
Google's solution is promised for the next version of Android which will be released on TBD. This version will be used by new devices but likely trickle back to just a small percentage of old devices.
This is true, of course, but it should also be remembered (as Elenkov explains in detail) that Android's encryption features aren't new, they exist in hundreds of millions of devices already deployed. And, L isn't really "vaporware". If you want you can go download it now, though not in its final form. Elenkov's evaluation was based on real code running on his device.
I should make clear where I stand here: I'm an Android security engineer. I work on the hardware-backed crypto infrastructure (the infrastructure that Elenkov says is used in disk encryption for L).
If you'd like me to compare Android and iOS encryption, I'm glad to: Apple's is better. Android's, even without the upgrades in L, is strong enough in many contexts and it'll be even better after L, but still not as strong as iOS. One thing Android has done better is to encrypt the entire user data partition. With iOS8, Apple has (probably) addressed the issues it had there. I can't talk about future plans, but I will say that I'm not yet satisfied with Android's disk encryption. It's good, but it can and should be better. I'm not sure we can ever match Apple in this area, since Apple has the luxury of focusing on a single device and has complete control of the hardware. But we can get closer.
Can we hold android to this standard?
I would hope so. I do.
Re:So what you're telling me (Score:5, Interesting)
Thank you for your open response. A Q that you may have addressed elsewhere in the thread: How can Google put out a security solution that relies on a hardware feature, when it has minimal influence on the hardware designs? Does this mean that encryption will work *if* compatible hardware is provided, and if it's not provided the coverage is much weaker? Will the user be informed of this? On what percentage of devices is it expected to have the needed hardware?
kthx.
Re:So what you're telling me (Score:5, Informative)
I can't confirm that L is using hardware security for disk encryption. Elenkov concluded in his analysis that we are, but you'll have to read his words and decide whether you believe it.
However, I can definitely confirm that there is a hardware-backed crypto service in most of the better Android devices. It's called keymaster. Google creates the API and the code that uses it, and device makers have to implement it, or not. To see if your device has it, go to Settings->Security->Credential Storage->Storage type and see if it says "Hardware-backed".
Re: (Score:2)
Thank you.
I just checked my LG Nexus smart phone.
It has hardware-backed security.
Re: (Score:2)
Re: (Score:2)
I just checked my GalaxyS3, it says "software backed", so I guess it's not implemented on at least S3's.
I'd like to see people using hardware-backed credential storage as a factor when purchasing devices. It would actually take very little consumer pressure to make all devices provide it.
You can be certain that all current and future Nexus devices will.
Re:So what you're telling me (Score:5, Insightful)
I think it's worth noting something here about the Android implementations. Based on the articles and other published documentation, the Android "hardware backed" key stores are in fact not hardware backed at all, but rather based on the ARM chips' TrustZone technology. This creates a secure world inside the CPU which is isolated from the main operating system. The secure world can store data and do computation on the main CPU without being exposed to viruses or root-level access from Android itself.
But this comes with a huge caveat. This "secure world" is in fact just the same CPU running a program written in C. Such programs can of course have exploits, and in the past this is exactly what has happened: I believe some Motorola device was rooted this way, because the TrustZone-protected program had some kind of overflow bug and that was enough to take control of the secure space.
What's more, I think it's deeply uncertain how exposed programs running in this secure space are to side-channel attacks, e.g. via timing or cache-line games. People keep discovering clever ways to recover secret keys when running on the same physical CPU that shouldn't be possible according to the rules of the sandbox. And where does this secure program get its entropy from? A hardware RNG? Maybe, but as far as I can tell that's entirely up to the phone manufacturer, and in a competitive environment where everyone is trying to get costs down I suspect some manufacturers would choose to save money by skipping it. After all, bad randomness looks the same as good randomness.
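A concrete example of the side-channel class mentioned above is an early-exit comparison of a secret value: the time it takes leaks how long the matching prefix is. The standard countermeasure is a constant-time compare. A minimal sketch:

```python
import hmac

def leaky_compare(a: bytes, b: bytes) -> bool:
    # Returns at the first mismatching byte, so response timing
    # reveals how many leading bytes the attacker got right.
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

def constant_time_compare(a: bytes, b: bytes) -> bool:
    # hmac.compare_digest examines every byte regardless of where
    # the first mismatch occurs, defeating the timing oracle.
    return hmac.compare_digest(a, b)

secret = b"0123456789abcdef"
assert leaky_compare(secret, b"0123456789abcdef")
assert constant_time_compare(secret, b"0123456789abcdef")
assert not constant_time_compare(secret, b"0123456789abcdXX")
```

Cache-line attacks are subtler, but the lesson is the same: sharing silicon with untrusted code makes "isolated" execution leak in ways the sandbox rules don't capture.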
The Apple implementation, in contrast, appears to have the per-device key blown into the chip at manufacturing time, and then hard-wired to the AES circuitry. That is, it's actually hardware based and there are no chances for a "Verilog overflow" bug or equivalent breaking the security of the system.
Anyway, I'd like to give kudos to swillden here for taking part in the discussion and being honest about how his work on Android currently stacks up with Apple. That takes some bravery. Also, there's more to security than disk encryption. The Apple celebs drama wasn't caused by the NSA breaking disk encryption; it was a bunch of pimply 4-channers phishing or guessing account recovery details on the cloud service. Whilst Apple has historically been ahead of Android in on-device security, they have been behind Google on cloud account security, and this is in many ways just as important.
Re: (Score:1)
Slight tangent, but I'm going to go there anyway ;)
The Apple celebs drama wasn't caused by the NSA breaking disk encryption; it was a bunch of pimply 4-channers phishing or guessing account recovery details on the cloud service. Whilst Apple has historically been ahead of Android in on-device security, they have been behind Google on cloud account security, and this is in many ways just as important.
Don't forget that the tools they were using were law enforcement tools [wired.com] designed for this purpose. Anything that can be done to prevent backdoors / third parties holding on to master keys is a good thing in my opinion as encryption should be viewed as protecting one's privacy and not a criminal act [maroeste.com].
Re:So what you're telling me (Score:5, Informative)
Based on the articles and other published documentation, the Android "hardware backed" key stores are in fact not hardware backed at all, but rather based on the ARM chips TrustZone technology.
Yes, this is correct. The Android crypto HAL (keymaster) can be provided by any "secure" device, but at present I believe all of them use TrustZone, at least on ARM devices (Intel has something similar, but it's not the same).
But this comes with a huge caveat. This "secure world" is in fact just the same CPU running a program written in C. Such programs can of course have exploits.
Sure they can. The benefit, though, is that the secure world code can and should be dramatically smaller and simpler than the non-secure world, and therefore amenable to much deeper security auditing. This is actually no different from what Apple does with their security chip... which still runs software written by people and can have exploits. There are some security benefits to being on a truly separate CPU, but that doesn't change the fundamental fact that exploits are always possible.
What's more, I think it's deeply uncertain how exposed programs running in this secure space are to side channel attacks e.g. via timing or cache line games.
The same issue applies to separate CPUs, and the same countermeasures apply as well, though I'll absolutely grant that it's a harder problem on a virtual secure CPU.
And where does this secure program get its entropy from? A hardware RNG? Maybe, but as far as I can tell that's entirely up to the phone manufacturer, and in a competitive environment where everyone is trying to get costs down I suspect some manufacturers would choose to save money by skipping it.
All the devices I'm aware of provide a real hardware RNG. The cost is negligible, in fact it would probably cost more to remove it from the SoC designs than to leave it in. However, that still leaves open the question of how good the hardware RNG is. For the next generation of keymaster I'm trying to define some requirements around how it's used that should mitigate many possible weaknesses, and I'm also defining a mechanism that apps can use to inject some entropy of their own, in case they don't trust the RNG (assuming they have a place to get entropy). Note that injected entropy is required to be securely mixed into HW-generated entropy, so it should not be possible to inject data that actually decreases the available randomness, nor to manipulate the outputs, unless the attacker can also manipulate/predict the HW RNG.
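The "securely mixed" requirement described above can be met by running both entropy sources through a one-way function: an attacker who controls the injected input cannot choose it to cancel out or predict the hardware contribution. A sketch of the idea (not the actual keymaster mechanism):

```python
import hashlib
import os

def mix_entropy(hw_entropy: bytes, injected: bytes) -> bytes:
    """Combine hardware RNG output with caller-injected entropy.

    Because SHA-256 is one-way, an attacker who controls `injected`
    cannot pick a value that makes the output predictable without
    also knowing or controlling `hw_entropy`.
    """
    return hashlib.sha256(hw_entropy + injected).digest()

hw = os.urandom(32)          # stand-in for a hardware RNG sample
app = b"app-supplied seed"   # possibly attacker-controlled

seed = mix_entropy(hw, app)
# Changing either input changes the result completely, so injection
# can add entropy but never subtract it.
assert mix_entropy(hw, b"other") != seed
assert mix_entropy(os.urandom(32), app) != seed
```

A production design would use a proper extractor or DRBG reseed operation, but the invariant is the same: output unpredictability is at least that of the strongest input.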
The Apple implementation, in contrast, appears to have the per-device key blown into the chip at manufacturing time, and then hard-wired to the AES circuitry. That is, it's actually hardware based and there are no chances for a "VeriLog overflow" bug or equivalent breaking the security of the system.
TrustZone-based devices also have fused per-device keys which act as the root of trust. The devices that I'm familiar with also have a hardware AES coprocessor which can load and use these per-device keys but will not reveal the actual key bits, not even to secure world code. Secure world code can request operations be performed with the keys, but not see them. Non-secure world code can't do anything except make requests of the secure world code.
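The "use but never reveal" pattern described above can be sketched as an opaque key handle: callers request operations, but the raw key bits never cross the boundary. This is a hypothetical shape, not the real keymaster HAL, and it uses HMAC in place of the hardware AES engine:

```python
import hashlib
import hmac
import os

class OpaqueKey:
    """Models a fused per-device key: callers can request operations
    with it, but the key material never leaves the object."""

    def __init__(self) -> None:
        # Stand-in for the key blown into fuses at manufacturing time.
        self.__key = os.urandom(32)

    def sign(self, message: bytes) -> bytes:
        # Operation performed *with* the key, inside the boundary.
        return hmac.new(self.__key, message, hashlib.sha256).digest()

    def verify(self, message: bytes, tag: bytes) -> bool:
        return hmac.compare_digest(self.sign(message), tag)

device_key = OpaqueKey()
tag = device_key.sign(b"boot image hash")
assert device_key.verify(b"boot image hash", tag)
assert not device_key.verify(b"tampered image", tag)
```

The security argument is structural: even fully compromised calling code can only ask for operations, never export the key for offline use.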
Anyway, I'd like to give kudos to swillden here for taking part in the discussion and being honest about how his work on Android currently stacks up with Apple.
Thanks for the insightful response.
Also, there's more to security than disk encryption.
Vastly, vastly more. It's one very small piece of a large, complex and ever-shifting puzzle. With respect to mobile device security I think it's actually one of the less-important pieces, because the set of problems it solves is pretty narrow. It gets a lot of press and discussion because it's easy to understand.
Re: (Score:3)
I did not know this. That changes a lot - if even the TrustZon
Re: (Score:2)
I did not know this. That changes a lot - if even the TrustZone can't access the per device key directly then it would appear to give equivalent security (or actually better) to what Apple is doing.
I'd say it could give equivalent security, if it were applied in the right way. I'm not saying it is applied the right way to achieve that in L :-)
It would be nice to know which devices implement exactly what kind of security, but it seems everything is heading in the right direction, which is very good to hear.
It's not too hard to figure that out from looking at device logs with "adb logcat". I'm hoping to get some UI changes eventually to make it more clear without resorting to developer tools. And, yes, I completely agree that it's heading in the right direction -- up to and including a little competition with Apple to see who can lock their devices down the most tho
Re: (Score:2)
Not stupid at all.
The answer is: Hardly any performance impact. It's measurable when reading big files, but not noticeable, even without any hardware acceleration.
When you first encrypt your device, especially on older devices with larger storage, it can take a while. Sometimes up to an hour in really extreme cases. The long time isn't because the encryption is actually slow, though... it's I/O bound, not CPU bound. Reading and re-writing every byte of your storage takes a while. L does some clever thin
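The I/O-bound estimate above is easy to check with back-of-envelope arithmetic (illustrative throughput numbers, not measurements of any particular device):

```python
def fde_time_minutes(storage_gb: float, io_mb_per_s: float) -> float:
    """Time to read and re-write every byte of storage once,
    ignoring CPU cost entirely (the I/O-bound assumption)."""
    total_mb = storage_gb * 1024
    # Each byte is read once and written back once.
    return (2 * total_mb / io_mb_per_s) / 60

# e.g. a 32 GB device with ~20 MB/s effective eMMC throughput
t = fde_time_minutes(32, 20)   # on the order of an hour
```

At those assumed rates the full pass takes roughly 55 minutes, which matches the "up to an hour in extreme cases" figure without the cipher itself being the bottleneck.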
Re: (Score:2)
if you're designing the keymaster, who's designing the gatekeeper?
Re: (Score:2)
My older Asus tablet does not.
Re: (Score:2)
This is true, of course, but it should also be remembered (as Elenkov explains in detail) that Android's encryption features aren't new, they exist in hundreds of millions of devices already deployed.
I hope that Google plans to actually do better. Android does not support encryption unless you have a screen lock password turned on, and it does not support having the encryption key set to something other than the screen lock password.
Device encryption and screen locking are two different solutions to two different, but related, problems. It can make perfect sense to enable one without the other, and it makes almost no sense to use the same password for both.
The result is that most people don't bother e
Re: (Score:2)
I can understand implementing screen locking without device encryption; that's the state my phone currently is in, and it provides exactly the level of protection I require at this point in time - prevents casual snooping or misuse, but does not protect against a dedicated attacker.
Under what situations would device encryption be useful without a screen lock? Your phone data can be read by anyone who gets their hands on it, since the unencrypted data is exposed to anyone who swipes right...
I can't think of
Re: (Score:2)
Under what situations would device encryption be useful without a screen lock? Your phone data can be read by anyone who gets their hands on it, since the unencrypted data is exposed to anyone who swipes right...
Simple - you keep the phone on your person, and turn it off when you leave it in an unsafe place. You still want to turn off the screen so that your battery isn't wasted, but you don't need locking. You can power off when leaving the phone unattended, giving you the device lock.
I can't think of any good reason that your screen lock password should be weaker than your device password...
Your device password should be difficult for a machine to crack, since it is designed to protect against direct access of the flash hardware. That means it needs a strong password. The machine gets an unlimited number of attempts
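The gap between an on-device screen lock (rate-limited, few attempts) and an offline attack on the flash (unlimited attempts) is easy to quantify. Illustrative guess rates, assuming the attacker can run the key derivation at a million guesses per second:

```python
def crack_time_days(search_space: int, guesses_per_second: float) -> float:
    """Expected days to find the password, assuming on average
    half the search space must be tried."""
    return (search_space / 2) / guesses_per_second / 86_400

# 4-digit PIN vs. a random 8-character lowercase+digit passphrase,
# against an offline attacker making 1e6 guesses/second.
pin_space = 10 ** 4
passphrase_space = 36 ** 8

pin_days = crack_time_days(pin_space, 1e6)       # effectively instant
phrase_days = crack_time_days(passphrase_space, 1e6)  # a couple of weeks
```

This is why the parent's distinction matters: the disk password must survive unlimited machine guessing, while the screen lock only has to survive a throttled human-speed channel.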
Re: (Score:3)
Android's, even without the upgrades in L, is strong enough in many contexts and it'll be even better after L, but still not as strong as iOS.
Could you expand on that? FDE with a hardware keystore is pretty decent. What more is iOS doing?
Re:So what you're telling me (Score:4, Informative)
Android's, even without the upgrades in L, is strong enough in many contexts and it'll be even better after L, but still not as strong as iOS.
Could you expand on that? FDE with a hardware keystore is pretty decent. What more is iOS doing?
Unfortunately, I can't expand on it, not until L is actually released. Once the code is available I'll be happy to talk about the pros and cons, but it's not my place to reveal details of unreleased Android code, and I can't usefully discuss the differences without getting into the details. Ideally, a real discussion would be based on a deep understanding of the details of both. We'll probably never have that deep insight into iOS, but once L is released we can at least compare known details of L with good guesses about iOS.
Re: (Score:2)
Re: (Score:2)
I have a question about Android encryption: I've "encrypted" this morning, but I can't tell if it actually worked. Instead of an hour, it only took about 2 minutes. Is there a way to verify that the phone actually did anything?
What does it say under "Encryption" in Settings->Security? If it says "Encrypted", it is.
If you want confirmation, probably the best way is to enable USB debugging, install adb on a handy laptop or desktop, plug it in, reboot the device and run "adb logcat Cryptfs:V *:S" (the Cryptfs:V means show verbose logs from Cryptfs and the *:S means make everything else silent). Then read the log messages.
There may be a more user-friendly way, I don't know.
Re: (Score:1)
As a community we've always been skeptical of vaporware, especially when a lagging company announces vaporware in response to an innovator releasing a tangible product. Can we hold android to this standard?
A lagging company? The last I heard, it wasn't Android that automatically uploaded naked selfies to iCloud to only have them leak to the public. Also, please note that this encryption feature doesn't even address the original iOS problem. If your phone automatically uploads pictures by default to an insecure infrastructure, then it really doesn't matter if the copy you have on your phone is encrypted or not.
Re: (Score:3, Informative)
a couple corrections to your inaccuracies (intentional?):
* iphones back up automatically to icloud.
* the exploit in #celebgate #thefappening was taking advantage of weak passwords and/or reset questions. it's not that the infrastructure was insecure, it was user error in selecting weak passwords / reset questions.
* in response apple has widely rolled out two-factor. some people will always set their passwords to be '12345', but at least with 2FA being very easily accessible then people have less and less of
Re: (Score:2)
a couple corrections to your inaccuracies (intentional?):
You tell me. Are you intentionally ignoring the claims [appleinsider.com] from this security researcher? Or was your ignorance unintentional?
* iphones back up automatically to icloud.
This point needs some explaining. Which parts get backed up? Plus, I'm not sure how it contradicts what I've said already. Are you implying that the default is not to continue to upload pictures to iCloud once you've uploaded at least one?
* the exploit in #celebgate #thefappening was taking advantage of weak passwords and/or reset questions. it's not that the infrastructure was insecure, it was user error in selecting weak passwords / reset questions.
That's a pretty lame defense. Can you point to an analysis or an explanation to back that up? I've heard the same denial by the CEO of Apple on Charli
Re: (Score:2)
* iphones back up automatically to icloud.
This point needs some explaining. Which parts get backed up? Plus, I'm not sure how it contradicts what I've said already. Are you implying that the default is not to continue to upload pictures to iCloud once you've uploaded at least one?
the point is, you make it sound like celebrities were posting naked selfies on the internet and then got hacked. what happened was people took private pictures on their private phones, and assumed that because the phone was in their possession their private photos were safe. they didn't intend to make them accessible online. so stop trying to slutshame them.
* the exploit in #celebgate #thefappening was taking advantage of weak passwords and/or reset questions. it's not that the infrastructure was insecure, it was user error in selecting weak passwords / reset questions.
That's a pretty lame defense. Can you point to an analysis or an explanation to back that up? I've heard the same denial by the CEO of Apple on Charlie Rose, but I didn't believe it. Our infrastructure is secure, is not enough of an explanation.
the point is, the hack was due to a weakness in the security protocols, not a technical exploit of the servers or something. Also the hack was targeted
Re: (Score:2)
the point is, you make it sound like celebrities were posting naked selfies on the internet and then got hacked. what happened was people took private pictures on their private phones, and assumed that because the phone was in their possession their private photos were safe. they didn't intend to make them accessible online. so stop trying to slutshame them.
I was criticizing Apple, not the celebrities. Admittedly, I do not have an iPhone. I only heard second hand accounts of what the iPhone was doing with the pictures.
If what you're saying is correct about the automatic backup, then my original statement about Apple stands.
the point is, the hack was due to a weakness in the security protocols, not a technical exploit of the servers or something. Also the hack was targeted at these people. look up 'spear phishing' when you have a chance.
Finally, you're willing to admit there was some weakness in the security protocols. That's far better than what I've heard the current CEO of Apple admit to.
Yes, I know what 'spear phishing' means in the context of security. And yes, the hack
Re: (Score:2)
Reset questions - how to turn a semi-effective security system into something that can be hacked by anybody (even Matthew Broderick :-) ) willing to do simple research into your background using publicly-available information. Usually added by systems where the owner doesn't want to hire enough trusted staff to help with password reset.
Most of the time you're better off putting random noise in there that can't be guessed to disable the functionality. It gets really annoying when 2 or 3 typo'd attempts force
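The "random noise" approach to reset questions takes only a few lines (a hypothetical helper; the generated answer goes in a password manager, since its only job is to make the reset channel as strong as the password itself):

```python
import secrets

def reset_answer(n_bytes: int = 24) -> str:
    """Generate an unguessable 'mother's maiden name'.

    Unlike a real answer, this can't be recovered from public
    records, social media, or a quick web search."""
    return secrets.token_urlsafe(n_bytes)

answer = reset_answer()
```

`secrets` (rather than `random`) is the right module here because it draws from the OS CSPRNG and is intended for exactly this kind of security token.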
Re: (Score:2)
a couple corrections to your inaccuracies (intentional?): * iphones back up automatically to icloud.
By default only. Mine doesn't.
Re: (Score:2)
The iOS solution is already implemented in the new OS which is on 50%+ of all iDevices after just one week.
I see 47% and holding steady. [9to5mac.com]
Re: (Score:1)
The iOS solution is already implemented in the new OS which is on 50%+ of all iDevices after just one week.
I see 47% and holding steady. [9to5mac.com]
And just how many Android Devices are running the latest OS?
You really don't want to get into that particular pissing-contest, do you, Mr. Fandroid?
Re: (Score:2)
That devolves to the manufacturer & the service provider, both of whom will sit on the update until they manage to fill it with their own bloatware (A process that can take 6 months or more).
Re: (Score:2)
very interesting. this set of articles (the one you linked to and similar ones on other blogs) were all posted late yesterday, around the same time I posted my original comment. I was relying on earlier articles from a week ago saying that ios8 was at 40%+ and growing rapidly. the dangers of extrapolation!
Re: (Score:2)
for me, and I think for a lot of /. readers, the biggest issue is comparing a solution that currently exists to vaporware that may exist in the future. The iOS solution is already implemented in the new OS which is on 50%+ of all iDevices after just one week. In contrast, Google's solution is promised for the next version of Android which will be released on TBD. This version will be used by new devices but likely trickle back to just a small percentage of old devices.
As a community we've always been skeptical of vaporware, especially when a lagging company announces vaporware in response to an innovator releasing a tangible product. Can we hold android to this standard?
You mean something like supporting full-disk encryption, which I enabled on my tablet a couple years ago? If I remember correctly, Android has had full-disk encryption since 4.0, maybe even 3.0.
Re: (Score:2)
Apple has had FDE for a long time, but what has changed is that Apple used to have a LEO backdoor: they could unlock a phone when they physically had the phone in hand and a warrant was provided. I assume all phone makers have this loophole. What Apple did with iOS8 was close the loophole and take themselves out of the equation. It's actually very little difference for the end user, but a big eff u to the govt.
Re: (Score:2)
Billions of Android devices have the encryption capability already implemented. It just isn't turned on by default. Thus, it is not vaporware at all.
No, more like smoke and mirrors. Present but off is an illusion of security.
Re: So what you're telling me (Score:1)
Being off might as well be not existing as far as most users are concerned. It's the tyranny of the default.
Re: (Score:3)
The biggest question I have is the part about Google incorporating hardware-based security since Google controls the software, not the hardware. Are they now dictating that for Android to run a specific hardware protection setup must exist on the devices?
Not a specific hardware protection setup, no. Android defines an abstract API to hardware-backed crypto services, called keymaster, which device makers have to implement. They can implement it in whatever way they like, though AFAIK all of them currently use ARM TrustZone. And not all of them do implement it. If you'd like to find out if your device does go to Settings, then Security, then scroll down to the section on "Credential Storage" and see if it says "Storage type" is "Hardware-backed".
The documen
Re: (Score:2)
I wonder if the hardware based security can be used in addition to splitting the passphrase that mounts /data into the long phrase that unlocks the device, and the short PIN to unlock the screen. This way, even though there is protection against brute forcing similar to what Apple has, I am still packing my own parachute with a very long passphrase.
STOP THE VIDEO ADS SLASHDOT! (Score:1, Informative)
STOP THE VIDEO ADS SLASHDOT!
THEY EAT ALL MY (meager) BANDWIDTH AND RELOAD CONSTANTLY!
I CAN'T LEAVE A SLASHDOT TAB OPEN WITHOUT HEARING RANDOM SOUND 15 MINUTES LATER!
THIS SITE IS BECOMING UNUSABLE.
I'VE NEVER NEEDED THE REMOVE-ADS FEATURE AND NOW THAT I NEED IT I CAN'T FIND IT. HAS IT BEEN REMOVED?
I'VE NEVER USED ADBLOCK IN MY LIFE AND I'M GOING TO HAVE TO DOWNLOAD IT FOR SLASHDOT! NEWS FOR NERDS INDEED. MORE LIKELY I'LL JUST STICK TO REDDIT; I SEE THE SAME STORIES ON THERE DAYS EARLIER.
sTop it sTop IT stOP IT stOP i
Re: (Score:3, Informative)
Just get Adblock already. It will spare you bandwidth, and increase your security (and your sanity).
If you can't crack the password, then don't. (Score:2, Interesting)
Presumably, the apps on the phone have access to the decrypted data on the phone, right? So there's a simple solution. The user is happily using their iWhatever. The government sends a National Security Letter to Apple forcing them to put a backdoor into the target's phone, in the form of an app that can read whatever data it wants. So when the user boots up his/her phone and enters the password, the rogue app should be able to read all the data on the phone.
Can anyone tell me why this WOULDN'T work?
Re:If you can't crack the password, then don't. (Score:5, Interesting)
Presumably, the apps on the phone have access to the decrypted data on the phone, right? So there's a simple solution. The user is happily using their iWhatever. The government sends a National Security Letter to Apple forcing them to put a backdoor into the target's phone, in the form of an app that can read whatever data it wants. So when the user boots up his/her phone and enters the password, the rogue app should be able to read all the data on the phone.
Can anyone tell me why this WOULDN'T work?
Because National Security Letters [wikipedia.org] cannot be used for that. They can only be used by the FBI to demand the handing over of data in the possession of or passing through the control of the receiver, not the performance of actions (and how the data is produced is up to the company receiving the NSL, not the FBI).
Now what is in the Cloud is a different matter since Apple would have access to that, though again it may be encrypted with a key only the iDevice possesses so Apple wouldn't be able to decrypt it for the FBI.
Re: (Score:2)
They might not use an NSL, but I wouldn't count on it. The other blunt instrument the government has at its disposal is the
Authorization for Use of Military Force [wikipedia.org], which doesn't even mention surveillance or data and is about military force, but which the government has cited in its warrantless wiretapping when sued by the ACLU [wikipedia.org]. Kind of a stretch, but the government has long tried to get away with whatever it wants and let the courts rule on it later.
So I have no problem bel
Re: (Score:2)
I agree that Apple can't give an agency access to the device.
There is still some question around any icloud backup. You can lose a device and restore to a replacement. You can forget your password and go through the reset process. These two mechanisms tell us that in fact Apple could if pressured hand over an iCloud backup with the means to decrypt it, provided that they intercept the forgotten password process.
Of course there could be some legal reason why the agency cannot change the password. If inclined
Re: (Score:1)
Can anyone tell me why this WOULDN'T work?
If your phone does not have auto-update or -download enabled, or is not attached to iCloud at all, then there's no way for Apple to push apps down. While a lot of things can be done automatically on iOS for convenience, you can turn off a lot of it as well.
I'm sure there are potential baseband attacks as well, but I'm not sure how closely that chipset is linked with the main CPU in iPhones. Probably less closely than in most Android phones, since Apple uses their own CPUs.
Re: (Score:3)
It's impossible to cut a hardware vendor out of the trust system, unless you audit the hardware of your device. But set this aside.
This won't work because apps never see your password or have access to the decryption keys. The CPU itself doesn't have access to the decryption keys and do
no key needed when you have the data (Score:3)
I think you may have missed GP's point. The key protects the data. When the user enters the passphrase, the data is decrypted and apps can access all the data. Therefore, you don't NEED the key if you can put an app on the phone, then the user uses their phone. The encryption is useful only on a stolen or seized phone.
Re: (Score:1)
No, apps don't have access to all the data. That's exactly what iOS's sandboxing model prevents - it makes sure that apps can't read data arbitrarily across the device, instead, only the app's own data, and data that's been explicitly authorised for that app to view.
Re: (Score:2)
There is that, but if that were trustworthy you wouldn't need encryption. I assume, I think rightly, that any forensic app installed by Apple or a letter agency will have no trouble bypassing the sandbox. I've yet to see any sandbox model, on any OS, that didn't leak like a sieve. See Java and Flash for well-known examples of that approach.
Re: (Score:2)
Defense in depth.
Re: (Score:2)
There is that, but if that were trustworthy you wouldn't need encryption. I assume, I think rightly, that any forensic app installed by Apple or a letter agency will have no trouble bypassing the sandbox. I've yet to see any sandbox model, on any OS, that didn't leak like a sieve. See Java and Flash for well-known examples of that approach.
Idiot. It's called "defence in depth". The most basic knowledge of security will tell you to use multiple protections. Having three layers of security isn't an admission that the first two layers are unsafe, except to a clueless idiot. The third layer provides more security.
error in depth (Score:2)
I'm about to go into a meeting where we're going to design a security architecture for a government agency involved in cybersecurity. While designing it, I'm going to watch out for the error GP made by implication, which is:
Today's topic is encryption of data on mobile devices.
A method of bypassing the encryption was suggested.
GP (and you) essentially argue that the encryption doesn't need to be solid because sandboxing.
Next week, the same logic would argue that sandboxing doesn't need to be strong because
how I do it (Score:2)
Here's how I do it:
http://osxdaily.com/2014/02/16... [osxdaily.com]
Timtowtdi of course
Re: (Score:2)
Here's how I do it: http://osxdaily.com/2014/02/16... [osxdaily.com]
Timtowtdi of course
Yeah, finding a way that doesn't require the user to turn on Automatic App Download and you having full access to the Apple-ID account (not to mention the user not noticing the new icon with a blue dot before the name) sure would be better than this one.
Pretty sure Apple already has access (Score:2)
AC asked: How would apple or said letter agency install ...
Plumpaquatsch replied:
a way that doesn't require access to the Apple-ID account sure would be better
I'm fairly sure that Apple already has access to your Apple account, and the NSA or other three-letter agency can get access whenever they feel like it. Heck, I only work for a FOUR-letter agency, and I can personally access most Apple accounts.
Re: (Score:2)
Do
http://www.ibtimes.com/icloud-... [ibtimes.com]
you
http://securitywatch.pcmag.com... [pcmag.com]
think
http://www.thehothits.com/news... [thehothits.com]!
iCloud
http://www.businessinsider.com... [businessinsider.com]
is
http://www.troyhunt.com/2014/0... [troyhunt.com]
secure?
http://hollywoodlife.com/2014/... [hollywoodlife.com]
Three major hacks in the last few months, one by a preteen.
A blinded fanboi, I see (Score:2)
Google's security isn't awesome. Therefore, you reason, Apple's must be perfect, because you're a fan. Do you throw your panties on stage at Apple events?
When you grow up a little bit, you'll come to understand that a) fanaticism toward a company like Apple (or Google) just means you're easily manipulated by marketing and b) all companies, all products, and all services have limitations - especially security limitations. The fact that users routinely forget their passwords makes it extremely difficult to
Re: (Score:2)
Actually... the passphrase only decrypts the key that is used to protect the data.
Plus, each app is sandboxed and cannot access other apps' data (with a few controlled exceptions).
Plus, warnings are thrown up if your app starts trying to access things (Contacts, Microphone, Photos, etc.).
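The point that the passphrase only decrypts a key, which in turn protects the data, can be sketched as a two-level key hierarchy. The XOR wrap below is purely illustrative (real systems use an AES-based key wrap, e.g. RFC 3394):

```python
import hashlib
import os

def kek_from_passphrase(passphrase: str, salt: bytes) -> bytes:
    """Derive a 32-byte key-encryption key (KEK) from the passphrase."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)

def xor_wrap(kek: bytes, key: bytes) -> bytes:
    """Toy wrap/unwrap (XOR is its own inverse). Illustrative only --
    real systems use an AES-based key wrap, not XOR."""
    return bytes(a ^ b for a, b in zip(kek, key))

salt = os.urandom(16)
dek = os.urandom(32)                  # random key that actually encrypts the data
kek = kek_from_passphrase("hunter2", salt)
wrapped = xor_wrap(kek, dek)          # only this wrapped blob hits the flash

# Unlocking: same passphrase -> same KEK -> the DEK pops back out.
assert xor_wrap(kek_from_passphrase("hunter2", salt), wrapped) == dek
# A wrong passphrase yields garbage, not the DEK.
assert xor_wrap(kek_from_passphrase("wrong", salt), wrapped) != dek
```

A nice property of this design is that changing the passphrase only re-wraps the 32-byte DEK; the bulk data never has to be re-encrypted.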
What about iCloud? (Score:1)
Plus solving the brute-force problem, of course.
containment (Score:3)
Encryption can be rock solid -- the passphrase can still be sniffed.
Anything typed into a device that has connectivity is floating out there.
Re: (Score:1)
Depends; sites like the Huffington Post, Facebook, and the Guardian are actually able to record what you type before you submit it.
Re: (Score:2)
Those sites use JavaScript on webpages to upload what has been typed so far, so they can do predictions and make suggestions. Entering the phone's passcode or passphrase is a very different matter, since it isn't entered into a browser but into the phone OS's native interface. Still, as long as the software was created by someone else, in theory they can do anything they want with it, including, after using it to unlock the storage, storing the passphrase somewhere on the
Here's how it works (Score:3, Interesting)
The NSA (and other agencies) have noticed a significant drop in data, and an increase in the use of encryption/VPNs/proxies/Tor, since Snowden went public.
They realized more people were starting to take care with their data, so how to fix (read: stop) it?
OK, first have the NSA-compliant corps (Apple, Google, Facebook, Twitter) code some "encryption" made out of tissue paper, then send out the FBI's (and other agencies') talking heads to publicly denounce this "encryption" as though they were seriously concerned.
Now people who think they have encrypted their devices and are safe will once again become complacent.
But the real story is even more absurd: the fact that the average person believes they are of any interest to anyone but marketers.
Re: (Score:2)
Seems like a very risky strategy. Android and iOS encryption have already come under a lot of scrutiny, not least from companies that make software to extract data for law enforcement. If there were weaknesses there is a good chance they would be found, as they were when problems like Goto Fail were discovered.
Even if the encryption is compromised, it would be effective against corrupt law enforcement agencies. The FBI isn't going to start cracking iOS encryption and then going to court with the evidence be
Re: (Score:1)
Seems like a very risky strategy
Like hiring outside contractors while you violate the letter of the law and gather up everyone's Internet and phone communications, including sticking backdoors into firmware, hardware, and major brands of software? Some of said brands were working with the NSA.
Risky like that?
Think about it.
Re: (Score:2)
But the real story is even more absurd, the fact that the average person believes they are of any interest to anyone but marketers.
Nobody is of interest until something happens that makes you committed.
Re: (Score:1)
And how likely is that for the average person?
Re: (Score:1)
Cool. I don't know about iOS for sure, except that iOS 8 fucking lags on an iPhone 4S, so I have to assume it's doing *something*.
In addition, enabling encryption on my i9300 (android, of course), led to tangible lag in device usage. If you're going to make "tissue paper" encryption, you'd at least omit the lag, surely? Not that I'm fully refuting your claim -- just saying that if it's true, someone went to a lot of effort to waste cpu cycles so it seems as if there's something happening.
Re: (Score:1)
I have no proof of what I say, I am basing everything off previous experience and the obvious "conspiracy" between governmental agencies and certain corporations.
I would bet a large amount of money that I am close to, or right on top of the truth of the matter.
Your mileage may vary.
Re: (Score:1)
Like I say, I'm not disputing that there are a bunch of nefarious fucks trying to run the world; just that, if iOS and Android encryption are bunk, they either went to a great deal of effort to make them resource-intensive or they just plain hired a bunch of PHBs, pointed them at Scratch and said "go code encryption, 'cos you can!". Because seriously, Apple's planned obsolescence is working, and I had to turn off encryption on my S3 after I started getting the urge to smash it.
Re: (Score:1)
Additionally, consider Apple's security track record of late (goto fail?), and frankly I don't think Google/Android were ever designed to keep things private.
What about the data? (Score:1)
Encryption is only one part of the announcement. Apple also said that they're not going to sell your data, for the most part. What did google say about that?
Re: (Score:1)
Encryption is only one part of the announcement. Apple also said that they're not going to sell your data, for the most part. What did google say about that?
Google is an advertising company. Of course they're going to sell your "anonymous" data.
Dead-man switch required. (Score:1)
What's now required in laptops and mobile phones is a "dead-man" switch: if the user stops interacting after a certain period of time, the device switches off and all crypto keys in memory are lost. That may not be convenient for mobile phones, but if you look at how the Dread Pirate Roberts was caught (in a library, by FBI agents determined not to let him turn off the laptop), then a dead-man switch (software or hardware) could have shut his laptop down before they had time to dump its RAM to disk or sim
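A software version of such a dead-man switch can be sketched as a resettable timer that zeroizes in-memory key material on inactivity. The `KeyHolder` class and the timeout value here are assumptions for illustration, not any shipping design:

```python
import threading
import time

class KeyHolder:
    """Holds key material in RAM and wipes it after a period of inactivity."""

    def __init__(self, key: bytes, timeout_sec: float):
        self._key = bytearray(key)   # mutable so it can be overwritten in place
        self._timeout = timeout_sec
        self._timer = None
        self.touch()                 # arm the countdown immediately

    def touch(self):
        """Call on every user interaction to reset the countdown."""
        if self._timer is not None:
            self._timer.cancel()
        self._timer = threading.Timer(self._timeout, self._wipe)
        self._timer.daemon = True
        self._timer.start()

    def _wipe(self):
        # Overwrite, don't just drop the reference: the goal is that a
        # RAM dump taken after the timeout finds only zeroes.
        for i in range(len(self._key)):
            self._key[i] = 0

    @property
    def key(self) -> bytes:
        return bytes(self._key)

holder = KeyHolder(b"\xaa" * 32, timeout_sec=0.2)
time.sleep(0.5)                      # user walks away (or is walked away from)
assert holder.key == b"\x00" * 32    # key material is gone
```

In a real design the wipe would also have to cover OS page caches and any copies the crypto layer holds, which is why it is usually paired with powering the device off entirely.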
I tried device encryption on CM11 nightlies... (Score:1)
the other day. Here's what happened:
1) Performance sucked ass, despite reports to the contrary (i9300 -- I know it's no G3, but hey, it should damn well be enough at quad-core 1.2 GHz with a gig of RAM)
2) My TWRP restore didn't include my home partition so I lost all data on there. Sucks to be me.
I'd welcome this if it didn't come with the massive lag that I experienced on a device which is normally quite sprightly. I get that encryption doesn't come for free, but adding 1-3 seconds of lag to every tap is not, in my
The Android key is at risk on running devices (Score:2)
Try changing it with vpc: you are NOT asked for your old password.
With LUKS, for example, this is not possible: the Linux kernel does not give you the password of the unlocked device, which is needed to re-encrypt the master key under the new password.
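The LUKS behaviour comes from how its keyslots work: the master key is stored wrapped under a password-derived key, so changing the password means unwrapping with the old password and re-wrapping with the new one. A minimal sketch (the XOR wrap, fixed salt, and low iteration count are simplifications; real LUKS uses per-slot random salts, heavy stretching, and AES):

```python
import hashlib

def kdf(password: str) -> bytes:
    # Fixed salt and modest iteration count for brevity; real designs
    # use per-slot random salts and much heavier stretching.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), b"slot-salt", 50_000)

def wrap(master_key: bytes, password: str) -> bytes:
    """Toy XOR wrap; XOR is its own inverse, so this also unwraps."""
    return bytes(a ^ b for a, b in zip(master_key, kdf(password)))

def change_password(wrapped_slot: bytes, old_pw: str, new_pw: str) -> bytes:
    master = wrap(wrapped_slot, old_pw)  # unwrap: the OLD password is required
    return wrap(master, new_pw)          # re-wrap under the new password

master_key = b"\x42" * 32
slot = wrap(master_key, "old-password")  # what's stored on disk
slot = change_password(slot, "old-password", "new-password")
assert wrap(slot, "new-password") == master_key
```

By construction there is no way to write `change_password` without the old password: the master key cannot be recovered from the slot otherwise, which is exactly the guarantee the parent says Android's on-device password change appears to lack.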
Re: (Score:1)
There have been backdoors in consumer electronics since the 90's.
TPM is no exception.
Re:Backdoor in TPM chips? (Score:4, Interesting)
There are a handful of companies able to do this, and a handful of well-funded amateurs with the capability to do this for 80's video game chips that everyone is desperate to pay £10k just to acquire (let alone emulate).
If you keep up with any of the MAME scene, the decapping projects are few and far between and pretty much one guy rules them all and it can take YEARS to decap a chip, and years more to understand what it does. On commercially available stuff from the 80's.
Security chips, etc. generally fall to statistical analysis and modern computing power, not someone looking inside the chip. And the keys in a TPM chip won't be stored where they can be "seen" while decapping - that reveals only the structure and to find out where to hook into in a complex encryption chip is quite a skill. To do so without disrupting the chip's operation (even if it has no security) is difficult. To extract actual useful data, even more so.
I'm not saying it couldn't happen, if a case came up where the military or NSA needed to get access to something stored on a TPM chip. But it's not something they'll be doing routinely. And you can "force" chip manufacturers all you like - most of the stuff we use does NOT originally come from the USA or from US-influenced countries. I'm not saying they're not compromised too, in some fashion, but it's not as easy as saying "We are the NSA, put in a backdoor". Most countries will tell you where to go, and get you into the international press for even trying (it could even be considered an act of war if you tried that on the Chinese, for example).
Things aren't as simple as "everything is backdoored", "acres of supercomputers", "listening to every phone call", etc. That's all hyperbole. And it works on exactly the people it's intended to - the general public who have nothing of import to hide, and thus feel reassured that they're safe from nasties / scared that shouldn't become a nasty in case they get caught.
Any encryption you might use as a consumer is NOT intended to defeat a well-funded military adversary with reason to decrypt it. That might be possible, it might even be what encryption was invented for, but that's NOT the use-case of 99% of encryption out there. Don't think it is. Hell, we had the world's most popular SSL library have a flaw in it for 10 years and NOBODY NOTICED.
If anything, elliptic-curve cryptography is the weak link. We're being forced onto it by being told everything else is weak. It's not as well-researched or understood as the algorithms that have been attacked for nearly 40 years. Implementations are few and based on published curves. And there's NOTHING being said about our move to it.
If anything, when you think the trick is happening, it's already been done.
Re: (Score:3, Interesting)
If you search YouTube for "Black Hat DC 2010: Hacking the Smartcard Chip", you'll see a series of eight videos where a guy shows how he spent six MONTHS decapping TPM chips, finding ways around their layers of security, and eventually gaining access to the core's memory pins where everything is unencrypted. He was eventually able to come up with a process that takes only hours to tap a few pins into the TPM and dump everything.
The kind of backdoor I think is built into these chips is secret undocumented i
Re: (Score:2)
I want to address this statement here, because I feel it's misleading.
Elliptic curve cryptography has been researched since the 1980's. It is not at all crazy or new. It is th
Re: (Score:2)
Though I agree with your post, and am aware of the majority of it but maybe couldn't reel off the exact curve names, there are problems here.
There are a number of areas where EC is being "forced".
Perfect Forward Secrecy is one (and one you hadn't heard mentioned in general IT circles). Yes, there is a non-EC way to do it, but it's horrendously slow compared to the alternative; in practice the only way to get PFS is to use EC. We're being told to abandon RSA and move to EC, in no uncertain terms in other areas. Bu