
Researchers Find Big Leaks In Pre-installed Android Apps

An anonymous reader sends this quote from an article at Ars Technica: "Researchers at North Carolina State University have uncovered a variety of vulnerabilities in the standard configurations of popular Android smartphones from Motorola, HTC, and Samsung, finding that they don't properly protect privileged permissions from untrusted applications (PDF). In a paper just published by researchers Michael Grace, Yajin Zhou, Zhi Wang, and Xuxian Jiang, the four outlined how the vulnerabilities could be used by an untrusted application to send SMS messages, record conversations, or even wipe all user data from the handset without needing the user's permission. The researchers evaluated the security of eight phones: the HTC Legend, EVO 4G, and Wildfire S; the Motorola Droid and Droid X; the Samsung Epic 4G; and the Google Nexus One and Nexus S. While the reference implementations of Android used on Google's handsets had relatively minor security issues, the researchers were 'surprised to find out these stock phone images [on the devices tested] do not properly enforce [Android's] permission-based security model.' The team shared the results with Google and handset vendors, and have received confirmation of the vulnerabilities from Google and Motorola. However, the researchers have 'experienced major difficulties' in trying to report issues to HTC and Samsung."
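The class of bug described here is often called a capability leak or permission re-delegation: a pre-installed app that holds a dangerous permission exposes an unprotected entry point, so an app holding no permissions at all can ask it to do the privileged work on its behalf. As a rough, hypothetical sketch of that pattern (the class, action, and extra names below are invented for illustration and are not taken from any of the tested phones), a leaky pre-loaded app might ship something like this:

    import android.content.BroadcastReceiver;
    import android.content.Context;
    import android.content.Intent;
    import android.telephony.SmsManager;

    // Hypothetical pre-installed app: its manifest declares android.permission.SEND_SMS
    // and exports this receiver without requiring any permission from the sender.
    public class SmsProxyReceiver extends BroadcastReceiver {
        @Override
        public void onReceive(Context context, Intent intent) {
            String dest = intent.getStringExtra("dest");
            String body = intent.getStringExtra("body");
            if (dest == null || body == null) return;
            // Privileged action performed on behalf of an unverified caller.
            SmsManager.getDefault().sendTextMessage(dest, null, body, null, null);
        }
    }

    // An untrusted app that declares no permissions at all can still trigger the send:
    //   Intent i = new Intent("com.example.vendor.SMS_PROXY");   // invented action string
    //   i.putExtra("dest", someNumber);
    //   i.putExtra("body", someText);
    //   context.sendBroadcast(i);

The fix is equally unglamorous: don't export the component, or guard it with an android:permission attribute (or an explicit permission check on the caller) before doing the privileged work.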
  • by Anonymous Coward

    How about a link to the quoted article?

  • Cyanogenmod (Score:5, Insightful)

    by Skarecrow77 ( 1714214 ) on Friday December 02, 2011 @01:05PM (#38240342)

    What does it say when I trust a bunch of random coders on the internet to give me a better performing, more secure, and overall more pleasing experience with my smartphone than the company that created it?

    • by NatasRevol ( 731260 ) on Friday December 02, 2011 @01:07PM (#38240374) Journal

      That they stood on the shoulders of giants, and combed their hair?

    • Re:Cyanogenmod (Score:5, Interesting)

      by iluvcapra ( 782887 ) on Friday December 02, 2011 @01:13PM (#38240460)

      People who own and use phones have a greater incentive to make a good phone OS than people who sell and provide service to phones.

      • Re:Cyanogenmod (Score:5, Insightful)

        by clarkn0va ( 807617 ) <apt.get@NosPAm.gmail.com> on Friday December 02, 2011 @02:33PM (#38241654) Homepage

        You're right, and what a sad statement that is about the current state of affairs, when a group of companies can treat their consumer base with something between indifference and contempt and yet continue to profit from them.
      • Re:Cyanogenmod (Score:4, Insightful)

        by jasno ( 124830 ) on Friday December 02, 2011 @02:53PM (#38241934) Journal

        Look, the people who develop the phones use them too. The reality is that there just aren't that many smart, motivated, capable engineers out there. Even when you have a few alpha-engineers on a team, their time is usually spent trying to squash those hard-to-fix bugs instead of doing a thorough security analysis. They're rushing to get the damn thing to production so they can move on to the next big thing.

        I've spent my career developing embedded applications and not once has anyone paid me to address security. Bugs - user experience issues, stability problems, content security, standards compliance - those get the money. No one in management values security or privacy and they won't unless security researchers and hackers make the consumer aware of it.

        • Re:Cyanogenmod (Score:4, Interesting)

          by jasno ( 124830 ) on Friday December 02, 2011 @02:56PM (#38241978) Journal

          Actually - I wonder if there is a certification agency for security/privacy? I've never heard of it, but if someone like the EFF got together with a testing lab and established a logo-certification program for various classes of devices (phones, operating systems, set-top boxes, networking equipment, etc.) you'd have a way for the consumer to evaluate security and make decisions accordingly.

    • by Drew M. ( 5831 )

      What does it say when I trust a bunch of random coders on the internet to give me a better performing, more secure, and overall more pleasing experience with my smartphone than the company that created it?

      We refer to that as the "Open Source" development model.

      • by Anonymous Coward
        I believe it means you have misplaced trust. You can't trust random internet coders either. Do you know who writes most malware? Yes, some is written by corporations - see Sony's rootkit and the current Carrier IQ affair. But most is written by random coders on the internet. Me? I don't trust the company that made my phone. I don't trust random coders on the internet either.
    • That was my first thought - are privacy and commercial success mutually exclusive for mobile devices? It seems that once a mobile OS is adopted by a large market the sleazebags move in to load it up with shovelware that siphons off your personal data (in the case of Android, that includes the carriers and even the manufacturers). Meanwhile the geek-oriented OSes (custom Android builds, Maemo/MeeGo, Ubuntu Mobile, etc) running open-source apps with funny names exist happily with no problems.

    • by blair1q ( 305137 )

      It means they can root your phone so you can install their, um, rootkit, essentially....

    • What does it say when I trust a bunch of random coders on the internet to give me a better performing, more secure, and overall more pleasing experience with my smartphone than the company that created it.

      Except that the phones from the company that created the OS, Google, didn't have any security issues. CM is cool, but it's less secure than a standard Google build, not more. Although it probably is more secure than Android as delivered by the carriers.

    • Would using this software have protected a person against Carrier IQ's rootkit? I don't know anything about smart phone tech. I was actually thinking of getting one until this story broke. If I can gut the software and use something like CyanogenMod to protect myself against privacy abuse, that would be golden.

    • That you realize random coders on the internet are more likely to care about the end results of their product than a monolith like Samsung?

  • We need automated tools to catch obvious security errors in software much like grammer and spelling checks in Word processors.

    The use of automated source code review tools should become more popular, especially as a well-linked resource from inside SourceForge and other sites that promote software development. Based on the number of security vulnerabilities so frequently found in software, there's got to be some signature-based checking that could catch the common mistakes, which could be made available by
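    For what it's worth, a "signature-based" check of the kind described above doesn't have to be sophisticated to catch the most common manifest-level mistakes. Here is a rough sketch, assuming plain Java and the standard XML APIs (the class name and the heuristic are invented, and it deliberately ignores intent-filters and anything that needs real data-flow analysis):

        import java.io.File;
        import javax.xml.parsers.DocumentBuilderFactory;
        import org.w3c.dom.Document;
        import org.w3c.dom.Element;
        import org.w3c.dom.NodeList;

        // Flags Android components that are exported but declare no permission of their own.
        public class ManifestLint {
            private static final String NS = "http://schemas.android.com/apk/res/android";
            private static final String[] COMPONENTS = {"activity", "service", "receiver", "provider"};

            public static void main(String[] args) throws Exception {
                DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
                dbf.setNamespaceAware(true);  // needed to read android:* attributes
                Document doc = dbf.newDocumentBuilder().parse(new File(args[0]));  // AndroidManifest.xml

                for (String tag : COMPONENTS) {
                    NodeList nodes = doc.getElementsByTagName(tag);
                    for (int i = 0; i < nodes.getLength(); i++) {
                        Element e = (Element) nodes.item(i);
                        boolean exported = "true".equals(e.getAttributeNS(NS, "exported"));
                        boolean guarded = !e.getAttributeNS(NS, "permission").isEmpty();
                        if (exported && !guarded) {
                            System.out.println("WARNING: exported, unguarded " + tag + ": "
                                    + e.getAttributeNS(NS, "name"));
                        }
                    }
                }
            }
        }

    Real tools go much further than this, but even a crude scan like this one flags the "exported but unguarded" components that capability leaks tend to ride on.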

    • by Anonymous Coward

      > grammer

      Spill cheque woks grate!

    • They exist (though they're extremely immature at the Android end of the spectrum), but they're breathtakingly expensive. I'm not allowed to cite specific products or prices, but we're talking "annual licensing fees comparable to the salary of a full-time human employee for 3-6 months" expensive.

      • The static analyser in clang is free and would catch several of the things that people who R'd TFA say were mentioned.
        • Unless something has massively changed in the very recent past, Clang doesn't analyze Java (directly). EVEN IF you somehow managed to coax gcc into outputting something Clang can analyze, it's questionable whether the output would even be relevant. At best, you'd be scanning for vulnerabilities due to bugs in Android itself (which obviously exist, but are unlikely to be easily found as low-hanging fruit). At worst, you'd be completely wasting your time looking for vulnerabilities that might hypothetically e

  • Carriers (Score:3, Insightful)

    by bonch ( 38532 ) on Friday December 02, 2011 @01:10PM (#38240420)

    The lack of control the carriers have over iOS is just one of the reasons I prefer it over Android. They wanted to pre-install a bunch of junk on the iPhone, and Apple wouldn't have it. The difficulty reporting these vulnerabilities to HTC and Samsung is not surprising.

    • by robmv ( 855035 )

      Not the iOS vs. Android mantra about carrier-installed crap again. Do you want a new, clean phone? Buy it unlocked, without carrier intervention. Expensive? Need financing? Use your credit card's financing, problem solved. This has worked since the older smartphone generations; I used Nokia and Sony Ericsson phones before the Nexus One.

    • by Ihmhi ( 1206036 )

      Sure, but what about when some sort of vulnerability is found in iOS? It's not like Apple is somehow magically invulnerable to software security issues. At least with a lot of Android phones you can do something about it without getting too much shit (if any) from your carrier.

      • iOS is not any less vulnerable than Android when it comes to security. This specific issue highlights that the vulnerabilities can occur because the manufacturer screwed up its Android implementation. Apple is less likely to do so, as they control the hardware and software. Also, with as much profit as Apple makes, they have no excuse. Android manufacturers, on the other hand, might lack QC because they lack money, or because they really need the revenue generated by crapware.
        • Apple is less likely to do so as they control the hardware and software.

          I would agree that they have more incentive to make sure things work, and they have more liberty to make the necessary changes, but these don't always translate into better outcomes. They have so far, though.

      • by Karlt1 ( 231423 )

        Sure, but what about when some sort of vulnerability is found in iOS? It's not like Apple is somehow magically invulnerable to software security issues. At least with a lot of Android phones you can do something about it without getting too much shit (if any) from your carrier.

        Every iOS device introduced after June 2010 is supported with OS updates and security fixes from Apple. Can Android users say the same?

        Let's see....
        Android
        1. A security vulnerability is found
        2. Google patches the vulnerability
        3. And

  • by mtrachtenberg ( 67780 ) on Friday December 02, 2011 @01:18PM (#38240534) Homepage

    I hope all of the people thinking it would be very cool and convenient to vote via smart phones (or the internet, or the telephone, or the mail system) will notice that smart phones might not yet be perfect.

    Voting is a classic example of a situation where the requirements cry out for appropriate technology.

    The requirements are unique: you must not be able to prove how you voted, you must not be able to sell your vote or be coerced by anyone, you should be able to have complete confidence that your vote was counted properly along with everyone else's.

    The technology that is required is completely straightforward -- people have to go to protected locations, create physically countable and non-traceable artifacts that represent their uncoerced opinions, deposit these artifacts into a locked box at the location, and know that the contents of the locked box are properly reflected in the results.

    The best way to accomplish the last step is to count the contents in public before they are moved, and to generate and digitally sign images of the artifacts so that anyone who wants to confirm that your count accurately reflects the contents can do so (a minimal sketch of the signing step follows below).

    All attempts to modernize voting for convenience's sake are misguided. All arguments for making a simple approach more complex just to speed up the distribution of results are misguided. Something that is convenient but cannot be checked is not appropriate for voting. And any time a computer scientist tells you how secure something is, introduce them to real people and the way they protect their passwords.
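    As a minimal sketch of the signing step mentioned above, using only the standard java.security API (key management, the scanner itself, and publication of the images are assumed and out of scope):

        import java.nio.file.Files;
        import java.nio.file.Paths;
        import java.security.KeyPair;
        import java.security.KeyPairGenerator;
        import java.security.Signature;
        import java.util.Base64;

        // Signs a scanned ballot image so anyone holding the precinct's public key can
        // later verify the image was not altered after the close of polls.
        public class BallotImageSigner {
            public static void main(String[] args) throws Exception {
                // In practice the signing key would come from secure storage at the precinct;
                // generating one here just keeps the sketch self-contained.
                KeyPair keys = KeyPairGenerator.getInstance("RSA").generateKeyPair();

                byte[] image = Files.readAllBytes(Paths.get(args[0]));  // path to the scanned image

                Signature signer = Signature.getInstance("SHA256withRSA");
                signer.initSign(keys.getPrivate());
                signer.update(image);

                System.out.println(Base64.getEncoder().encodeToString(signer.sign()));
            }
        }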

    • by Dr_Barnowl ( 709838 ) on Friday December 02, 2011 @01:42PM (#38240872)

      The appropriate technology for voting is a pencil.

      Anything mechanized or computerized might be splendid, efficient, and offer a whole host of other benefits, but it lacks the absolutely vital feature: the average man on the street must be able to audit it. And verily, he should be required to do so.

      Making a voting system where only a limited set of technocrats can audit its veracity is madness.

      • Even if the voting machine is a pencil, as long as the counting machine is a computer, we run into the same issue. Sure, it can be audited, but that's not going to happen the majority of the time.

        • And that's why, in addition to hand counting of the ballots at the precinct, there ought to be (at least) a digital image backup enabling complete redundant counting from the images.

          The best approach would be to generate the image collection, on unchangeable media, at the precinct at the close of polls. This should probably be generated from an independent scanning station, so that the ballots can be shuffled prior to scanning. This copy should be created at the precinct, because that is where workers can

        • If you machine-count all ballots and randomly hand-count some segments of them, you catch errors in the counting.
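          A sketch of that spot-check idea, with invented names and a deliberately naive sample size (a real risk-limiting audit would choose the sample size statistically):

              import java.security.SecureRandom;
              import java.util.ArrayList;
              import java.util.List;
              import java.util.Map;

              public class SpotCheck {
                  // Pick a random sample of precincts to hand-count.
                  public static List<String> pickPrecincts(List<String> all, int sampleSize) {
                      SecureRandom rng = new SecureRandom();  // publicly verifiable randomness (e.g. dice) in practice
                      List<String> pool = new ArrayList<>(all);
                      List<String> sample = new ArrayList<>();
                      while (sample.size() < sampleSize && !pool.isEmpty()) {
                          sample.add(pool.remove(rng.nextInt(pool.size())));
                      }
                      return sample;
                  }

                  // Precincts whose hand count disagrees with the machine count.
                  public static List<String> discrepancies(Map<String, Integer> machine, Map<String, Integer> hand) {
                      List<String> bad = new ArrayList<>();
                      for (Map.Entry<String, Integer> e : hand.entrySet()) {
                          if (!e.getValue().equals(machine.get(e.getKey()))) bad.add(e.getKey());
                      }
                      return bad;
                  }
              }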

      • by blair1q ( 305137 )

        The only appropriate technology for voting is raising your hand. Any device such as a ballot that can be locked in a box and carted into a locked room to be counted by strangers is eminently hackable.

        The secret ballot is the single greatest threat to verity in elections since we stopped allowing the outright buying of votes.

      • by elsurexiste ( 1758620 ) on Friday December 02, 2011 @03:11PM (#38242242) Journal

        Let's be honest: the average man can't audit anything. In the end, it's more about trust than technology.

        Can I trust that no one will fold the ballot in a certain, unique way that would allow someone to tell it apart? Can I trust that no one will add a doodle that will equally provide a "signature"? If I can't, then I must admit there are ways to prove how someone voted.

        Can I trust that no one will use the signatures described above to identify a voter and pay or coerce them? Can I trust that everyone will uphold the secrecy? If I can't, then I must admit that votes may be up for sale or manipulation.

        Can I trust that no one will miscount? Can I trust that the people counting are impartial and not subject to coercion? Can I trust that, even if I'll never be present at the counting and audit the system myself, it will be carried out perfectly? If I can't, then I must admit that the whole counting thing will eventually be rigged.

        There's only one reason an average man on the street trusts the system (if he does): it's familiar. Just like his trust in HTTPS, credit cards, or the expiration date on his food. Voting regulations earn the trust of Average Joes and Janes because they are familiar with those measures and can somewhat understand how they are supposed to prevent rigging, not because they are effective (this is true in a lot of situations; the TSA comes to mind). If people come to trust electronic voting systems, then those will become the appropriate technology.

        I'm sick and tired of hearing "You can't be 100% sure of X with electronic voting systems! The whole system is crap!" or "Aha! The 7th step in your chain of validations can be manipulated! The whole system is crap!". Well, it isn't. Look at elections worldwide: they are done with pencil and paper, yet everyone says they are rigged, regardless of international (and supposedly impartial) auditing. Regardless of analysis. Just because people don't trust them.

        We can't, therefore, judge a voting system just on how inexpugnable it is: the only thing we can do is put enough checks and barriers in place to make it really hard to break the main requirements, run enough information campaigns to explain in layman's terms what's going on, and friggin' trust the outcome. We are losing some great stuff (i.e. precision and accuracy) just because we demand things we never had and never will.

        Now, let the /. crowd proceed to mod me down. But before that, here's my ad hominem: your comment is group-think at its finest. Only a few people bring good arguments to the /. table nowadays; the rest just repeat whatever the consensus is and are happy to maintain the status quo. Use your friggin' brain and don't follow the herd.

        • I think you're right that auditing is a non-solution. But that only means that the counts must be easily verifiable by average persons, when motivated. The trick is to provide a solution where the average motivated person has access to enough information -- correct information -- to confirm things for themselves. The glories of redundancy will then take over, because there will often be people who are inclined to check.

          As long as the signed ballot images can be confirmed to match the ballots -- and that

          • Well, after reading your response, my mind is at ease (I also took a nap and reached my WHO-recommended 6 hours of sleep :P ).

            On the question of how easy it is to verify, I really don't know. Not because I can't come up with ideas, but because I don't know what the Average Joes and Janes would find acceptable. Maybe distributing a photo of all the motherboards involved, and using acrylic cases to display them on election day, is enough to earn their trust. Of course, it won't do for us, tech-savvy people,

      • You should look into Scantegrity, developed by security researchers David Chaum and Ron Rivest (the latter is the 'R' in RSA).

        It is an automated, scanner-based voting system which is more secure than pencil and paper systems, precisely because it's more auditable. It actually enables each voter to verify that his or her ballot was counted correctly in the final tally -- but without giving the voter the ability to prove who they voted for to anyone else, to eliminate issues of vote coercion. The system a
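        To make the "verify without being able to prove" idea a bit more concrete, here is a very loose sketch of the confirmation-code mechanism. This is only an illustration of the general idea, not the actual Scantegrity protocol, which additionally commits to the code/candidate mapping and audits it publicly:

            import java.security.SecureRandom;
            import java.util.HashMap;
            import java.util.Map;

            public class ConfirmationCodes {
                private static final SecureRandom RNG = new SecureRandom();

                // Each ballot gets an independent random code per candidate, so a code by
                // itself says nothing about which candidate it belongs to.
                public static Map<String, String> codesForBallot(String[] candidates) {
                    Map<String, String> codes = new HashMap<>();
                    for (String c : candidates) {
                        codes.put(c, String.format("%04d", RNG.nextInt(10_000)));
                    }
                    return codes;
                }

                public static void main(String[] args) {
                    Map<String, String> ballot = codesForBallot(new String[] {"Alice", "Bob"});
                    String receipt = ballot.get("Alice");  // the code revealed for the voter's choice
                    // The authority later publishes, per ballot serial, the codes that were cast.
                    // The voter checks that this code appears for their serial; since codes are
                    // random per ballot, the receipt alone does not reveal the vote.
                    System.out.println("Keep this code and check it in the published tally: " + receipt);
                }
            }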

    • by blair1q ( 305137 )

      All voting systems have holes.

      Ballot boxes can be stuffed. The counting room can be infiltrated by partisans. The entire process can be a sham run by the state's Secretary of State.

      And the issues on which you base your votes can be complete bullshit designed to distract you from the true issues on which you should be making your decisions.

      Nobody said plural voting wasn't a logical fallacy in the first place. It's just better than letting a guy make the decisions because he killed the last guy who made th

    • Voting should not be just another phone app. It should take a deliberate effort from the voter, not be something you can do while sitting on the crapper.

      It's absolutely scary how people vote now. They blindly vote along party lines and have no clue who the candidates are, their history or even their position on key issues (not the fluff crap the media tells you is important). You'd see a radical difference if the ballots just listed names and did away with the little (D) and (R) labels. Don't even put a descri

  • if (x <  0) { do_stuff();       exit(0); }
    if (x == 0) { do_other_stuff(); exit(0); }
    if (x >  1) {
        /* ... establish restrictions ... */
        perform_secure_operation();
    }
    /* ... */

    So... what happens when x == 1?

  • by ThatsNotPudding ( 1045640 ) on Friday December 02, 2011 @01:38PM (#38240814)

    However, the researchers have 'experienced major difficulties' in trying to report issues to HTC and Samsung.

    No problem. Just repeat your findings into one of their phones: they'll literally get the message via CarrierIQ.

  • by Anonymous Coward

    A year ago I was excited about Android. Today I would not touch it.

    • by wierd_w ( 1375923 ) on Friday December 02, 2011 @02:25PM (#38241552)

      The real problem with Android is that handset makers release closed-source binary drivers.

      This creates a powerful barrier to entry for ROM hackers like the Cyanogen team.

      Personally, I would like to see Google smack some bitches by demanding either open-source drivers only, or feature-complete whitepapers for every device with closed drivers intended for the Android platform.

      This would create a permanent hole in the current software lockdowns carriers and handset makers use.

      My own phone, a Samsung Sidekick 4G, is basically a Galaxy-series device inside, but it is not supported by Cyanogen because of binary driver issues and a not-fully-documented CPU variant. I would very much like to ditch the stock ROM, stop relying on cooked ROMs based on it, and finally get something newer than Froyo with a facelift.

      Requiring open drivers or feature-complete whitepapers would fix that.

      • by tlhIngan ( 30335 )

        Personally, I would like to see Google smack some bitches by demanding either open-source drivers only, or feature-complete whitepapers for every device with closed drivers intended for the Android platform.

        This would create a permanent hole in the current software lockdowns carriers and handset makers use.

        Won't work. If Google imposes it as part of "With Google", the only chips available would be ones with open-source drivers. And there are very few of those, even fewer with 3D accelleratio

        • I didn't say the license needed to be GPL, just open.

          As for "we don't want another xda-dev sprouting up", that strikes me as the rhetoric of a dinosaur. If they said "we don't want to be put in the position where we might be forced to support 3rd party modifications" I would be sympatheric, but when they essentially epoxy the hood shut on my sportscar, because they "don't want another community-mechanics site popping up" I don't see them as anything but officious asshats.

      • The real problem with Android is that handset makers release closed-source binary drivers.

        While that may be a real problem, it's got nothing to do with the problem discussed in this paper.

        • On the contrary.

          A fully open system is more easily audited. A more easily audited system is harder to hide garbage code in.

          The carriers and handset makers are putting quick and dirty kludges in effect, because they want to race to market, and they feel they can get away with it.

          If you make it harder for them to feel more secure (emotionally) by hiding dirt under the rug, they will make themselves feel more secure by actually sweeping up, even though that is work they apparently don't want to do.

          Further, pro

          • On the contrary.

            A fully open system is more easily audited. A more easily audited system is harder to hide garbage code in.

            Read the paper. All the code that was examined to find these issues was Dalvik bytecode, which is very easy to analyze and audit. In this case it was probably easier to audit at that level than to audit the source.

    • There. Fixed that for you.

  • So, if I never agreed to the permissions, how can I disable their use?

    And don't answer with 'root'. Rooting is not an option.

    How legitimate, or legal, is it for these built-in applications to access my data when I have never accepted the permissions?
