
Popular Android Apps Full of Bugs: Researchers Blame Recycling of Code 150

New submitter Brett W (3715683) writes: The security researchers who first reported the 'Heartbleed' vulnerability in OpenSSL have spent the last few months auditing the top 50 downloaded Android apps for vulnerabilities and have found issues with at least half of them. Many send user data to ad networks without consent, potentially without the publisher or even the app developer being aware of it. Quite a few also send private data across the network in plain text. The full study is due out later this week.
This discussion has been archived. No new comments can be posted.

  • Laziness (Score:4, Insightful)

    by Anonymous Coward on Monday July 28, 2014 @12:03AM (#47547425)
    Code recycling is one thing, but not understanding what that code does when you put it into a production app, or not following best practices, is another. As Android gains popularity as a platform to develop for, we're going to lose quality as the new folks jumping onto the bandwagon don't care how their apps work or look beyond the end goal. This mentality is already popping up with Android Wear developers who cram as much information as they can on the screen and claim that design guidelines are "just recommendations."
    • Re:Laziness (Score:5, Insightful)

      by AuMatar ( 183847 ) on Monday July 28, 2014 @12:24AM (#47547493)

      Design guidelines are just recommendations. Frequently bad ones. A developer should design the best UI he can, not follow what Google says regardless of whether it fits. And most developer guidelines, Google and Apple both, are crap.

      The problem is that the whole app movement has brought in a whole slew of crappy developers whose idea of coding is to search Stack Overflow or git for stuff to copy-paste. They don't read it, don't understand how to use it right, and expect it to magically work. Worse, half of the people writing that code fall into the same category, so it's the blind reading the blind. If you pick a library off of GitHub and assume it will work, you deserve what you get. Unfortunately, your users don't.

      These people have been around for a while (they used to be "web developers" and program by copy-pasting big chunks of JavaScript). The problem is that on a phone they can do more damage. In a world where the number of quality programmers is fixed and far less than the demand for programmers, how do you fix it? Making it easier to program actually hurts; you end up with those crappy coders trying to do even more. Maybe it's time to raise the barriers to entry for a while.

      • Re:Laziness (Score:4, Informative)

        by Lennie ( 16154 ) on Monday July 28, 2014 @07:23AM (#47548597)

        Crappy developers usually means: uneducated developers.

        They can get simple things done without understanding the whole system. They deliver something that sort of works. This makes them cheap labor.

        Why do we need cheap labor? Because of competition and a race to the bottom driven by consumer buying decisions.

        In a talk, Gabe Newell from Valve said that making a game free got them 10x more users and 3x more profit (they get some money, for example, from people selling items inside the game). Not that Valve uses cheap labor; they actually do the exact opposite. But it illustrates how important price is.

        So free, like the above, is a profitable model; free and ad-supported might actually not be as profitable. I don't know how much money companies get for selling personal information; I assume it is more than they get from the ads.

        So how do you solve that?

        I see a few possible ways:
        - education
        - create good open source libraries that prevent most of the bad things and that cheap developers will want to use.

        Now comes the kicker:

        Do you think HTML5 apps without any permissions by default on phones would be a better model? :-)
        That would be a model similar to JavaScript code running in the browser on the desktop, where the user is asked to allow access to the camera when needed.

        Actually, I do, but then again I actually do use a FirefoxOS phone to see what it is like.

        A lot of the time the hardware is a bit underpowered so it can be sold in countries that still have a large number of feature phones, or to people not willing or able to pay for more expensive hardware.

        But still pretty impressive what they can get out of that cheaper hardware.

        • > They can get simple things done without understanding the whole system. They deliver something that sort of works. This makes them Java developers.

          Fixed That for You.

          [ Note grammatically correct but confusing capitalization, another of my pet Java peeves. ]

        • by AuMatar ( 183847 )

          I think that HTML5 would make it far worse. Where do most of these bad programmers start? Where the barriers to entry are lowest: JavaScript. You'd be making the problem worse, not better.

          I do think that there's much improvement to be made with permissions on mobile phones. But that's a separate problem, and one a lot of the Android custom ROMs do well.

          • by Lennie ( 16154 )

            You misunderstood.

            What I meant is: if we are going to have these developers no matter what, then giving them a sandbox where they can't do any harm would be an improvement, right?

    • Re:Laziness (Score:5, Informative)

      by dgatwood ( 11270 ) on Monday July 28, 2014 @01:43AM (#47547725) Homepage Journal

      Code recycling is one thing, but not understanding what that code does when you put it into a production app, or not following best practices, is another. As Android gains popularity as a platform to develop for, we're going to lose quality as the new folks jumping onto the bandwagon don't care how their apps work or look beyond the end goal. This mentality is already popping up with Android Wear developers who cram as much information as they can on the screen and claim that design guidelines are "just recommendations."

      The exact same thing happens on every other platform, though perhaps to varying degrees. I refer to it as the Stack Overflow effect. One developer who doesn't know the right way to do something posts a question. Then, a developer who also doesn't know the right way to do it posts how he or she did it. Then ten thousand developers who don't know the right way to do it copy the code without understanding what it does or why it's the wrong way to do it. By the time somebody notices it, signs up for the site, builds up enough reputation points to point out the serious flaw in the code, and actually gets a correction, those developers have moved on, and the bad code is in shipping apps. Those developers, of course, think that they've found the answer, so there's no reason for them to ever revisit the page in question, thus ensuring that the flaw never gets fixed.

      Case in point, there's a scary big number of posts from people telling developers how to turn off SSL chain validation so that they can use self-signed certs, and a scary small number of posts reminding developers that they'd better not even think about shipping it without removing that code, and bordering on zero posts explaining how to replace the SSL chain validation with a proper check so that their app will actually be moderately secure with that self-signed cert even if it does ship. The result is that those ten thousand developers end up (statistically) finding the wrong way far more often than the right way.
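
      For reference, the "proper check" looks roughly like this: rather than installing a trust-everything TrustManager, load the known self-signed certificate into its own trust store and let normal chain validation run against that anchor. A minimal Kotlin sketch using the standard javax.net.ssl APIs (the function names, and handing in the bundled cert as an InputStream, are illustrative choices, not taken from the study):

      import java.io.InputStream
      import java.net.URL
      import java.security.KeyStore
      import java.security.cert.CertificateFactory
      import javax.net.ssl.HttpsURLConnection
      import javax.net.ssl.SSLContext
      import javax.net.ssl.SSLSocketFactory
      import javax.net.ssl.TrustManagerFactory

      // Build a socket factory that trusts ONLY the certificate shipped with the app.
      // Chain validation still runs; it just runs against our own trust anchor
      // instead of the system CAs, so no trust-everything TrustManager is needed.
      fun pinnedSocketFactory(pinnedCert: InputStream): SSLSocketFactory {
          val cert = CertificateFactory.getInstance("X.509").generateCertificate(pinnedCert)

          val trustStore = KeyStore.getInstance(KeyStore.getDefaultType()).apply {
              load(null, null)                    // empty in-memory keystore...
              setCertificateEntry("pinned", cert) // ...containing exactly one trusted cert
          }

          val tmf = TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm())
          tmf.init(trustStore)

          return SSLContext.getInstance("TLS").apply {
              init(null, tmf.trustManagers, null)
          }.socketFactory
      }

      // Usage: the connection fails unless the server presents the pinned certificate.
      fun openPinned(url: String, pinnedCert: InputStream): HttpsURLConnection =
          (URL(url).openConnection() as HttpsURLConnection).apply {
              sslSocketFactory = pinnedSocketFactory(pinnedCert)
          }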

      Of course, it's not entirely fair to blame this problem solely on sites like Stack Overflow for limiting people's ability to comment on other people's answers unless they have a certain amount of reputation (a policy that is, IMO, dangerous as h***), and for treating everybody's upvotes and downvotes equally regardless of the reputation of the voter. A fair amount of blame has to be placed on the companies that create the technology itself. As I told one of my former coworkers, "The advantage of making it easier to write software is that more people write software. The disadvantage of making it easier to write software is that... more people write software." Ease of programming is a two-edged sword, and particularly when you're forced to run other people's software without any sort of actual code review, you'd like it to have been as hard as possible for the developer to write that software, to ensure that only people with a certain level of competence will even make the attempt—sort of a "You must be this tall to ride the ride" bar.

      To put it another way, complying with or not complying with design guidelines are the least of app developers' problems. I'd be happy if all the developers just learned not to point the gun at other people's feet and pull the trigger without at least making sure it's not loaded, but for some reason, everybody seems to be hell-bent on removing the safeties that would confuse them in their attempts to do so. Some degree of opaqueness and some lack of documentation have historically been safety checks against complete idiots writing software. Yes, I'm wearing my UNIX curmudgeon hat when I say that, but you have to admit that the easier programming has become, the lower the average quality of code has seemed to be. I know correlation is not causation, but the only plausible alternative is that everyone is trying to make programming easier because the average developer is getting dumber and can't handle the hard stuff, which while p

      • Although you certainly have a point, the core problem is often that the documentation is poor. I find that if there is a proper writeup of the solution somewhere on the net, Stack Overflow will mention it (eventually). If there is no proper writeup, sometimes someone bright posts a solution that is right, and sometimes people stumble upon a voodoo solution that nobody understands properly, but sort-of works.

        The Android APIs are susceptible to this problem, because they are often poorly documented, have glar

        • by dkf ( 304284 )

          Amazingly, security libraries are often in this category. Is there a really good writeup ANYWHERE about SSL, certificates and signing practices? And IPSec with all its intricacies?

          Funnily enough, on Stack Overflow! Not all of the security-related questions are overflowing with shitty misinformation. (SO might not be great, but it's better than the squillion shitty places for question answering that preceded it.)

        • by mpe ( 36238 )
          Although you certainly have a point, the core problem is often that the documentation is poor.

          A not uncommon problem is "solutions" which omit steps, or which assume that everyone knows how to find what is, in practice, an obscure option. Sometimes they also have "boilerplate" which over-explains another part of the process.

          Amazingly, security libraries are often in this category. Is there a really good writeup ANYWHERE about SSL, certificates and signing practices?

          That would also have to include TLS, STARTTL
      • The problem is worse on Android than on many other platforms because there are very few native shared libraries exposed to developers and there is no sensible mechanism for updating them all. If there's a vulnerability in a library that a load of developers use, then you need 100% of those developers to update the library and ship new versions of their apps to be secure. For most other systems, core libraries are part of a system update and so can be fixed centrally.
        • by mpe ( 36238 )
          The problem is worse on Android than on many other platforms because there are very few native shared libraries exposed to developers and there is no sensible mechanism for updating them all. If there's a vulnerability in a library that a load of developers use, then you need 100% of those developers to update the library and ship new versions of their apps to be secure. For most other systems, core libraries are part of a system update and so can be fixed centrally.

          It used to be very common with MS Window
      • You are going to hate what the Neovim folks are trying to do to VIM's learning curve:
        https://github.com/neovim/neov... [github.com]

        I fear the day when Eternal September comes to VIM.

      • by mpe ( 36238 )
        Case in point, there's a scary big number of posts from people telling developers how to turn off SSL chain validation so that they can use self-signed certs, and a scary small number of posts reminding developers that they'd better not even think about shipping it without removing that code, and bordering on zero posts explaining how to replace the SSL chain validation with a proper check so that their app will actually be moderately secure with that self-signed cert even if it does ship. The result is tha
        • by dgatwood ( 11270 )

          A self-signed certificate is never more secure than a CA-signed cert. Period. The only benefit to self-signed certs is cost. Any other perceived benefits are merely side effects caused by forcing you to do extra security checks to make up for the lack of a CA—checks that you could do anyway, but probably won't.

          For example, if you're paranoid about a CA issuing a cert for your organization to someone else, then you might add code in your app to do your own set of checks to decide whether a cert is v

    • Probably mostly speed. Understanding every tool you use means you must invest time to understand it. In the swift and agile world of app development, security is the first victim. Taking time to understand what you are doing seems to be outdated.
      The only thing the users can do is not install apps that request rights they have no need for. Sadly, most users do not care.

      • by Lennie ( 16154 )

        It's the price that is driving this: when an app is free or costs just a dollar, there is little reason to spend a lot of time on it.

        • by narcc ( 412956 )

          How about "pride in your work"? Remember that old maxim "anything worth doing is worth doing well"?

          I simply can't believe that money is the only thing that motivates people.

          • by Lennie ( 16154 )

            People that aren't very good developers are proud of their work too. They are proud they made something that works.

    • Code recycling is one thing, but not understanding what that code does when you put it into a production app or not following best practices is another.

      No developer completely understands everything that happens on a system; that's impossible. You do your best and you verify as well as you can that it's acting as you expect. Because where do you stop? You can't verify every library that you use; otherwise, why bother using them at all, you might as well write your own. You can't verify the system itself because it's far too big.

      Not that I'm saying things couldn't be written better, but programming is not a "correct / incorrect" binary choice, any nontrivial

      • by dgatwood ( 11270 )

        You don't have to understand everything, but you do need to at least understand the basics, like how networking works, how crypto works, etc. at a conceptual level. I feel like too many developers learn how to program by learning JavaScript and other scripting languages on their own, then jump into app programming thinking that it's only one step harder because you can sort of do it in Python/Ruby/other Obj-C bridged languages/other .NET languages, or because Swift looks like JavaScript, or whatever their

  • by Tony Isaac ( 1301187 ) on Monday July 28, 2014 @12:12AM (#47547447) Homepage

    It doesn't matter if it is Windows, Mac, iOS, Android, or Linux: all software is full of bugs.

    For that matter, all of everything constructed by human beings...is full of defects, or potential defects, or security vulnerabilities. Your house, for example. You have a lock on your front door, but it takes a thief just a few seconds to kick the door in. Or your car...a thief can break into it in seconds, even if you have electronic theft protection. I'd call those "security vulnerabilities."

    It's the nature of all human creations, software or hardware, electronic or mechanical.

    So what do we do? We improve security until it becomes "just secure enough" that we can live with the risks, and move on.

    • Or add features to a language that help the programmer prove that certain defects are not present. Bounds-checked arrays are a big one compared to plain C, but others exist. Rust, for example, has separate types for "pointer that can never be null" and "pointer allowed to be null", and it is a compile-time error to pass the latter to a function expecting the former outside of a construction that means essentially "if null then do X else do Y".
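
      Rust isn't the only place to see what that buys you; Kotlin, which Android developers can actually use today, draws the same compile-time line between a nullable type and a non-null one, and forces the "if null then do X else do Y" shape before you can cross it. A toy sketch (Widget and loadWidget are made-up names, not a real API):

      class Widget(val name: String)

      fun render(widget: Widget) {          // parameter that can never be null
          println("rendering ${widget.name}")
      }

      fun loadWidget(): Widget? = null      // stand-in for something that may fail

      fun main() {
          val maybe: Widget? = loadWidget() // value that is allowed to be null

          // render(maybe)                  // compile-time error: Widget? is not Widget

          if (maybe != null) {
              render(maybe)                 // smart cast: inside this branch it is a Widget
          } else {
              println("no widget to render") // the "else do Y" branch
          }
      }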

      Or research methods of containing the damage that a defect ca

      • by Anonymous Coward

        Until the saviour rust browser engine descends from heaven to rule the earth we have to live with the sign of the gecko beast on our forehead. Before servo delivers us from the zeroth day on the last day, a huge fox with fire will be cast into the hdd: and the third part of the hdd shall become blood; And the third part of the tabs which were on the gecko, and had life, died; and many processes will die of the memory, because the memory will have been made bitter.

        From The Book of Mozilla, the word of LOR

    • by Greyfox ( 87712 ) on Monday July 28, 2014 @12:50AM (#47547567) Homepage Journal
      But we don't do that. We never do that. As developers, we hide our heads in the sand until we absolutely can no longer ignore the problem, and then we say "Whoops! My bad!" As consumers we assume that professionally published software should be reasonably free of bugs or exploitable code. And until people start being held accountable by law for their shitty software, the status quo will never change.

      I was demonstrating to a shitty software developer the other day how all his input sanitizing routines were in the javascript front end to his web application and anyone bypassing the javascript could essentially have their way with the back-end database, and he told me "Oh you're making a back-end API call, no one will ever do that!" No one except the guy who's hacking your fucking system, jackass. People like that make me want to sign on as Linus' personal dick-puncher. Whenever someone writes some shitty software that pisses Linus off, I will find that person and I will PUNCH THEM IN THE DICK. Because I swear to god, that's what it's going to take. Congress is going to have to WRITE A LAW allowing me to HUNT PEOPLE DOWN and PUNCH THEM IN THE DICK over the SHITTY SOFTWARE they write. And when that day comes, with God as my witness, I will PITCH A TENT outside MICROSOFT HEADQUARTERS, and that will be the LAST TENT EVER PITCHED at MICROSOFT HEADQUARTERS!
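
      The unglamorous fix being demanded here is simply to repeat the checks on the server and to keep user input out of the SQL itself. A minimal Kotlin sketch using plain JDBC (the table, column, and function names are invented for illustration):

      import java.sql.Connection

      // Server-side handler: validation happens here no matter what the JavaScript
      // front end did, and the query is parameterized so input can't rewrite the SQL.
      fun updateDisplayName(db: Connection, userId: Long, rawName: String) {
          val name = rawName.trim()
          require(name.length in 1..64) { "display name must be 1-64 characters" }
          require(name.none { it.isISOControl() }) { "control characters are not allowed" }

          db.prepareStatement("UPDATE users SET display_name = ? WHERE id = ?").use { stmt ->
              stmt.setString(1, name)   // bound as data, never concatenated into the statement
              stmt.setLong(2, userId)
              stmt.executeUpdate()
          }
      }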

      • by Lennie ( 16154 )

        1. Why are you excluding women? Isn't that discrimination?

        2. Some people just don't know this yet; they don't have a hacker mentality (which is what is needed to understand whole systems and how things can be used in ways they were never intended). A hacker mentality is not taught at educational institutions, so they still need to learn it. It usually isn't malice or laziness, it is not understanding what you are doing. All they have learned is how to get the task completed.

      • I was demonstrating to a shitty software developer the other day how all his input sanitizing routines were in the javascript front end to his web application and anyone bypassing the javascript could essentially have their way with the back-end database, and he told me "Oh you're making a back-end API call, no one will ever do that!" No one except the guy who's hacking your fucking system, jackass.

        That actually happened in one of the online games I used to play. The game company decided to run a promotio

    • For that matter, all of everything constructed by human beings...is full of defects, or potential defects, or security vulnerabilities. Your house, for example. You have a lock on your front door, but it takes a thief just a few seconds to kick the door in. Or your car...a thief can break into it in seconds, even if you have electronic theft protection. I'd call those "security vulnerabilities."

      So what do we do? We improve security until it becomes "just secure enough" that we can live with the risks, and move on.

      Who cares about the security of an untrusted and untrustworthy app in the first place?

      What difference does it make if it was written by the most competent team of programmers in the world, if, while operating as designed, it still treats the end user with contempt?

    • by gringer ( 252588 )

      For that matter, all of everything constructed by human beings

      You might not be terribly surprised to know that our genes (and the genomes of pretty much everything) are also full of bugs. We have a whole raft of deleterious genetic variants in our DNA that are just waiting for the perfect time to activate and say "hey, you know that life thing? I can make it worse." On top of that, we have a few viral genomes in our DNA (possibly some that are still active), and rely on bacteria and mitochondria to provide

    • by jmv ( 93421 ) on Monday July 28, 2014 @02:28AM (#47547835) Homepage

      Software on Internet-connected devices is a bit different from your examples though. No matter how insecure cars are, it would be really hard for me to steal a million cars in one night, let alone without being caught. Yet, it's common to see millions of computers/phones being hacked in a very short period of time. And the risk to the person responsible is much lower.

    • That's trivially true. It's like saying there are only two numbers, "zero" and "many". It simply isn't true that all languages and all platforms are full of bugs in any meaningful sense. Some platforms are more buggy than others. This is a function of how old the platform is, how serious the creators are about preventing bugs, and so on. That's meaningful.

      For example, the well-known OpenBSD aims to be much more secure than other OSes. The equally well-known Windows family treats security only as an afterthought.

    • So what do we do? We improve security until it becomes "just secure enough" that we can live with the risks, and move on.

      Whose perspective are you talking about?
      The risk of the user being compromised? Or the risk of the programmer being held accountable?

      For the most part we're not talking about fixing all bugs. For the most part the argument isn't even about being "secure enough".

      No. For the most part some of the bugs are outright inexcusable.

  • We can just edit the source and compile new versions that work properly?

    Seriously, a system is only as good as its process. And the open-source process is not necessarily any better; it can be, but it need not be.

    • by tepples ( 727027 )
      No major video game publisher is going to let unaffiliated individuals see the game's source code.
      • Correction: I meant within the first few years. A few PC game developers such as Id release source code a couple engine generations back.
  • Code Academies (Score:5, Insightful)

    by Fnord666 ( 889225 ) on Monday July 28, 2014 @12:17AM (#47547467) Journal
    This is the sort of thing that you can expect when you put developers through a whirlwind coding course. They learn to use library after library without understanding the ramifications of their use. Need an ad network? Slap in a library. Need geolocation? Slap in a library. What you end up with are flashlight applications that want permission to read the low level system log. Then again, that's coding in the instant gratification world that we live and develop in today.
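
    One cheap sanity check, for a developer or a curious user with a scratch app, is to dump what a package actually requests; that is exactly how the flashlight-that-reads-your-logs pattern shows up. A small Kotlin sketch using the standard PackageManager API (the package name is made up):

    import android.content.pm.PackageManager
    import android.util.Log

    // List every permission a package declares in its manifest, e.g. to spot a
    // "flashlight" app asking for READ_LOGS. Throws NameNotFoundException if the
    // package isn't installed.
    fun dumpRequestedPermissions(pm: PackageManager, packageName: String = "com.example.flashlight") {
        val info = pm.getPackageInfo(packageName, PackageManager.GET_PERMISSIONS)
        val requested = info.requestedPermissions ?: emptyArray()
        for (permission in requested) {
            Log.i("PermAudit", "$packageName requests $permission")
        }
    }
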
    • I wonder what the opposite would look like.

      Just imagine a world where you had no libraries and had to manually code everything. What would that world look like? No developers? No consistency for end users? Do you think security would be better when developers are forced to write more code?

      Somehow I don't think the libraries are necessarily the problem.

      • by swb ( 14022 )

        I'd guess it would look like the Apple ][ or the very early days of the IBM PC and there would be just less functionality.

      • you'd have a vast library of libraries. Something like CPAN or something you'd get in the C world. Libraries written to perform some task and nothing more. Then documented with care and the API published.

        Anyone who wants to do something takes the library that appeals to them, adds it to their program, and builds up a program from these bits.

        Now the problem today is that a) some only use libs that come with the OS or language framework, b) the libraries that are out there are shit, written quickly and for

  • by Animats ( 122034 ) on Monday July 28, 2014 @12:17AM (#47547469) Homepage

    Let's see this list of spyware. Will Google kick them out of the Android store? Will the FBI prosecute the developers for "exceeding authorized access" under the Computer Fraud and Abuse Act? If not, why not?

    • by tlhIngan ( 30335 )

      Let's see this list of spyware. Will Google kick them out of the Android store? Will the FBI prosecute the developers for "exceeding authorized access" under the Computer Fraud and Abuse Act? If not, why not?

      Easy, the summary says they analyzed the top 50 downloaded apps. So your list of spyware will be those.

      As for Google, well, Google owns the online advertising market, so those apps really are helping Google in the end...

      • by GuB-42 ( 2483988 )

        As for Google, well, Google owns the online advertising market, so those apps really are helping Google in the end...

        Only if they use Google's spyware.

  • by Anonymous Coward

    When the user has zero control over what is running on their device and what is communicated by their device, things like this will happen over and over again.

    • by Desler ( 1608317 )

      These are apps people have to choose to install and run. How do they have zero control when they chose to install them?

  • Ignorance is bliss (Score:3, Insightful)

    by WaffleMonster ( 969671 ) on Monday July 28, 2014 @01:21AM (#47547665)

    TFA is being much nicer than Google and many app vendors deserve.

    The whole ecosystem is engineered to reward bad behavior, with the complete lack of usable access controls speaking for itself.

    They need only do the minimum required to keep all hell from breaking loose and too many people bailing on the platform as a result.

  • by janoc ( 699997 ) on Monday July 28, 2014 @04:26AM (#47548135)

    The entire article is harping on 3rd-party ad network libraries stealing personal data and phoning tracking info home. Because these are libraries, and developers are re-using open source libraries, it supposedly follows that "open source is no free lunch" and is stealing your data. What a majestic leap in logic!

    They conflate open source libraries with various ad-network code stealing personal data, basically trying to portray open source code as being responsible for it. Never mind that the ad-network code is almost never open source.

    Granted, OSS is certainly not bug-free, but the spyware has little to do with it.

    What a load of ...

  • Yeah. As long as the customers don't care about security, only about a shiny interface, and are not willing to pay, focusing on the interface and not on the security of the app seems like a reasonable economic decision to me.

  • by Tanuki64 ( 989726 ) on Monday July 28, 2014 @05:26AM (#47548257)

    Tomorrow: Researchers Blame "Not invented here" mentality.

    Instead of using tested standard libs, developers constantly reinvent the wheel.

  • The choice seems to be between the flexibility of Android vs. the (arguably?) better security on iOS.

    I'd like to be able to install Android apps without having to accept all of the permissions they require, but without rooting my phone that's impossible. As a result, there are many apps I just won't install (it took me ages to find a torch app that didn't need anything beyond access to the camera, for example).

    On the other hand, I love widgets - quick access to information and actions from the desktop is re

  • Who's paying these researchers at Codenomicon to research Android vulnerabilities? Howard A. Schmidt, Chairman of the Board [codenomicon.com] at Codenomicon.

    "Some people might have been providing a vulnerability on purpose in order to do something nasty ... Who are they working with? Do they have sideline jobs somewhere else? The developers might be getting their dollars from ad networks"

    Is this what Slashdot has been reduced to, regurgitating anti-open-source FUD on behalf of what is most probably a false front for the
    • Utter crap. Codenomicon are very friendly to FLOSS and FLOSS developers. They're also great guys. They have been providing free test services to the Samba project for many years now, and have helped us fix many many bugs.

      In case you hadn't noticed, the code they're reporting on here is closed source proprietary code...

  • Why on earth would you recycle code? That is rookie programming error 101. Every program you write needs to use a fresh and clean set of functions and structures, because how else can you get everything to fit together perfectly?
  • The article mentions sandbox tools that allow admins to test applications and see what the code and the libraries are really doing, but doesn't name any of them. Any /.ers know if there are FOSS or BSD tools of this sort? Or even cheap proprietary ones? I read the code for any library I use, and I try to add some assert()-like statements where the lib dev might have felt them unneeded, to be certain that nothing gets past memory boundaries. But everyone misses something now and then, and just look at the IOC
