Malicious App In Android Market

dumbnose writes to let us know that a fraudulent app that attempts to steal bank information has made it to the Android app store. From the alert: "NOTICE: Users of mobile devices with Android software may have noticed several applications available for download in the Android Marketplace. If you see any applications provided by the user Droid09, please do not download these applications. Android applications provided by Droid09 are fraudulent. Please remove any applications by Droid09 from your mobile device and contact your mobile provider to evaluate whether any other applications or information stored on your mobile device have been compromised." Multiple marketplaces are possible in the open Android ecosystem. Might we see the emergence of a marketplace distinguished by an iPhone-like app vetting process?
This discussion has been archived. No new comments can be posted.

  • by LostCluster ( 625375 ) * on Sunday January 10, 2010 @06:41PM (#30717830)

    This is something that is far less likely to happen on the iPhone because of Apple's strict control and testing of all apps. Even the "jailbreak" stores will reject things that aren't as advertised.

    Allow open development, and you've basically got a platform that the bad guys can target. There are already standards for signing code to prove that an app came from who you thought it did.

  • by RobertM1968 ( 951074 ) on Sunday January 10, 2010 @06:45PM (#30717866) Homepage Journal

    Wow, second post and already we've got the "iPhone vs Android" debate started! Kudos!

    That aside, or the apps Apple has had to remove aside... I'm happy with 99% of the quality control on the Android Apps.

  • by bcmm ( 768152 ) on Sunday January 10, 2010 @06:46PM (#30717874)
    An iPhone-like vetting process would be "we'll reject it if we don't like the look of it". How about "Linux-distro style vetting process"?
  • Re:No sandboxing? (Score:5, Insightful)

    by LostCluster ( 625375 ) * on Sunday January 10, 2010 @06:51PM (#30717908)

    Sandboxing is an "always deny" tech that keeps legit applications from working easily. Effective, yes. Going to catch on with the average user, no.

  • by dumbnose ( 190140 ) on Sunday January 10, 2010 @06:54PM (#30717942)

    Sounds like a really easy way for your standard user to administer their phone. My mom would totally get wait....I think I meant the opposite of that. Yeah.

    Seriously, though, how do you communicate this to your standard, non-techie user?

  • by broken_chaos ( 1188549 ) on Sunday January 10, 2010 @06:56PM (#30717956)

    How about "Linux-distro style vetting process"?

    Impossible, unless all apps are required to be open source (which would not be popular with many commercial developers). I'd bet a large number of commercial developers would be annoyed enough to stop developing for Android's app store if required to turn over their complete source code to Google employees for review -- Apple doesn't even require this for their app store.

  • by LostCluster ( 625375 ) * on Sunday January 10, 2010 @06:57PM (#30717966)

    iPhone's vetting process has an "AT&T doesn't like it, so Apple will deny" clause that the jailbreak stores don't. Apple then claims that jailbroken apps could be trojans that will overload AT&T's network.

    Google seems to be taking a "we'll do what we want and carriers can't stop us" attitude. Good luck with that.

  • by slifox ( 605302 ) * on Sunday January 10, 2010 @07:01PM (#30717998)
    This app is just another vector in the long history of internet phishing attacks

    The problem isn't technical, but rather lack of user training

    The internet is not a safe place. If you want to use it openly, you better not be gullible and hand out your info to anyone who asks.

    One solution would be to set up the phone for your non-techie friend, and whitelist all the apps that they'll need that should have internet access. Yes, this means they'll have limited use of new apps, but if they can't figure out when not to give out their bank details, they aren't sufficiently trained to safely use the internet.
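The default-deny whitelist slifox describes can be sketched in a few lines. This is a conceptual sketch only: the package names are made up, and on a real Android device this decision would be enforced at the kernel level (e.g. iptables rules keyed on each app's UID, as tools like DroidWall do), not in application code.

```python
# Minimal sketch of a default-deny, per-app internet whitelist.
# Package names here are hypothetical examples.

ALLOWED_APPS = {"com.bank.official", "com.example.email"}  # vetted by the techie friend

def may_use_network(package_name: str) -> bool:
    """Default deny: only explicitly whitelisted packages get network access."""
    return package_name in ALLOWED_APPS

# Any new, unvetted app is blocked until someone deliberately adds it.
```

The important property is the default: an app the user installs on a whim gets no network access until the person maintaining the list opts it in.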
  • by QuantumG ( 50515 ) * on Sunday January 10, 2010 @07:07PM (#30718046) Homepage Journal

    No, the iPhone vetting process is unashamedly "that competes with us, denied!"

  • by Anonymous Coward on Sunday January 10, 2010 @07:14PM (#30718130)

    One caveat: Droidwall doesn't work on Android devices which don't have iptables, such as the CLIQ, DEXT, or others. So, if you don't have an HTC phone, don't bother with this app until the handset maker pushes out 2.1, or until your favorite rom cooker bakes the iptables/ipchains functionality in.

  • by Darkness404 ( 1287218 ) on Sunday January 10, 2010 @07:16PM (#30718150)
    However, there is a balance. Look at Ubuntu's repositories: they rarely reject applications outright, and everything in there is more or less malware free. I can see there being a market for trusted repositories in Android as well.
  • Reserved words? (Score:3, Insightful)

    by Darkness404 ( 1287218 ) on Sunday January 10, 2010 @07:21PM (#30718200)
    What if the Android Market reserved a few words for legitimate organizations only? For example, apps would need to be certified to appear in an online-banking section of the store, and certification would be nothing more than Google contacting the company and confirming that this is the app they made. If someone submits an app with "Bank of America" in the description (or something similar), the Android Market puts up a big red heading saying "This app was not developed by Bank of America; do not give out sensitive financial details through it." It isn't restrictive, because development stays open, yet it weeds out phishing apps.
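The reserved-words proposal above could be sketched roughly as follows. Everything here is hypothetical: the brand registry, the developer IDs, and the check itself are illustrations of the idea, not anything the Android Market actually implements.

```python
# Sketch of the proposed "reserved words" check: if an app's description
# mentions a protected brand but its developer is not the certified owner,
# the store attaches a big red warning. All names below are made up.

CERTIFIED = {
    "Bank of America": "bofa-official",  # brand -> verified developer ID
}

def warning_for(description, developer):
    """Return a warning string for uncertified use of a protected brand, else None."""
    for brand, owner in CERTIFIED.items():
        if brand.lower() in description.lower() and developer != owner:
            return ("This app was not developed by %s; "
                    "do not give out sensitive financial details." % brand)
    return None
```

As LostCluster notes below, substring matching is the hard part in practice (blocking "bank" would also block "iBank"), which is why this sketch matches whole brand names rather than single words.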
  • by ducomputergeek ( 595742 ) on Sunday January 10, 2010 @07:23PM (#30718222)

    Tragedy of the Commons comes to mind here. People around here like to bitch about Apple's policies with their app store, but I understood the reasoning behind it from the beginning. The average consumer doesn't know better. A cute app that is malicious can spread to millions of users before someone wises up. And it only takes one or two to make people fearful of the platform.

    It will be fun to see if the carriers take advantage of this and try to get control over the handsets back in their court as opposed to Google's. If it happens a couple more times, I can see a Verizon App Store popping up, and a Verizon UI required on all Android phones that only allows users to use Verizon's store. And I'm sure a lot of the apps will require extra "monthly" fees.

  • by LostCluster ( 625375 ) * on Sunday January 10, 2010 @07:24PM (#30718236)

    Open source is another way to stop malware... not every user looks at the source, but enough curious ones will put out the warning should anything not be as it's marketed.

    Nice feature, but most software houses see the downside.

  • Re:Reserved words? (Score:4, Insightful)

    by LostCluster ( 625375 ) * on Sunday January 10, 2010 @07:29PM (#30718266)

    "Bank of America" is already a reserved word under trademark law. You could say that "bank" is a reserved word, but then you'll accidentally block "iBank" and such. Such problems.

  • by A1rmanCha1rman ( 885378 ) on Sunday January 10, 2010 @07:40PM (#30718344)

    An iPhone-like vetting process would be "we'll reject it if we don't like the look of it". How about "Linux-distro style vetting process"?

    The iPhone vetting process is closer to Slifox's "err on the side of caution" method on his outbound firewall, with the default being set to DROP (deny the app), followed by a specific whitelist (approved apps subject to continuous monitoring for "good behaviour").

    Quite a number of approved apps in the iPhone App Store have been caught doing naughty things, like accessing users' Contacts (email addresses, phone numbers, home/work addresses) and sending them "home" when they had no business requiring such information for their function (battery charge display apps, games, etc.). They have promptly been expelled from the App Store, quite rightly in my opinion.

    The price of true freedom is eternal vigilance, not laissez-faire do-what-you-please laxity...

  • by LostCluster ( 625375 ) * on Sunday January 10, 2010 @07:46PM (#30718392)
    So who do you let into the "partner" program without being called biased against a "too small" programming shop?
  • by mjwx ( 966435 ) on Sunday January 10, 2010 @08:49PM (#30718788)
    This is just the same old phishing attack moved to a new platform. This is no different than directing a web user to a fraudulent banking site.

    The fault here lies primarily with the user, but seeing as we can't force users to be smarter, the onus for defeating this attack falls on the bank. Banks can do a variety of things to prevent such phishing attacks from working, such as using two-factor authentication and one-time passwords. OTP works best when used for transactions rather than logins: my bank will SMS me a code when I want to make a transaction to another account, so even if a phisher has my password, they need my phone to do anything (plus this is a dead giveaway that a phisher has gained my password). Banks could also issue a private key to official applications and block any application that does not have the key (granted, this is less useful and may be easily defeated).

    iPhone-style lockdowns will not work, as they do not address the real problem of phishing and only serve to limit the platform. This isn't a fault in Android: the attack requires the user to initiate it, and it isn't self-replicating.
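The one-time-password scheme mjwx describes can be illustrated with HOTP (RFC 4226), the counter-based algorithm that SMS codes and hardware tokens are commonly built on. This is a generic sketch of the standard algorithm, not any particular bank's implementation.

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 counter-based one-time password."""
    # HMAC-SHA1 over the 8-byte big-endian counter
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation: low nibble of the last byte picks a 4-byte window
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 test vector: secret "12345678901234567890", counter 0 -> "755224"
```

Because each code is valid for one counter value only, a phished code is useless moments later, which is exactly why tying OTPs to individual transactions (rather than logins) blunts this kind of attack.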
  • by JSBiff ( 87824 ) on Sunday January 10, 2010 @08:52PM (#30718806) Journal

    Why on Earth would you download a 'bank' app from anyone other than *YOUR BANK*? I'm only gonna do online banking from the website or apps provided to me directly from my bank. I'm not gonna download anything from the Android market, from some random user, and do banking with it. Who thinks that it's a good idea to do 'banking' with an app by a random developer? I mean, *maybe*, maybe if it was someone large and established, like IBM, Google, Microsoft, or Apple, I *might* consider using third party software, but certainly not anyone I've never heard of before.

  • by farble1670 ( 803356 ) on Sunday January 10, 2010 @08:56PM (#30718822)

    iPhone has YouTube and Pandora, among many other apps with very high network usage. Sort of shoots a hole in the theory that AT&T is rejecting based on potential network overload.

  • by LostCluster ( 625375 ) * on Sunday January 10, 2010 @09:10PM (#30718898)

    That prevents the problem of somebody bringing in a mobile device and claiming to be you... but doesn't stop you from giving your main password to a false app that asks for it.

  • by Anonymous Coward on Sunday January 10, 2010 @09:25PM (#30718980)

    "People around here like to bitch about Apple's policies with their app store, but I understood the reasoning behind it from the beginning. The average consumer doesn't know better."

    I don't understand the reasoning behind it.

    People seem to assume that a mobile phone app needs to be more controlled than a desktop application. What makes "mobile" so different from the desktop? I would suggest that I am actually much more likely to have sensitive things (banking, personal, or business information) on my desktop than on a mobile device. Yet no one is advocating that someone set up an app store for the desktop.

  • by FrankieBaby1986 ( 1035596 ) on Sunday January 10, 2010 @09:54PM (#30719122)

    Seriously, though, how do you communicate this to your standard, non-techie user?

    You don't. This is NOT A PHONE. This is a little computer with a phone IN IT. The same level of knowledge required to use a computer and install apps safely, etc is necessary here.

  • by Anonymous Coward on Sunday January 10, 2010 @10:12PM (#30719190)

    That's not how the world works. The "validation" Apple does probably serves Apple's benefit; it is not done for the safety of the users or any other romantic notion, except maybe with the addition of some security theater.

  • by __aasqbs9791 ( 1402899 ) on Sunday January 10, 2010 @11:15PM (#30719488)

    You make a good point, but that doesn't really address the OP's point. Most people who use computers are not techie users. They fall for scams all the time.

  • by LostCluster ( 625375 ) * on Sunday January 10, 2010 @11:22PM (#30719528)

    How do you know the binary you install is the same as the source?

    MD5 hash for the win! If your hash doesn't match the published hash, something's up.

    Unless you propose that all software be compiled and signed by a trusted authority or be compiled on the end user's device...
    Already happening on several platforms. MS Office VBA, MacOS, etc. Unsigned code is allowed, but requires a user's approval to a warning that the publisher is unknown.

    And if someone introduces the ability to download and execute arbitrary code, perhaps via a clever and well-hidden exploit?

    Would require an app that asks for rights to contact the network, and network traffic can be monitored. Somebody will notice.
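The hash check LostCluster describes amounts to something like the sketch below. It follows the comment in using MD5, though today SHA-256 would be the safer choice since MD5 is no longer collision-resistant; the function name is illustrative, not from any real tool.

```python
import hashlib
import hmac

def matches_published_hash(data: bytes, published_hex: str,
                           algo: str = "md5") -> bool:
    """Compare a download's digest against the hash the developer published.

    MD5 matches the comment above; pass algo="sha256" for the modern choice.
    """
    digest = hashlib.new(algo, data).hexdigest()
    # Constant-time comparison avoids leaking where the strings differ.
    return hmac.compare_digest(digest, published_hex.lower())
```

As selven notes further down, the catch is not computing the hash but getting users (or their package manager) to actually check it against a trustworthy published value.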

  • by furball ( 2853 ) on Sunday January 10, 2010 @11:31PM (#30719570) Journal

    This explains the explosive spread of viruses on the Apple platform!

  • by dotgain ( 630123 ) on Sunday January 10, 2010 @11:59PM (#30719726) Homepage Journal
    Um, which people will notice?
  • by Anonymous Coward on Monday January 11, 2010 @12:49AM (#30719962)

    >Do the Underhanded C Contest and Obfuscated C Contest ring any bells?

    If you were trying to make a point, you failed miserably. Those are about writing malicious code, not searching for it.

    Use your brain, dipshit. The point of the Underhanded C contest is to write code that, when read, looks perfectly normal but contains underhanded code. Someone searching for bad code will have a difficult time spotting it because the whole point is to hide the malicious parts from someone who does a code review.

  • by SuperKendall ( 25149 ) on Monday January 11, 2010 @01:38AM (#30720158)

    This is not the case. Apple don't perform in-depth testing in this manner; they don't have access to the source code and some developers have already successfully bypassed the rules of the App Store by hiding functionality as easter eggs. It is trivial to put malicious code in an iPhone app that won't be triggered until after the application is already in the App Store.

    Hey, what was that old saw about Macs not having any viruses? Wasn't it something like, the platform is not popular and that's why they do not have viruses?

    Well here we have a wildly popular mobile platform. Yet the most egregious exploit in an app to date is something that sent your address book somewhere without permission (something that's explicitly allowed by the API).

    So given the number of apps there are, perhaps the lack of problems like this is an indicator it is not as "trivial" as you claim to put a malicious app in the store.

    What would a malicious app really do anyway? It couldn't delete user data. It can't send passwords not entered in the app (passwords are not stored in the keystroke cache). And what makes you think Apple would not give extra scrutiny to an application that asked for something like your banking details? What makes you think they don't roll the date forward a month or two when testing apps just to see what kind of extra activity might be triggered?

    Furthermore, because you have to go through some paperwork to be a registered developer in the first place, you have a lot more exposure to liability if you try something. Apple then has valid bank account details for you (if you registered to sell paid apps), along with your address and other things. So if something like this exploit were found, you'd be pretty screwed.

    There are more aspects of protection in a closed system than just the review cycle...

  • Ask (Score:3, Insightful)

    by SuperKendall ( 25149 ) on Monday January 11, 2010 @01:50AM (#30720232)

    Why on Earth would you download a 'bank' app from anyone other than *YOUR BANK*?

    Actually there's a very good reason (for the user) - banks cannot write user interfaces to save their lives.

    In fact they are so horrible at it that Mint flourished with tens (hundreds?) of thousands of users, despite you needing to give Mint the passwords to EVERY SINGLE BANK you do business with.

    Would you or I ever, ever do that? Nope. No reasonable person would, you would think. Yet many have (and continue to), just because the experience of using bank websites and mobile apps was so horrific. Honestly, I cannot blame them; in fact I envy them the peaceful bliss of ignorance and nice software.

    The whole point of using mobile applications is to make your life simpler, something that lots of developers are good at but not banks. So it's no shock someone would be willing to try an app not written by the bank they use.

  • by mjwx ( 966435 ) on Monday January 11, 2010 @04:07AM (#30720772)

    And then what do you do about the fact that you have given Apple an address they have verified

    Quite easy to give and verify a fake address, especially if it's in a foreign country.

    and paid for a $99 developer account via some means they can trace back to you

    Once again, easy to do with a foreign bank.

    There are plenty of documents used to prove an address that can be easily faked: bank statements, utility bills. Plus there is the option of using someone else's identity entirely.

    Let me put it this way, anyone smart enough to develop a scheme like this is smart enough to defeat Apple's rudimentary address/credit checks.

    That's a lot of exposure for a scam that's likely to be shut down in under a day.

    You seem to have a lot of faith in Apple's ability to detect a hidden scam once it has already penetrated their security (the App Store). It's entirely plausible that this kind of phishing could go on for weeks or months without anyone noticing, especially seeing as Apple is the only watchman, and considering what the average iPhone user understands about information security.

  • by QuantumG ( 50515 ) * on Monday January 11, 2010 @06:25AM (#30721226) Homepage Journal

    Uhhh, no. You said "actually, I just went to the apple website and found this..." and I said "oh yeah, I remember that happened last year."

    You honestly don't remember Apple only last year admitting that getting some anti-virus might be a good idea? You don't remember how much shit they got for it? I can't really say I'm surprised, being that no-one buys anti-virus for Macs, even now.

    Please now, kindly fuck off fanboi.

  • by Ginger Unicorn ( 952287 ) on Monday January 11, 2010 @07:39AM (#30721556)

    Phone providers/Google could set up a "safe mode" in Android that only allows signed apps to run. If users want to leave safe mode to install an unknown app, they can, but they'd be shown a warning about the consequences. That way people who want to be safe can be safe, and people who want to run what they like can run what they like. Kind of like Apple putting a jailbreak button on the iPhone: people can choose between safety and freedom.

    Given time, as more apps get checked and signed, people would have less and less reason to leave safe mode.

    It reminds me of the software repositories on Ubuntu: for about two or three years there was essential stuff missing that forced you to manually install dodgy software that could break your system, but now that it's matured there's often no reason whatsoever for a home user to stray outside the repos.

  • by Svartalf ( 2997 ) on Monday January 11, 2010 @09:09AM (#30721844) Homepage

    That's because it's an easy target, in spite of all its "security measures".

  • That could work quite well, if the testers can't see the source. You could put a timebomb in an app that activates its malicious payload. This would also work better because it could allow the app to become popular and spread before it turns nasty. A datamining app that collects everything into an encrypted file (just very simple encryption in a file with a large initial size would be enough to keep people from "grepping" the contents or getting suspicious...say it's a cache file or something) and sends it off on a specific date and time could do a lot of damage.
  • by selven ( 1556643 ) on Monday January 11, 2010 @09:44AM (#30722234)

    People who use software installation systems that check MD5s by default. Even Windows does something like this, but so many applications don't bother with signatures that "warning: unsigned application" is pretty much meaningless.

  • by 2obvious4u ( 871996 ) on Monday January 11, 2010 @02:22PM (#30725902)
    As a Droid owner, any app you install tells you what services it has access to. I don't have many apps installed, because most of the time I'll load an app and it will have access to something it has no reason to access.

    The freedom of the Droid is nice, but at the same time it requires more responsibility from the owner.
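The install-time review 2obvious4u describes boils down to a simple rule: flag any requested permission that the app's purpose doesn't plausibly need. The sketch below models that judgment call; the permission strings follow Android's `android.permission.*` naming, but the "expected" set is the user's own assessment, not anything the platform computes.

```python
# Sketch of the manual review Android's install screen enables:
# compare what an app requests against what its purpose plausibly needs.

def suspicious_permissions(requested: set, expected: set) -> set:
    """Permissions the app asks for but has no obvious reason to need."""
    return requested - expected

# Hypothetical example: a wallpaper app that wants your contacts.
wallpaper_app = {"android.permission.INTERNET",
                 "android.permission.READ_CONTACTS"}
# A wallpaper app plausibly needs network access to fetch images...
flagged = suspicious_permissions(wallpaper_app,
                                 {"android.permission.INTERNET"})
# ...but READ_CONTACTS is exactly the kind of red flag worth declining over.
```

This is the responsibility trade-off the comment points at: the platform surfaces the information, but the user has to do the subtraction.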
