Programming

Why JavaScript On Mobile Is Slow

An anonymous reader writes "Drew Crawford has a good write-up of the current state of JavaScript in mobile development, and why the lack of explicit memory handling (and a design philosophy that ignores memory issues) leads to massive garbage collection overhead, which prevents HTML5/JS from being deployed for anything besides light-duty mobile web development. Quoting: 'Here’s the point: memory management is hard on mobile. iOS has formed a culture around doing most things manually and trying to make the compiler do some of the easy parts. Android has formed a culture around improving a garbage collector that they try very hard not to use in practice. But either way, everybody spends a lot of time thinking about memory management when they write mobile applications. There’s just no substitute for thinking about memory. Like, a lot. When JavaScript people or Ruby people or Python people hear "garbage collector," they understand it to mean "silver bullet garbage collector." They mean "garbage collector that frees me from thinking about managing memory." But there’s no silver bullet on mobile devices. Everybody thinks about memory on mobile, whether they have a garbage collector or not. The only way to get "silver bullet" memory management is the same way we do it on the desktop–by having 10x more memory than your program really needs.'"
  • Easy (Score:5, Interesting)

    by ArcadeMan ( 2766669 ) on Wednesday July 10, 2013 @05:48PM (#44244221)

    Stop loading dozens of fucking libraries and frameworks and learn to really code.

    • by Jartan ( 219704 )

      Stop loading dozens of fucking libraries and frameworks and learn to really code.

      In other words consider the memory cost of your actions and don't rely on the GC?

    • by Anonymous Coward on Wednesday July 10, 2013 @05:59PM (#44244341)

      Since JavaScript is so damn lacking, those libraries are ESSENTIAL for anything beyond the smallest JavaScript app.

      Even if you don't use jQuery, for example, you're going to need to find and then use some other library that does the same thing, or write a whole shitload of code yourself to implement the same functionality. Zepto works as an alternative for some people, but even it still has some overhead.

      That applies to almost anything you want your app to do. If you want to work with objects, arrays or even strings in any way beyond the simplest of manipulations, you're going to need to use some third-party code, or write a whole lot of it yourself.

      JavaScript developers are so wholly dependent on these third-party libraries because the JavaScript implementations themselves are so bloody lacking. It's totally different than a language like Python, where there's a rich, yet still compact and efficient, standard library that developers know will be available on just about every user's system. JavaScript programmers have to provide this basic infrastructure with each and every app they write.
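
      A concrete example of the kind of shim a library absorbs for you (a minimal sketch; addEvent is a made-up name, but the attachEvent branch is the real old-IE path):

          // Cross-browser event binding, pre-IE9. Without a library,
          // every project ends up shipping some version of this by hand.
          function addEvent(el, type, handler) {
              if (el.addEventListener) {              // standards browsers
                  el.addEventListener(type, handler, false);
              } else if (el.attachEvent) {            // IE8 and earlier
                  el.attachEvent('on' + type, function () {
                      handler.call(el, window.event); // normalize `this` and the event
                  });
              }
          }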

      • I don't know that it's the purpose of JS to offer quite so much... I know that, for example, NodeJS offers a lot, but outside the core there's not too much there. Underscore takes care of most of the inconsistent bits for current browsers as far as JS itself goes... and it is itself relatively light. Most of what Zepto/jQuery offer is with regard to the BROWSER-specific bits.

        I know it is semantics, but it really bugs me that JS gets such a bad rep because of the browsers' shortcomings.
      • by ShanghaiBill ( 739463 ) on Wednesday July 10, 2013 @06:39PM (#44244813)

        Even if you don't use jQuery, for example, you're going to need to find and then use some other library that does the same thing

        Furthermore, popular libraries like jQuery, jQuery Mobile, etc. are much more likely to have clean, efficient, memory-lite implementations than some "roll-your-own" code. If you choose your libraries carefully, learn to use them, and avoid rolling your own unless really necessary, your code will certainly be smaller and cleaner, and will usually be faster and use less memory.

      • Re: (Score:3, Interesting)

        by bucktug ( 306690 )

        I really do appreciate a good framework... My favorite one currently is Vanilla-JS: http://vanilla-js.com/

        Check it out. Amazing performance.

      • Re: (Score:2, Insightful)

        by loufoque ( 1400831 )

        Yet C developers have no problem using C, which is a much more minimal language, to do much more than what you do with JavaScript, and they rarely depend on shitloads of libraries.

      • Python..... efficient....

        First time I've heard those two words together in a sentence without "is not" between them.

    • Re:Easy (Score:5, Insightful)

      by Anonymous Coward on Wednesday July 10, 2013 @06:08PM (#44244453)

      We already know how to "really code". We just got sick of reinventing the wheel every time we start a new project. Now we let the libraries do the tedious crap, and we focus our attention on where it's actually needed.

      You're going to use our library-heavy code, and you're going to like it. You already do, in fact. You're lying when you pretend otherwise.

    • Re:Easy (Score:5, Insightful)

      by girlintraining ( 1395911 ) on Wednesday July 10, 2013 @06:09PM (#44244471)

      Stop loading dozens of fucking libraries and frameworks and learn to really code.

      If memory management was so easy, we wouldn't have devoted so much of our programming guides, style manuals, etc., to it. It's not a simple matter of "I wave my hand and the problem goes away." It has existed since before there were "dozens of fucking libraries and frameworks" and at a time when people did know how to "really code"... it has existed since the very. first. computer. And it hasn't been solved to this day.

      The main reason, I suppose, is the same reason why we haven't yet found The One True Concrete that all things can be built out of, or the One True Operating System upon which everything can run, or the One True... you get the damn idea. Men much smarter than you have devoted their entire careers to trying to solve the problem, and it's incredibly pretentious of you to toss off a one liner like it's (puts on sunglasses) just a simple matter of programming.

      • "... and it's incredibly pretentious of you to toss off a one liner like it's (puts on sunglasses) just a simple matter of programming"

        Agree. But the memory management thing really is an issue, too.

        Take Android, for example. Android was designed to allow apps to remain in memory until you manually kill them, or the OS gets around to doing it, if ever. And the OS is notoriously lax at doing so. And yes, it was designed that way on purpose. Google doesn't want people killing apps... it cuts off their data stream and ads. Yes, really. So they built the whole OS that way, according to some people I spoke to who worked on Android.

        Fortunat

        • by EdZ ( 755139 )
          There is a strange obsession among many that the only good RAM is empty RAM. Don't shunt stuff out of memory until you need to, and it'll still be in memory next time you need it. Unless you want to page everything (rather than saving the important parts on sleep and assuming whatever is left in RAM might not be there at wake, but probably will be), there's no reason to turf anything out of RAM until you need that space for something else.
          • by steveha ( 103154 )

            There is a strange obsession among many that the only good RAM is empty RAM. Don't shunt stuff out of memory until you need to, and it'll still be in memory next time you need it.

            Hmmm, not sure I agree with this as a blanket statement. I guess it depends on what you mean by "until you need to".

            I just built my wife a new computer. The old one was only five years old, with a quad-core 64-bit 2.5 GHz CPU, but it had horrible performance issues running Firefox. The problem was that my wife is a "power user"

      • Re:Easy (Score:5, Funny)

        by loufoque ( 1400831 ) on Wednesday July 10, 2013 @07:04PM (#44245051)

        Memory management is easy. Just program in C instead of JavaScript, problem solved.

        • Memory management is easy. Just program in C++ using smart pointers instead of JavaScript, problem solved.

          FTFY.

          • That's for plebs.
            Better use real RAII.

            • The point is, of course, that you can't just forget about memory. And garbage collection has no place on a mobile device.

              • shared_ptr is reference counting, which is pretty much garbage collection.
                Just manage your memory without relying on that, by designing your application around which objects are responsible for the lifetime of other objects (ownership).
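
                The same discipline carries over to a GC'd language. A minimal sketch in JavaScript (Screen, the sprites, and free() are hypothetical stand-ins for anything that holds real resources):

                    // The Screen owns its sprites: it creates them, it frees them.
                    function Screen() {
                        this.sprites = [];
                    }
                    Screen.prototype.add = function (sprite) {
                        this.sprites.push(sprite);   // ownership transfers to Screen
                        return sprite;
                    };
                    Screen.prototype.dispose = function () {
                        for (var i = 0; i < this.sprites.length; i++) {
                            this.sprites[i].free();  // release buffers/textures deterministically
                        }
                        this.sprites.length = 0;     // drop the references so nothing lingers
                    };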

      • What he meant is the waste of memory that frameworks cause. And if you really know what you are doing, it's better to write your own specialized code than to use a generic framework one.
      • This is true, but it's JavaScript we're talking about. Project requirements are rarely scoped and developed from the ground up. Most apps and sites are dependent upon bloated frameworks and libraries that are not tailored for mobile capabilities; said frameworks were developed for the desktop.

        Not to mention very few JS developers know how to properly manage memory.

    • Re:Easy (Score:5, Funny)

      by Anonymous Coward on Wednesday July 10, 2013 @06:10PM (#44244479)

      Stop loading dozens of fucking libraries and frameworks and learn to really code.

      Because *REAL* programmers don't use libraries or frameworks. In fact, *REAL* programmers don't even use wussy text editors like vi or emacs; they use butterflies.

    • by ADRA ( 37398 )

      Thank you so much Ada, you've enlightened an entire generation of developers how wrong we've been our entire careers. Please please teach us the holy grail of never reusing code. We're all listening.

      • Thank you so much Ada, you've enlightened an entire generation of developers how wrong we've been our entire careers. Please please teach us the holy grail of never reusing code. We're all listening.

        We'll get to that, but first we need to talk about these things called deadlines, managers, and paychecks. After I'm done with the Q&A about those three things, anyone who still wants to seek the Grail may sign up on the sheet here on the desk... (blows away some dust)... Now, open your text books to page 25...
        -- Ada

    • Fucking true.
    • by jythie ( 914043 )
      Eh, what do you expect? The bulk of the people moving to mobile development are coming from the worlds of desktop or web applications. Barely any have significant embedded experience, and they just treat the system as a mini-desktop.
  • always (Score:5, Insightful)

    by Spaham ( 634471 ) on Wednesday July 10, 2013 @05:50PM (#44244249)

    You always need to think about memory. Like you need to think about what you're doing.
    Too bad for the "write app get rich" idiots.

    • by egr ( 932620 )

      You always need to think about memory.

        +1, you always have to think about memory. No matter the language, garbage-collected or not, memory leaks will still bring your system to a crawl.

      • by ADRA ( 37398 )

        Most high-level languages these days don't leak unless you leave explicit permanent handles to things lying around. Is that what you're talking about, or the not-quite garbage collectors that are really just poor-substitute reference-counting solutions? I haven't worried about true 'leaks' in code for years. Occasionally (like yearly, maybe) I hit leaked bad references (generally due to greedy singletons). The most interesting part of memory management I, and probably most people, deal with is the trad

        • by egr ( 932620 )
          If the language leaks memory (without any misdeed by the programmer) I consider it to be a bad language. Actually I was exactly talking about dead/bad references, caching and frequent re-allocations.
        • Re: (Score:2, Insightful)

          by Anonymous Coward

          Garbage collected languages have live-leaks that can have exactly the same memory bloat consequences that other memory leaks do. It's where you keep around a reference to an object sub-graph that you aren't actually using. This gets extra bad if you end up building up a linked list of such leaked objects and the linked list grows in size as your application runs. So you do need to think about live leaks every time you store a reference, it's just that the consequences are likely to be less dire if you get i
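
          A minimal sketch of such a live leak (requestLog, onRequest and handle are hypothetical names):

              var requestLog = [];                 // lives as long as the program does
              function handle(req) { /* ... */ }   // stub for the real work
              function onRequest(req) {
                  requestLog.push(req);            // req and its whole sub-graph stay reachable forever
                  handle(req);
              }
              // The collector is doing its job: everything in requestLog IS reachable.
              // The fix is a decision, not a better GC, e.g. cap the log:
              // if (requestLog.length > 1000) { requestLog.shift(); }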

    • by AuMatar ( 183847 )

      Yup. When I worked at Amazon, the #1 question on internal mailing lists was "my Java webservice freezes up and breaks SLA whenever GC kicks in; how do I fix this?" GC is not a silver bullet, and you're going to end up thinking about memory on anything non-trivial.

    • by Kjella ( 173770 )

      You always need to think about memory. Like you need to think about what you're doing. Too bad for the "write app get rich" idiots.

      But there's plenty of "good code, crap idea" that won't make you rich either. Most apps that have gone viral haven't been massively complicated, state-of-the-art games; they've been simple, fun, and easy to get into, while being rather run-of-the-mill technically. Sure, you can't be hopeless, but a lot of people are sufficiently skilled, and l33t coding skillz won't do any good on their own.

  • If you don't release all references to it, it will never be collected (that includes circular references if you're dealing with a reference-counting collector, like older IE browsers)

    • Reference counting is not garbage collection; please understand that.
      E.g. a group of objects referencing each other, none of which is referenced from the stack or a "global" variable, will never be deleted. But a garbage collector would find those objects and free them.
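
      A minimal sketch of the classic case (old IE's DOM nodes were COM reference-counted, so a cycle crossing the JS/DOM boundary never hit zero):

          function leaky() {
              var node = document.createElement('div');
              node.onclick = function () {    // the closure captures `node`...
                  node.style.color = 'red';   // ...and `node` holds the closure
              };
          }   // pure reference counting: both counts stuck at 1, leaked forever;
              // a tracing collector sees the pair is unreachable and frees it
          leaky();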

  • Way back when, one of the routines I depended on, and wrote myself, was something that stored all memory allocations, checked before deallocating, and checked at the end of a routine. It basically wrapped malloc() and free(), or whatever. It would throw an error if I made a mistake. Still, even with such help, which can easily be turned off for production, memory allocation is tough, and it's one of those things that separates a skilled developer from a script kiddie, so to speak.

    But in the real world, i

    • Memory management is simple to any C or C++ developer. If you have difficulties with it, you are a bad developer.
      In C++, exception-safe memory management can be a bit tricky, but since C++ globally makes things simpler if you follow the right idioms, it still ends up being easier.

      I work in high-performance computing, and my focus is on the in-core and shared-memory optimization of numerical code. There is no need to ever go to assembly. Just write the right C code, using attributes or built-ins if necessary

  • by s7uar7 ( 746699 ) on Wednesday July 10, 2013 @06:10PM (#44244483) Homepage

    The only way to get "silver bullet" memory management is the same way we do it on the desktop–by having 10x more memory than your program really needs

    Give it a couple of years and that's exactly what will happen. Problem 'worked around'.

  • You could just 'force' people to use a language with explicit memory management, like by offering [better] support for that particular language (C/C++ is best, but I understand people do not enjoy these lower-level languages as much). I always thought that the best form of garbage collection is not having garbage collection at all, but managing your memory efficiently and having good allocators. Yet even in languages such as Java/JavaScript you can be smart about your objects so as to minimize the underlying
    • Don't generate garbage; this way you won't have garbage to collect.
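
      A minimal sketch of what that looks like in practice: have the caller pass in the destination, so the hot path allocates nothing.

          // vec2 addition that writes into a caller-supplied object
          function add(a, b, out) {
              out.x = a.x + b.x;
              out.y = a.y + b.y;
              return out;
          }
          var scratch = { x: 0, y: 0 };   // allocated once, reused every frame
          // per frame: add(velocity, gravity, scratch);  -> zero garbage per call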

    • Yet even on languages such as Java/Javascript you can be smart about your objects so to minimize the underlying allocations.

      The article explains that game developers do exactly this (when they choose to write in Java at all): they allocate all their objects at the beginning, so the garbage collector has nothing to do. In other words, they're being forced to manage memory in an environment that wasn't designed to support it.
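
      A minimal sketch of that preallocation pattern (the pool size and the bullet shape are invented for illustration):

          function Pool(size, create) {
              this.free = [];
              for (var i = 0; i < size; i++) this.free.push(create());
          }
          Pool.prototype.acquire = function () {
              return this.free.pop() || null;   // null means the pool is exhausted
          };
          Pool.prototype.release = function (obj) {
              this.free.push(obj);              // recycled, never handed to the GC
          };

          var bullets = new Pool(64, function () {
              return { x: 0, y: 0, live: false };
          });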

  • by 0123456 ( 636235 ) on Wednesday July 10, 2013 @06:18PM (#44244569)

    Breaking news. Full story at 11.

    Garbage collection is supposed to stop dumb programmers doing dumb things, but in reality it just gives them different ways to do dumb things.

    • Man, you try using closures with manual memory management. Quite a nuisance. GC is really quite nice.
      • by 0123456 ( 636235 )

        GC is really quite nice.

        So long as you don't mind spending weeks trying to eliminate the pauses and bloated memory usage and creating internal caching schemes to avoid having to allocate more objects to be garbage collected.
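
        A minimal sketch of the kind of thing those weeks get spent auditing (blit() and the atlas are hypothetical):

            function makeRenderer(bigAtlas) {   // bigAtlas: megabytes of texture data
                return function draw(sprite) {
                    blit(bigAtlas, sprite);     // draw() keeps bigAtlas alive
                };
            }
            // Store draw() somewhere long-lived (an event handler, a cache) and
            // bigAtlas can never be collected. The GC is right; you still have
            // to remember to drop the handler when you're done with it.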

  • by slacka ( 713188 ) on Wednesday July 10, 2013 @06:18PM (#44244581)

    This is one of the major flaws behind these Web-based mobile OSes; you'd think that after WebOS, beautiful as it was, Mozilla would have learned their lesson. Instead, they're trying to drive underpowered hardware with HTML/JS. All the web technologies are being shoehorned into areas they were never designed for: from the DOM being used for applications, to the lightweight scripting language, JavaScript, being used for apps, to a bloated HTML renderer as the platform's UI toolkit.

    JavaScript is a nice little scripting language that’s got some nice functional programming features. When you need to write heavy applications that require performance, low memory usage, and multithreading, it’s the wrong choice.

    • Funny, you don't mention Android being doomed in the same breath... considering even the article mentions that it, being a GC platform, has many of the same issues.
  • The problem is the approach to JavaScript development: everybody is trying to use one library for all platforms (mobile, desktop, etc.).

    The first thing is to stop this nonsense.

    Use server-side technologies to sniff out the client. When working with mobile phones, create or bastardize a library which has the smallest footprint possible to fit your needs.
  • by Miamicanes ( 730264 ) on Wednesday July 10, 2013 @06:47PM (#44244877)

    The problem isn't that Android phones have "limited RAM"; the problem is that Android's garbage collection sucks miserably at dealing with short-lived objects -- a problem that was fixed and mostly a non-issue in J2SE by the time 1.4 or 1.5 came out more than a DECADE ago.

    10 years ago, when J2SE 1.5 was out and its garbage-collection problems were already a historical footnote, a laptop with 512MB of RAM, a 32-gig hard drive, and a 700-1000MHz CPU was fairly respectable. A Galaxy S3 has a gig of RAM, 16 or 32 gigs of internal flash, and a 32-gig class 10 microSD card costs $20 on sale.

    • by cnettel ( 836611 )
      A Pentium III-M or early Pentium M was far faster, on a clock by clock basis, than a single ARM core in most modern smartphones. They get ahead a bit by sporting multiple cores, but that won't help you much in Javascript, or in the really tricky parts of GCing in most VMs (not sure about Dalvik).
  • by ducomputergeek ( 595742 ) on Wednesday July 10, 2013 @06:55PM (#44244959)

    Most of the "apps" I'm being hired to write are basically CRUD form apps designed to read info from tables in a database. Usually the job is to take forms already in use on desktops, written in Java or .NET or in some cases God only knows what, and adapt them for use on mobile devices.

    I've frankly found jQuery Mobile + HTML5 + PhoneGap/Cordova makes this task fairly easy to undertake client-side. Actually, in most cases the cost is still in developing and deploying the API side in your choice of server-side scripting language. And often that's based on a Perl script that I wrote circa 2000 to take form input, validate it, and then go fetch data from a database and return it in XML, YAML, or JSON these days. On other projects, the server side is in PHP or C# or Java. It just depends on what the client already has.

    Now I can see that trying to build other types of apps using HTML5/JS is asking for disaster.

    Sorry, I'm an old Perl guy who thinks you should use the right tool for the job, and that there is still more than one way to do it.

  • Short of issues with CSS transforms, and in some cases hardware acceleration in general on Android, it's not all that slow, really.

    I have a Nexus 4, which isn't low-end, but it's no Galaxy S4 or iPhone 5 either as far as web performance goes, and for the vast majority of websites it works just fine, with a few delays for ad-heavy sites, sites making heavy use of CSS transforms and animations (which are slow regardless of what you do with JavaScript... I'm told the situation on iOS is much better), and a fe

  • You certainly could run Ruby on any mobile device if it had a magic garbage collector that solved everybody's problems. Except there's no such thing that's immune to idiot developers who allocate memory or variables and leave references to them hanging around. The same problems apply to Java on Android.

    TFS hasn't inspired me to read TFA, so sorry if it's explained there.

  • by WaffleMonster ( 969671 ) on Wednesday July 10, 2013 @07:29PM (#44245259)

    I've found the best way to get developers to stop being lazy is to give them shit hardware.

    The mistake is buying the latest quad core 2GB android goodness... Give them a 5 yr old piece of shit and the resulting mobile apps will rock.

  • by TopSpin ( 753 ) on Wednesday July 10, 2013 @08:19PM (#44245601) Journal

    Memory management is an issue that has me excited about Rust [rust-lang.org]. Rust memory management is explicit, easy to use, deterministic, efficient and safe. The language designers understand that garbage collection is costly and that endemic use of a garbage collector limits applicability.

    Although Rust does have reference counted heap objects on the so-called "exchange" heap, memory is normally allocated on the stack or on a "local" heap (via an "owned" pointer) that has "destructor-based memory management," much like C++ objects but without the leaks and wild pointers.

    The result is that the vast majority of allocated memory is not managed by the garbage collector. Use of the exchange heap is exceptional and explicit, yet immediately available when necessary. Otherwise, memory "management" is reduced to efficient stack-pointer manipulation or simple, deterministic destruction. Compile-time checks preclude the bad pointers and simple leaks so common with traditional systems languages.

    There is a series of detailed blog posts about Rust memory management here [github.io].

    Rust was inspired by the need for an efficient, AOT compiled systems programming language that is productive, concise and at least as safe as contemporary "managed" languages. Its memory management scheme goes directly to the point of this story.

  • by tknd ( 979052 ) on Wednesday July 10, 2013 @08:59PM (#44245843)

    GC sucks, real programmers can do memory management, blah blah blah. Tell me the last time a programmer made billions because "he could memory manage" and I'll show you plenty of poorly written websites, apps, software, that suck at memory management yet still managed to become popular and used by millions.

    The market decided long ago that fewer programmer hours was better than users waiting a few seconds every day for their device to GC. Users don't exactly like it, but it works; they get their hands on a more-than-usable product faster.

    But back to the article. In the article there are some fancy charts about how the iPhone 4S only has 512MB of RAM. Ok, a mobile device isn't going to run with a swap file because, well, the manufacturer decided to skimp on flash chip quality, so not only does writing to flash suck, it also runs the risk of forcing the cells to over-provision (meaning shrink in usable capacity). But the iPhone 4S will be 2 years old in 4 months! 2 years = EOL in the phone world.

    How about a more current phone? Ok, the Google LG Nexus 4, which will be 1 year old in 5 months, comes with a whopping 2GB of RAM. And it's a relatively cheap phone! That's already half of my 2011 MacBook Air's RAM. Prediction? In 4-5 years, mid-range phones will be shipping with 4GB of RAM.

    Ok, let's go the other direction. Let's say we all agree that programmers should sit down and manage memory again. Hurray! Problem solved? No. Because programmers are inherently bad at memory management. Memory will leak. Look at some popular web browser named after a type of animal. So instead of your phone pausing for a second to GC, now your app just crashes! AWESOME.

    The standard software engineering practice still applies. Design your system. Build your system. Once it is usable and more importantly has a market, then profile for optimization.

    • by TopSpin ( 753 ) on Thursday July 11, 2013 @12:50AM (#44247149) Journal

      The market decided long ago that fewer programmer hours was better than users waiting a few seconds every day for their device to GC.

      No, actually, that's not what happened. As the summary and the story itself (both of which apparently went unread) point out, one of the most successful systems to emerge in the market recently, iOS, is not a GC environment.

      Over here [apple.com] you may learn about iOS memory management. Without getting too far into that wall of text one discovers the following:

      If you plan on writing code for iOS, you must use explicit memory management (the subject of this guide).

      Ok, so your claim that GC is the only viable solution for contemporary application development is demonstrably false. Let's look at some other assertions:

      programmers are inherently bad at memory management. Memory will leak [if programmers must manage it].

      First, the vast number of iOS applications not leaking shows that a non-GC system doesn't necessarily have to leak. At least not badly enough to compromise the viability of the platform, which is the only meaningful criterion I can think of when it comes to the market.

      Second, why assume programmers are inherently bad at a thing when that thing has traditionally been exposed via terrible, error-prone, demonstrably awful mechanisms? It seems to me that among widely used tools we leaped from 'systems' languages with truly heinous MM primitives (C/C++) directly into pervasive GC systems. Aside from Objective-C + ARC, there just aren't enough good non-GC systems to make broad generalizations. Thus, you may be right about programmers, but you can't prove it, and I doubt it.

      Finally, what proof is there that pervasive GC is better at not leaking than a good explicit MM system? Anyone with an Android system and a bunch of apps will quickly discover that pervasive GC does not eliminate leaks.

      [some phone] comes with a whopping 2GB of RAM

      Google Glass has 682MB of RAM. There is always a new platform into which we must fit our software, and the new platform is usually resource-constrained, so there will never be a day when questioning the cost of GCs is wrong. Maybe the wearable you eventually put on will have 8GB of RAM. The computers you swallow or implant or sprinkle around the lawn probably won't. The fact that the next generation of phones can piss away RAM on greedy GCs just isn't particularly informative.
