
An Interesting Look At the Performance of JavaScript On Mobile Devices

Posted by timothy
from the down-in-the-weeds dept.
First time accepted submitter faffod writes "Coming from a background of console development, where memory management is a daily concern, I found it interesting that there was any doubt that memory management on a constrained system, like a mobile device, would be a concern. Drew Crawford took the time to document his thoughts, and though there is room for some bikesheding, overall it is spot on. Plus it taught me what bikeshedding means."
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • by EvanED (569694) <evaned@ g m a i l.com> on Sunday July 14, 2013 @08:08PM (#44280687)

    Don't blame Timothy for the dupe [slashdot.org]; no doubt he posted it from his mobile.

  • by Rob_Bryerton (606093) on Sunday July 14, 2013 @08:17PM (#44280719) Homepage
    From TFS:

    ...and though there is room for some bikesheding, overall it is spot on. Plus it taught me what bikeshedding means."

    But it didn't teach you how to spell it...

    • Re: (Score:3, Funny)

      by Anonymous Coward

      <Ralph Wiggum> The missing 'D' was used for my grade. </Ralph Wiggum>

    • Spelling isn't important in programming, lol. That's what the IDE is for. Coincidentally, most IDEs for javascript have little to no spelling assistance, lol.
      • by arth1 (260657)

        I'm glad I don't work with you, if you laugh out loud for every sentence you write.

        • by lxs (131946)

          I like the comma before "lol." Write sentence, little pause, bellylaugh. Classic.

        • Re: (Score:2, Offtopic)

          by slashmydots (2189826)
          You should be. Since I'm the IT manager, I'd fire you with an attitude like that.
      • Spell-check in IDEs (Score:5, Informative)

        by tepples (727027) <tepples&gmail,com> on Sunday July 14, 2013 @09:54PM (#44281131) Homepage Journal

        Coincidentally, most IDEs for javascript have little to no spelling assistance

        Spell-check in IDEs generally relies on static analysis of the variables in scope at any given point in the program. The more dynamic a language's type system is, the harder it is to statically find the names of symbols in scope at any given point in the program. PHP and Python are kinda-sorta OK for this because all global variables are out-of-scope (PHP) or read-only (Python) unless declared otherwise at the top of a function's definition (or, in PHP, unless the variable is one of the predefined superglobals, whose names are all uppercase starting with $_). This way, the IDE can parse a function for all variables assigned to in a function and assume they're local. JavaScript, on the other hand, defaults to making all variables global unless declared local with the var keyword.

        Spell-check also relies on static knowledge of what source code files are in scope. This is dead easy for Java. In PHP you scan for require_once, and in Python you scan for import, but even then, a module is occasionally conditionally imported, and importing has side effects. JavaScript can't include JavaScript at all except by appending a <script> element to the HTML DOM with the src= attribute referring to the other script, and the idiom for that is harder to recognize than a simple import statement.

      • Spelling isn't important in programming, lol. That's what the IDE is for. Coincidentally, most IDEs for javascript have little to no spelling assistance, lol.

        Tell that to Microsoft who had to break compatibility [microsoft.com] due to some spelling mistakes. Specifically, the MFC/ATL section.

      • "Spelling isn't important when writing. Thats what the word processor is for!"....."lol"

        You're me at 13 aren't you? :)

    • Irony overload, abort, abort!

  • by goruka (1721094) on Sunday July 14, 2013 @09:10PM (#44280915)
    I'm probably going to get downvoted as a troll, but my experiences with most console developers have often been strange (as a developer myself).
    Talks usually end with most of them dismissing scripting languages, higher-level APIs (such as OpenGL), or certain algorithms as useless because they are slow, use too many instructions unnecessarily, waste cache, etc.

    Any attempt to raise the point that you don't need to optimize everything, only the few critical zones of your code (what matters), or that a cache-wasting algorithm can end up faster anyway because it's more efficient overall, immediately results in my being dismissed or treated as ignorant, because something inefficient is obviously inefficient and I must be stupid for not realizing it.

    This article reminds me of that. The author sets out (in his opening claim) determined to prove that something is less useful because it's slower, yet nowhere in that huge piece of text is anything useful offered as proof; instead he keeps posting data about how slow JavaScript really is.
    • Any attempt to raise the point that you don't need to optimize everything, only the few critical zones of your code (what matters) ... immediately results in my being dismissed or treated as ignorant

      To be fair, if you were debating with someone who writes applications that really do need the very top levels of performance, and you claimed that optimising trouble-spots would be sufficient to achieve that, then you were ignorant. For most software, being within a factor of 2 or 3 of hand-optimised low-level code is easily sufficient, and a bit of work with the profiler to identify the most serious flaws will go a long way. The rules change when you shift from that kind of performance standard to needing

      • by OneAhead (1495535)
        Ahh, the quest for Uniformly Slow Code [c2.com]. Something the scientific programmer is deeply familiar with. Too bad there are so few people who understand it.
    • by Arker (91948)

      They aren't useless, but they are certainly not always the best (or even an appropriate) tool to use. Scripting is great for the trivial-but-useful, particularly a one-off or a mock-up. But any sort of serious computer program is going to be better if you program it natively, and most anything that is trivial-but-useful enough to have a jscript app should be quickly reproducible on any platform that you want in a native fashion. Do you need a native program to have a simple calculator? No, javascript will do. An

      • It's funny seeing how people are rediscovering the importance of programming as a result of these phones

        Very, very well said, sir.
        • It's funny seeing how people are rediscovering the importance of programming as a result of these phones

          Very, very well said, sir.

          And the most fun!

    • by arth1 (260657) on Sunday July 14, 2013 @09:53PM (#44281125) Homepage Journal

      The problem is that your baby is not the only thing running on the system. When you waste resources, you do it on behalf of everything else that runs too. Even if your baby isn't doing anything critical when you waste it.

      It only takes one selfish programmer to screw up an embedded system. You are he.

      • by goruka (1721094)

        It only takes one selfish programmer to screw up an embedded system. You are he.

        Even though it's unrelated to my original post, are you saying that not going native is worse because it uses more CPU cycles/battery?
        Explain to me why, for decades, the industry has used J2ME, Java (Android) and now ObjC (Apple). I guess the entire mobile industry is selfish and greedy?
        You probably didn't understand GP, though; the message is that you don't need to optimize something that doesn't consume enough cycles to be a performance problem.

        • by arth1 (260657)

          Even though it's unrelated to my original post, are you saying that not going native is worse because it uses more CPU cycles/battery?
          Explain to me why, for decades, the industry has used J2ME, Java (Android) and now ObjC (Apple). I guess the entire mobile industry is selfish and greedy?

          Of course they are. But that's beside the point. Development is a trade-off: you have to work with the market you have, within deadlines that mean you'll sell, and with developers you can find and afford. So yes, you make do with what makes the task feasible.
          But you don't have to make it any worse than necessary by allowing bloat and doing things inefficiently. Adopting the mindset that you work in a shared embedded environment, and doing things frugally, doesn't incur a great cost.

          You probably didn't understand GP, though; the message is that you don't need to optimize something that doesn't consume enough cycles to be a performance problem.

          It's not just about cycles.

          • by arth1 (260657)

            To follow up on my own post, what we see in environments like the Android world is a tragedy of the commons. If everybody played nice, everybody would benefit. But there's no penalty to yourself for being greedy, so you are. And so are all others.

            Android really needs something like strictly enforced cgroups.

            • by goruka (1721094)
              I understand your point, but I believe it's a little too extreme.
              In the real world, it is always possible to write more efficient code, but the more you optimize, the more difficult the code becomes to develop, maintain or port, exponentially so.
              So in the end, it's always a trade-off between performance and cost of development, added to the fact that not all code needs to be optimized, only the small portions that perform the most critical tasks.
              • by arth1 (260657)

                added to the fact that not all code needs to be optimized, only the little portions that perform the most critical tasks.

                That this is false is my point - it's only true if your app is the only app on a system. On a shared embedded system, the portions that don't do critical tasks are just as important to optimize for the rest of the system.
                Because there's no penalty to your own app, it becomes a tragedy of the commons [wikipedia.org].

                • by goruka (1721094)
                  So what's the difference, then: that your phone battery will last 18 hours instead of 20 because you didn't optimize beyond the critical tasks?
                  It seems much cheaper to solve this by adding a little more battery capacity, yet keep your phone OS and applications easier and cheaper to develop.
                  No matter how you look at it, I can't see the scenario you describe as a tragedy.
                  • by arth1 (260657)

                    You didn't follow the link, did you? It's a situation that's called "a tragedy of the commons", which doesn't mean it's a tragedy.

                    And anyhow, it's not about battery life; applications using more than their fair share of memory, IO or other resources contribute to starvation for other apps running at the same time, possibly causing crashes in other apps when they cannot allocate memory (because they're well behaved and allocate when needed and free when done), cannot update alarms in time, can't take a

                    • by goruka (1721094)
                      Ah, I understand. You are completely right,
                      except this has nothing at all to do with my original post, nor with the article, both of which are about processor usage.
        • Explain to me why, for decades, the industry used J2ME

          Because Sun pushed it on everyone. It sucked big time, though. Did you ever write a J2ME app? It was the kind of platform where everything was an object except for primitives, but memory management was so messed up, because of the combination of GC and an extremely small heap, that pretty much no serious app used any objects. Instead, you preallocated arrays of primitives and used those for everything.

          Java (Android)

          You mean, the only mobile platform that still has horrible UI latency?

          and now ObjC (Apple).

          Obj-C compiles to native code and
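The preallocation pattern described above (flat arrays of primitives instead of short-lived objects) translates directly to JavaScript typed arrays. A sketch with illustrative names, not code from the article:

```javascript
// All particle state lives in typed arrays allocated once up front, so
// the per-frame update creates no objects and thus no garbage for the GC.
const MAX = 1024;
const posX = new Float32Array(MAX);
const velX = new Float32Array(MAX);
let count = 0;

// Reuse the next free slot instead of allocating a particle object.
function spawn(x, vx) {
  posX[count] = x;
  velX[count] = vx;
  return count++;
}

// Per-frame update: pure arithmetic over preallocated storage.
function step(dt) {
  for (let i = 0; i < count; i++) posX[i] += velX[i] * dt;
}

spawn(0, 10);
step(0.5);
console.log(posX[0]); // 5
```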

    • I happen to have an average of about 200 tabs open on most of my daily use machines. This tends to eat most of the resources that my machine has, regardless of how modern the machine is. If it's an old clunker, it chokes on less and I generally don't make it that far before I have to kill the browser and restart it. It seems impossible for any OS vendor and any "full featured" web browser to just deal with the limitations of the system and keep the application snappy and usable.

      Sure, it is way faster if yo

      • How do you manage 200 tabs open at the same time? How do you easily get to the tab you want?
    • The funny thing is that he's recommending Automatic Reference Counting instead, which destroys cache much more than GC during regular processing.

    • > Talks usually end up in most of them dismissing scripting languages, higher level APIs (such as OpenGL),

      A few years back I implemented OpenGL on the Wii and did maintenance work on our OpenGL version running on the PS2. Hell, I even shipped a couple of games with it. OpenGL 1.x _can_ be implemented efficiently on a console if you apply some discipline. People who dismiss a rendering pipeline probably have never implemented one. HOWEVER, their point is that memory management CAN be an issue if one isn'

      • Where are my mod points when... Well, you got it :-) Very well said.
      • by goruka (1721094)

        your mindset is part of the problem not the solution,

        Your mentality of "we will fix it later" is the sign of an immature and inexperienced programmer.

        1. You don't understand the first rule of computing:

        2. Your boss / peers are trying to teach you an important lesson:

        you have FAILED as a programmer

        Please stop this shitty attitude

        THAT is the point -- not your uneducated rant

        And you are going to make excuses that you can't be bothered???

        I'm so sorry, I'll never do it again!

  • I learned C++ then VB.NET then C# then JavaScript. I was shocked at how "anything goes" it was and I assumed it was a memory nightmare. Considering my i5-2400 sometimes maxes out on pages with complicated javascript, I'm not surprised. I heard that JS can take 10x more memory than it needs at any given time realistically.
    The one thing I can't wrap my head around is: if it were made a "real" language, would it be a gigantic security disaster? Or could it be limited enough not to turn into Flash, Java, etc.?
    • by Rockoon (1252108)
      I started programming at a time when GOTO was still considered "kosher" (in C, no less) as a lot of algorithms were designed as state machines. To this day I still sometimes consider a goto, but even I stand in awe of the complete retardedness that is javascript.

      What happened to languages that pick some things and then do them well? JavaScript seems to have evolved to try to do everything, and yet it doesn't do anything even close to well.
    • node.js (Score:5, Funny)

      by dutchwhizzman (817898) on Monday July 15, 2013 @01:25AM (#44281873)
      Try looking into node.js and see all your fears come true.
    • Good question. My educated guess is that the first problem would be security; after all, you would be running a complete application simply by accessing the page. The second problem is that, as everyone has a favorite language, deciding what would become the "lingua franca" of the Web would set off a Digital World War.
  • Crisis Averted! (Score:5, Insightful)

    by GreyLurk (35139) on Sunday July 14, 2013 @09:31PM (#44281021) Homepage Journal
    Whew, I'm glad we managed to get everyone to switch off of memory-leaking and CPU intensive Flash stuff over to standards compliant HTML5 and JavaScript then!
  • Animated GIFs.

    I think they're of the devil, but for some reason a lot of baseball stat heads still use them instead of a video format when they want to post a few seconds of a game for illustrative purposes. It's weird because these are generally young guys, not the old farts who you'd expect not to have changed their workflow since 1995...

    JavaScript on the iPad? That doesn't seem slow - certainly not enough to where it registers anyway.

    • for some reason a lot of baseball stat heads still use [GIF animations] instead of a video format

      Some browsers can view only H.264 and animated GIF. Other browsers can view only Theora, WebM, and animated GIF. Some, such as the latest version of Internet Explorer that runs on Windows XP, can't view anything but animated GIF without plug-ins that may or may not be installed and that the current user may or may not have privileges to install. If the only video format supported by all browsers is animated GIF, what should a site use to reach the most viewers?
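One way a site resolves that question at runtime is the standard `canPlayType` probe. A sketch: the file names are hypothetical, and stand-in objects substitute for a real `<video>` element outside a browser:

```javascript
// Prefer a real video format when the browser reports support for it,
// and fall back to animated GIF otherwise. canPlayType returns "",
// "maybe", or "probably".
function bestSource(video) {
  if (video.canPlayType('video/mp4; codecs="avc1.42E01E"')) return "clip.mp4";
  if (video.canPlayType('video/webm; codecs="vp8"')) return "clip.webm";
  return "clip.gif"; // the lowest common denominator
}

// Stand-ins for an H.264-only browser and a plugin-less legacy browser.
const h264Only = { canPlayType: (t) => (t.startsWith("video/mp4") ? "probably" : "") };
const legacy = { canPlayType: () => "" };
console.log(bestSource(h264Only)); // clip.mp4
console.log(bestSource(legacy));   // clip.gif
```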

        • True, animated GIF is the most widely supported "movie" format if you look at all target platforms. However, there is currently no technology implemented in browsers that will take an animated GIF, re-render it into something that can be accelerated by the video card, and use that for output. This results in the browser pumping all the frames, non-accelerated, to the video card. Devices with limited resources (read: all devices that have more than a few tabs open, or mobile devices) will hit limitations of the hardware
        • by tepples (727027)
          If the user's browser has more than one tab open, it won't send frames in inactive tabs to the screen. If your site has a large audience on mobile devices, and you're a big enough company to license footage from MLB, you can probably afford to create a native app for iOS, a native app for Android, and a native app for Windows Phone.
        • Maybe because animated GIF has been reasonably performant on x86 browsers since the 486 days.

  • More Full Response (Score:5, Insightful)

    by El Royo (907295) on Sunday July 14, 2013 @10:41PM (#44281309) Homepage
    I made a comment [slashdot.org] to a poster over on the original posting of this. I think it's worth expanding upon in case people are persuaded by the arguments in the paper.

    First off, just as TFA predicts, I'm not going to try to conquer his mountain of facts and experts by presenting a mountain of citations. Instead, I'm going to point out where his conclusions are not supported by his facts and point out his straw man arguments and his attempt to convince us through overwhelming expert opinion.

    The straw man: In the article, he presents two scenarios (photo editing and video streaming) and claims that you can't reasonably do those because of memory limitations (on the iPhone/iPad). He then concludes you can't produce useful apps because you can't do those two. I couldn't find any citations of people attempting to do this on mobile using JavaScript. Choose the right tool for the job here. I'll give him these two use cases (and several others: 3D games, audio processing, etc.); however, extrapolating from there that no useful apps can be produced (ever!) using JavaScript is a leap too far.

    Next, he spends a lot of time diving into the particulars of garbage collection (GC). I'm going to grant him practically every point he made about GCs. They're true. And, it's true that mobile is a constrained environment and you must pay attention to this. But, this is largely known by developers who are trying to write high-performance JavaScript applications on mobile. Hell, -anyone- writing high-performance apps in any language needs to be aware of this. If you allocate memory during your animation routines in a game, you're asking for trouble, regardless of the language. So, to me, this part is just a call to pay attention to your memory usage in your apps. This is really useful advice and I will be paying even more attention to the new memory tools available in the latest Google Chrome dev tools.

    One of the biggest problems in the rant is the comparison of pure computing performance and his claim that ARM will never be as fast as desktop. I'm going to again grant that this is true. However, this means crap-all for most apps. Tell me: How many apps do you have on your phone that are processor-bound? None? One? Two? The vast majority of apps spend their time either waiting on the user or, possibly, waiting on the network. You can write a lot of really useful apps even given a constrained processor and memory. Anyone remember the Palm Pre? The TouchPad? Most of those apps were JavaScript and they worked just fine.

    This brings me to the point of all this: TFA's author focuses on performance. However, users focus on responsiveness. JavaScript is perfectly capable of producing responsive applications. Sometimes, it takes attention to detail. Nothing is ever 100% free and easy. JavaScript is not a magic solution, and those of us who think that JavaScript has a future in mobile app development know this. This is why programmers get the big bucks. Writing mobile apps, you need to be aware of the effects of CSS, memory, processor, responsiveness and more.
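The responsiveness point can be made concrete. A sketch (illustrative names and sizes) of slicing a long job so the event loop, and therefore the UI, keeps breathing between chunks; in a browser you might yield with requestAnimationFrame instead of setTimeout:

```javascript
// Process items in small slices, yielding to the event loop between
// slices so input handling and rendering are never starved.
function processInChunks(items, worker, chunkSize, done) {
  let i = 0;
  function slice() {
    const end = Math.min(i + chunkSize, items.length);
    for (; i < end; i++) worker(items[i]);
    if (i < items.length) setTimeout(slice, 0); // yield, then resume
    else done();
  }
  slice();
}

let sum = 0;
processInChunks([...Array(10).keys()], (n) => { sum += n; }, 3,
                () => console.log(sum)); // prints 45 once all chunks run
```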
    • One of the biggest problems in the rant is the comparison of pure computing performance and his claim that ARM will never be as fast as desktop. I'm going to again grant that this is true. However, this means crap-all for most apps. Tell me: How many apps do you have on your phone that are processor-bound? None? One? Two?

      The other issue you're not addressing is runtime memory requirements. From the GC performance chart, the best-performing GC that provides near-native performance does so by requiring 5x more memory. This is going to impact the number of concurrently running apps on your phone/tablet before things start slowing down. Users prize snappy interfaces, and if their mobile device slows down then the knowledgeable users will bring up a task manager to figure out what's going on.

      Do you want to stick around to see wha

    • by Luyseyal (3154)

      I think you're overstating the case of performance versus responsiveness. He does specifically point out that GC is negatively impacting UI response times and that that is not going to work.

      I agree with you that people who work in the problem space know about memory management by experience. However, that JavaScript makes it so difficult to manually manage memory seems to be his real point.

      Lastly, if we could solve the network speed problem, you would just outsource real CPU/memory apps to the server and si

      • Lastly, if we could solve the network speed problem

        That sounds like a big problem, especially through the air....

    • If you allocate memory during your animation routines in a game you're asking for trouble, regardless of the language.

      That's not true at all. If you allocate memory during animation in a language with deterministic memory management, you have a pretty good understanding of what it'll cost and whether you can afford it (and in many cases, the answer is yes).

      Note that animations are not specific to games. One common case where you allocate memory during an animation is when the user is scrolling a list that is backed by a dynamic data store (i.e. items are generated "on the fly").

      More importantly, the problem with GC is not
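The scrolling-list case above is commonly handled with row recycling. A sketch; the row shape and function names are made up for illustration:

```javascript
// Keep a pool of row objects and recycle them as items scroll in and
// out of view, so the animation itself allocates nothing for the GC.
const pool = [];

function acquireRow() {
  return pool.pop() || { text: "", y: 0 }; // reuse before allocating
}

function releaseRow(row) {
  row.text = ""; // drop references so nothing is kept alive by the pool
  pool.push(row);
}

const a = acquireRow();
releaseRow(a);
const b = acquireRow();
console.log(a === b); // true: the same object was recycled
```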

  • TLDR version (Score:5, Informative)

    by Rob_Bryerton (606093) on Sunday July 14, 2013 @10:51PM (#44281345) Homepage
    Actually a pretty well written piece, if a bit wordy. I see a lot of people commenting here are perhaps missing the point, thinking that the author's angle was JS=BAD. Not at all. My take was his issue was not so much with JavaScript, but with Garbage Collected languages in general.

    He made an important point regarding GC routines and how they tend to be unpredictable in terms of when and how long they run. He also discussed at length his observation that, if you have several times more memory available than your app needs, the GC routines are very non-intrusive. However, when you get into a low-memory situation, the performance hit from GC is huge and causes obvious stutters in the application and/or its UI.

    Also, some discussion on the irony of working around (or trying to "spoof") the GC by using various manual techniques, and how that almost amounts to manual memory management. All in all, a really interesting read.
  • The very same blog post was already the subject of an article on Slashdot last week. What is going on here??
  • The Opera browser (versions up to 12) was shown on one graph (iPhone 4S) in the article, where it kicked some serious a$$, but it was never mentioned again.

    One reply to the article was about Opera being only browser to run Google Wave, kind of... "The only browser that ran it with anything resembling
    “speed” for it’s first year or so was Opera, and Opera never really worked very well with it anyway."

    I enjoy security through obscurity but Opera is just too good a browser to ignore.

  • Bottom line: just because you're using a GC'd language and you CAN ignore memory management, doesn't mean you SHOULD. That goes for JS, Java or any other GC'd language in existence.

    I hate to go off on a tangent... but that won't stop me from doing so, because I think it's actually the core of the issue and is entirely non-technical:

    This all goes back to the abysmal state of many (most?) "modern" developers.

    If you grew up with computers at the time I did, the late 70's/early 80's, and you learned to progra

  • "How much memory is available on iOS? It’s hard to say exactly."

    Exactly. Because it's an appliance. And you're either willing to take it apart and learn how it works so you can make it do what's needed, or you accept that you're an appliance operator and stop bitching about it.
