Supercomputing, There's an App For That
aarondubrow writes "Researchers at MIT have created an experimental system for smart phones that allows engineers to leverage the power of supercomputers for instant computation and analysis. The team performed a series of expensive high-fidelity simulations on the Ranger supercomputer to generate a small "reduced model" which was transferred to a Google Android smart phone. They were then able to solve engineering and fluid flow problems on the phone and visualize the results interactively. The project proved the potential for reduced order methods to perform real-time and reliable simulations for complicated problems on handheld devices."
Re:appx. (Score:5, Insightful)
It sounds like the supercomputer generated an algorithm for the smartphone to run. I guess they can call that "leveraging the power of a supercomputer" but implying the phone app is doing supercomputing stretches things a bit far. I call misleading headline.
Not even... (Score:4, Insightful)
The team performed a series of expensive high-fidelity simulations on the Ranger supercomputer to generate a small "reduced model" which was transferred to a Google Android smart phone
This is like saying that watching Toy Story on your iPhone leveraged the massive renderfarm used by Pixar.
Re:Not even... (Score:4, Informative)
Not really. In this case, the smart phone isn't simply rendering output of a supercomputer simulation.
Re: (Score:1, Insightful)
This is a university PR piece, and these are notorious for being vague about what they are actually claiming.
It sounds like the supercomputer is basically doing all the grunt work and the phone is doing something analogous to interpolating the results. For example, if the supercomputer supplies pre-computed results for some question at parameter alpha=1.0 and alpha=2.0, and the user selects alpha=1.5, then the phone will interpolate the two supplied results and get an answer that will (if the interpolation is valid) be close to the true result.
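In code, the scheme the parent describes might look like this (the parameter values and the precomputed table are made up for illustration):

```python
# Hypothetical sketch of the interpolation idea above.
# The "supercomputer" fills in the table offline; the "phone" only interpolates.

precomputed = {1.0: 42.0, 2.0: 58.0}  # alpha -> expensive precomputed result (made up)

def phone_answer(alpha, a0=1.0, a1=2.0):
    """Linearly interpolate between two precomputed results."""
    t = (alpha - a0) / (a1 - a0)
    return (1 - t) * precomputed[a0] + t * precomputed[a1]

print(phone_answer(1.5))  # 50.0, halfway between the two stored answers
```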
Re: (Score:3, Interesting)
What the phone is doing is "reduced order modeling", which (if the article is using the term accurately) means finding a simple set of equations whose solution provably approximates a far more complex system of equations. It's not just interpolation (lookup tables, or machine learning from a training ensemble). Reduced order models actually have dynamics in them.
The potential innovation here, as I see it, is that it takes some supercomputing effort to build the reduced order model for a specific problem. So you still need the supercomputer up front; it's only the finished model that runs on the phone.
Re:Not even... (Score:4, Informative)
I just re-read the article. What it sounds like they're doing is having the supercomputer craft a reduced order model which is optimized to a particular range of parameters. That suggests to me that they're constructing a perturbative model about some fixed solution that the supercomputer produces. Perturbative approximations are more accurate the closer they are to the "reference" solution. So the innovation appears to be: the user can specify what set of parameters they want to perturb about, and therefore construct a custom model which is optimized to perform well in the parameter range that user is interested in.
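A toy version of that offline/online split (not the authors' actual method; a generic reduced-basis sketch with made-up matrices and a scalar parameter) could look like:

```python
import numpy as np

# Toy reduced-basis sketch (NOT the paper's algorithm; everything here is made up).
# Offline, an expensive machine solves the full system A(s) x = b at a few sample
# parameters and builds a small basis V. Online, a cheap device only solves a
# k x k projected system for new parameter values in that range.

rng = np.random.default_rng(0)
n, k = 200, 5
A0 = np.eye(n)
A1 = 0.01 * rng.standard_normal((n, n))
b = rng.standard_normal(n)
A = lambda s: A0 + s * A1  # parameter-dependent operator

# Offline ("supercomputer"): snapshots of full solutions -> SVD basis.
snaps = np.column_stack([np.linalg.solve(A(s), b) for s in (0.0, 0.25, 0.5, 0.75, 1.0)])
V = np.linalg.svd(snaps, full_matrices=False)[0][:, :k]

# Online ("phone"): tiny k x k solve, then lift back to the full space.
s = 0.6
y = np.linalg.solve(V.T @ A(s) @ V, V.T @ b)
x_approx = V @ y

x_full = np.linalg.solve(A(s), b)
rel_err = np.linalg.norm(x_full - x_approx) / np.linalg.norm(x_full)
print(rel_err)  # small within the sampled parameter range
```

The key property, as the parent notes, is that the basis is tailored to the parameter range the user asked for; outside that range the approximation degrades.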
Re: (Score:2)
More like doing a rasterised live render of Toy Story rather than watching the fully ray traced version.
Re: (Score:2)
Don't worry, that's queued for tomorrow's story: "Watching Toy Story On The iPhone(TM)".
(Certainly as newsworthy as many stories we get, such as yesterday's "Choose your own adventure On The iPhone(TM)" or, some time ago, "You can access this website On The iPhone(TM)".)
Re: (Score:1)
I would make the educated guess that it could easily work on other Android platforms, not just the Nexus One, depending on their implementation.
Re: (Score:2)
oh, and first post!
Well, approximately.
Butt does it hurt? (Score:1)
Does it still hurt that the G1 moved more units in its first minute on the market than the Microsoft Kin did in its whole lifespan? It would be nice if that bothered you. I subsist on your pain. I'm a pain vampire that way.
The Google phone is still available through other vendors, though not from Google, but it did what it was intended to do: convince vendors that there was a market for the Android candybar phone. The point is proved, and there's no more need for the proof because the idea has taken off. Google doesn't need to sell it directly anymore.
Re: (Score:2)
It's funny how a phone which didn't sell well seems to keep showing up in press releases.
The iPhone?
(Apple have about 3% of the market, yet get a mention several times a day in any random Slashdot story; to put things in perspective, Nokia ship twice as many phones per quarter as Apple have ever sold, and even just one of their many products, the 5800, has sold equal to or more than the original iPhone. Android has now already overtaken Apple, btw, and is the fastest growing platform, whilst Apple are actually losing share.)
PR Bullshit (Score:5, Informative)
The money quote: "This is not the first time that model reduction algorithms have been used to ameliorate the complexities of large-scale physical simulations. The advantage of the system designed by Knezevic and his colleagues is its rigorous error bounds, which tell a user the range of possible solutions, and provide a metric of whether an answer is accurate or not. The error bounds are based on mathematical theory developed in Prof. Patera's research group at MIT over a number of years."
The research is about error bounds on coarse grained models. The smart phone is just hype.
Re: (Score:3, Insightful)
Everyone is spamming slashdot, and the people voting on the firehose are generally too lame to understand it. Throw "reduced order methods to perform real-time and reliable simulations" at them and they click the + just to look smart.
Re:PR Bullshit (Score:5, Funny)
Go into settings and select "scientific mode". That makes a lot more buttons appear.
Re: (Score:2)
And if their research gets lost, with the credit claimed by someone else coming along later and doing the same thing again, all because they obscured their research behind misleading buzzwords? Tough.
Exactly (Score:2)
Re: (Score:2)
Yep. The research is good, but the smartphone is just an angle. Sorta like if I write an adventure game and say "it could be the first text adventure played on a space station" just because it isn't actually incompatible with the laptops on the ISS.
Wow. (Score:2)
So... if you analyze a problem and discover you can get mostly accurate results from a simple algorithm, you don't need a supercomputer anymore? What a concept! I'm going to go write the first physics simulator for personal computers!
Seriously, the cool bit is that they're generating these reduced models programmatically. But the way it reads, it sounds like the reduced model itself, and the fact that it runs on smart phones, are the important parts.
Re: (Score:2)
Seriously, the cool bit is that they're generating these reduced models programmatically. But the way it reads, it sounds like the reduced model itself, and the fact that it runs on smart phones, are the important parts.
And that sounds accurate to me. It's a demonstration, among other things, that you can control a complex system, that originally required the efforts of a supercomputer to model, with far simpler tools.
So (Score:1)
Re: (Score:2)
I remember when 100 MFLOPS was munitions-grade computing. Now you have 1 GHz in your pocket and 4 cores of 3 GHz hooked up to 450 cores at 1.5 GHz and you still think Crysis is stuttering even in low res...
Semantics, maybe... (Score:3, Informative)
...but I'm going to go ahead and argue that they are not "performing supercomputing on a phone", because that kind of marketing doesn't belong in research.
Yes, it could be very useful; I have no doubt it's just as useful as they claim. And yes, it allows someone in practice to solve a problem "in the field" with a phone, when otherwise a supercomputer might have to be used.
But the supercomputing was done on a supercomputer in advance, when the reduced model was calculated. It's just that instead of giving one specific answer for one specific input, the supercomputer is returning an algorithm that will approximate the answer within known error bounds for a specified domain of inputs. Executing the algorithm isn't supercomputing (if it were, you couldn't do it in a few seconds on a phone); it's using the fruits of the earlier supercomputing that produced the algorithm.
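As a toy analogy (a truncated Taylor series standing in for the real reduced model, with the standard Lagrange remainder playing the role of the certified error bound):

```python
import math

# Hypothetical illustration: the "supercomputer" hands back not one answer but a
# cheap algorithm valid on a whole input domain, together with an error bound.
# Here the cheap algorithm is a truncated Taylor series for exp(x) on [0, 1].

def make_cheap_exp(terms=6, domain_max=1.0):
    # "Offline" step: precompute a domain-wide Lagrange remainder bound.
    bound = math.e * domain_max ** terms / math.factorial(terms)
    def cheap_exp(x):
        assert 0.0 <= x <= domain_max, "outside the certified domain"
        return sum(x ** k / math.factorial(k) for k in range(terms))
    return cheap_exp, bound

cheap_exp, err_bound = make_cheap_exp()
x = 0.5
print(abs(math.exp(x) - cheap_exp(x)) <= err_bound)  # True: the bound holds
```

Evaluating `cheap_exp` is a handful of multiply-adds; all the hard analysis went into constructing it and certifying `err_bound` ahead of time.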
Re: (Score:1)
I tend to agree. I could also use my smartphone as a terminal to log onto that supercomputer, with a thin client application as the GUI for setting up the processing to be done; the supercomputer does its thing, and the thin client then displays the results visually. So no real supercomputing going on here...
marketing (Score:1)
Re: (Score:2)
It's a table lookup where the things you look up in the table are easily-solved algebraic equations across the small domain of the single table entry, instead of a continuous model in differential equations across the entire state space of the table.
The thing that can solve the continuous model in differential equations across the entire state space is a supercomputer. The thing that can chop the continuous model up into a table of simple algebraic equations is also a supercomputer. The thing that can look up a table entry and evaluate a simple algebraic equation is a phone.
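The split the parent describes can be sketched with a made-up target function (sin standing in for the expensive model, chord fits standing in for the precomputed local equations):

```python
import math

# Sketch of the table-of-local-models idea (hypothetical target function).
# "Supercomputer" side: chop [0, pi] into segments and fit a simple linear
# model (endpoint chord) on each. "Phone" side: look one up and evaluate it.

N = 64
width = math.pi / N
table = []
for i in range(N):
    x0, x1 = i * width, (i + 1) * width
    slope = (math.sin(x1) - math.sin(x0)) / width
    table.append((slope, math.sin(x0) - slope * x0))  # y = slope * x + intercept

def cheap_sin(x):
    """Online step: one table lookup plus one multiply-add."""
    i = min(int(x / width), N - 1)
    slope, intercept = table[i]
    return slope * x + intercept

print(abs(cheap_sin(1.0) - math.sin(1.0)) < 1e-3)  # True
```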
Re: (Score:2)
Yes, we know that reduced order models already exist. The new thing here is that you can input the parameter range you want, wait a few hours, and the supercomputer sends you back a custom reduced order model optimized for the parameter range you care about. You can then apply this model "in the field" to the situation you're dealing with. It's supposed to be useful for situations where the details of the problem aren't known ahead of time, and you can't pre-compute the reduced order model.
And no, the article doesn't go into the details.
Re: (Score:2, Informative)
Furthermore the article has no details on how the error bounds are calculated.
Good point. If you are interested in the details of the error bounds, please check out our preprints below, and the references cited therein.
http://www.tacc.utexas.edu/~peterson/articles/2010_rboomit_cmame_preprint.pdf [utexas.edu]
http://www.tacc.utexas.edu/~peterson/articles/2010_hafs.pdf [utexas.edu]
Put CUDA on a phone, then we can talk (Score:2, Insightful)
I'll believe they've created mobile supercomputing when someone puts a powerful GPU that is CUDA-ready in a smartphone.
Of course, you better get some big batteries for your phone, because Teraflops ain't free
Re: (Score:3, Funny)
CUDA isn't supercomputing. CUDA is more like super-doopercomputing. And it's a fucking crime that nVidia isn't doing better in the market with it.
Re: (Score:3, Interesting)
The lattice QCD people, at least, are porting their code to CUDA just as fast as they can. The bottleneck right now is that there's no good way to get multiple GPU's to communicate (quickly). So, for the largest problems (simulating a 64x64x64x192 lattice), you still need a conventional supercomputer (like Ranger, the one in the article here), because it's just too huge to put on a single GPU and multi-GPU doesn't scale well.
But for smaller problems (like a 24x24x24x64 lattice), GPU's will be great, and people are working on them already.
Yes smartphones can display results from (Score:2)
super computers, big whoop.
The phone isn't actually doing anything but functioning as a UI.
Re: (Score:2, Informative)
That's an RTFA comment right there.
This isn't just a UI, it's a reduction of the algorithm provided by a supercomputer. However, I believe that this first set of lines is misleading, inaccurate, and likely an example of the writer not knowing what they're talking about:
What if you could perform supercomputing calculations in real-time, on your smartphone ... Researchers ... have created an application that does just that.
It doesn't do supercomputing because it isn't a supercomputer, it just makes an educated guess based on sitting at the supercomputer's knee and playing "monkey see, monkey do". Not a bad trick but the claim's overwrought.
Re: (Score:1)
I'm pretty sure you have a different definition of 'supercomputing' than the rest of the world and I do.
I'm fairly sure you don't understand what they are talking about either. They are attempting to imply they are doing the work of a supercomputer on the phone, when in fact the supercomputer is doing the brunt of the work, reducing the problem so that only a tiny bit is left for the phone to process, which makes it look impressive. That tiny bit just happens to be the most useful bit for a person to play with.
Re: (Score:2)
If you replace "educated guess" with "first-order approximation", it sounds a lot better -- and, in fact, this happens in the sciences all the time. But that's just what a first-order approximation is: it's a guess (based on the first term of a series) that is educated (based on some belief that the subsequent terms are smaller).
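A worked one-liner of the parent's point (expanding sqrt about a nearby perfect square; the numbers are just for illustration):

```python
import math

# First-order approximation as "educated guess": keep the first term of a
# series, trusting that the dropped terms are smaller.
# sqrt(a^2 + e) ~= a + e / (2a) for small e (expansion about a^2).

def sqrt_first_order(x, a):
    """First-order approximation of sqrt(x), expanded about a**2."""
    return a + (x - a * a) / (2 * a)

approx = sqrt_first_order(104.0, 10.0)  # expand sqrt(104) about 100
exact = math.sqrt(104.0)
print(approx, exact)  # 10.2 vs ~10.198: the dropped second-order term is tiny
```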
Re: (Score:1)
Wow, must have hit a nerve. Where to start:
I assure you, I understand supercomputing well enough, and your comments on supercomputing don't counter what I was saying. Your description of the action of their software:
Most of the work is done on a super computer, then a tiny UI layer is thrown on top
doesn't accurately describe model reduction algorithms. Another commenter pointed out that 'first order approximation' is a better term. The phone doesn't 'finish the last bit' of processing but makes a low-order approximation of the entire process.
As Entropius also noted, educated guesses of this kind are standard practice in the sciences.
FEMM for android. (Score:3, Insightful)
I've been using FEMM lately for some magnetics stuff I've been working on. I would LOVE an android port, or some way to run simulations from my phone.
I don't *really* need it, but it's just funny how something like that is actually possible these days. We probably will have supercomputers in our hands someday. I mean, current phones already are supercomputers by the standards of what? 30 years ago? 20 years ago?
Smartphones will become the tricorders of the future; it's inevitable.
-Taylor
Re: (Score:1)
My stock Nexus One running Froyo (Android 2.2) gets an effective 34 MFLOPS on Linpack (and I didn't even kill the background tasks).
This is better than 1969's top supercomputer (theoretical peak of 36 MFLOPS, effective ~10) and comparable to 1974's top supercomputer, the CDC STAR-100 [wikipedia.org], which had a theoretical peak of 100 MFLOPS but much lower real-world performance.
The Nexus One cost me $600 with tax and shipping. The CDC 7600, which it easily beats, cost $5 million in 1970s dollars.
Run-of-the-mill modern desktops do far better still.
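For the curious, a crude version of this kind of measurement is easy to sketch (a pure-Python loop, nothing like a real Linpack run, so it will badly underestimate the hardware):

```python
import time

# Rough, hypothetical FLOPS estimate: time a known number of floating point
# operations and divide. Interpreter overhead dominates, so treat the number
# as a floor, not a benchmark.

n = 1_000_000
data = [1.0000001] * n

start = time.perf_counter()
s = 0.0
for x in data:       # roughly 2n flops: one multiply and one add per element
    s += x * 1.5
elapsed = time.perf_counter() - start

mflops = 2 * n / elapsed / 1e6
print(f"~{mflops:.0f} MFLOPS (pure-Python loop, a big underestimate)")
```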
Batteries, anyone? (Score:1)
Seems inefficient (Score:2)
Supercomputers are big. Even when idle they still require lots of power and cooling, so ideally you want your supercomputer to be 100% utilized all of the time. That's why most supercomputers are "over-subscribed" and have batch schedulers (moab/torque, PBS, LSF, etc.). Users submit jobs, and the scheduler goes about placing those jobs on the supercomputer in a way that keeps utilization as close to 100% as possible. This means that typically when you submit a job it will not run immediately.
If your cell phone job has to sit in a batch queue behind everyone else's, the "instant computation" part goes away.
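A toy simulation of the over-subscription the parent describes (job sizes and runtimes are made up; real schedulers like PBS or LSF are far more sophisticated):

```python
import heapq

# FIFO batch scheduling sketch: jobs arrive faster than the machine can run
# them, so later jobs wait even though they are small.

def fifo_schedule(jobs, nodes):
    """jobs: list of (arrival_time, runtime, nodes_needed). Returns start times."""
    free = [0.0] * nodes  # time at which each node becomes free
    heapq.heapify(free)
    starts = []
    for arrival, runtime, need in jobs:
        held = [heapq.heappop(free) for _ in range(need)]  # wait for `need` nodes
        start = max([arrival] + held)
        starts.append(start)
        for _ in range(need):
            heapq.heappush(free, start + runtime)
    return starts

jobs = [(0.0, 10.0, 2), (1.0, 5.0, 2), (2.0, 1.0, 2)]
print(fifo_schedule(jobs, 2))  # [0.0, 10.0, 15.0]: later jobs queue up
```

Even the tiny third job waits 13 time units, which is the parent's point about "instant" results from a shared machine.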
as soon as Cray has a smartphone, I'll try it (Score:2)
Oh, wait, it will probably be on (shudder) AT&T. All 11,000 pounds of freon-cooled, 3-phase phone.
Mobile Objects (Score:2)
I've been waiting over a decade for Java's features that support mobile objects to get an infrastructure that makes deploying them worthwhile. Why send the logic around the network, instead of just sending the data to where the processors are? Well, with the vast majority of computing power now distributed among so many users, and mostly idling across the year, it's worth using distributed supercomputing now. Folding@Home was a good start, but the distributed app should be generic enough that any crunching can be farmed out to it.