Graphics Power Upgrades Hardware Technology

ARM Claims PS3-Like Graphics On Upcoming Mobile GPU 217

l_bratch writes, quoting the BBC: "'British computer chip designer ARM has unveiled its latest graphics processing unit (GPU) for mobile devices. The Mali-T658 offers up to ten times the performance of its predecessor.' ARM claims that its latest GPU, which will be ready in around two years, will have graphics performance akin to the PlayStation 3's. If this has acceptable power consumption for a mobile device, could we be seeing ultra-low-power hardware in high-end PCs and consoles soon?"
  • In two years (Score:2, Insightful)

    by starmonkey ( 2486412 ) on Thursday November 10, 2011 @11:32AM (#38012690)
    In two years, PS3-like graphics will be insufficient for the desktop and console market, and we will be in the same situation.
  • Resolution! (Score:5, Insightful)

    by jonnythan ( 79727 ) on Thursday November 10, 2011 @11:33AM (#38012698)

    Sure, PS3-like graphics... except the PS3 is doing it at 1280x720 or 1920x1080. This will be pushing probably 20-40% of the pixels... and doing so in 2 years, while the PS3 hardware is 5 years old (to the day).

    So, no, I don't think that a chipset that will, in 2013, do 20% of the job that 2006 hardware does will be making its way into high-end PCs and consoles soon.
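
    For reference, here is a quick back-of-the-envelope check of that pixel claim, assuming a hypothetical WVGA (800x480) handset panel, which was a common phone resolution in 2011 (a sketch, not figures from the article):

        # Rough pixel counts behind the "20-40% of the pixels" estimate.
        # The 800x480 handset resolution is an assumption for illustration.
        ps3_720p = 1280 * 720      # 921,600 pixels
        ps3_1080p = 1920 * 1080    # 2,073,600 pixels
        handset = 800 * 480        # 384,000 pixels

        print(f"vs 720p:  {handset / ps3_720p:.0%}")    # ~42%
        print(f"vs 1080p: {handset / ps3_1080p:.0%}")   # ~19%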

  • Re:In two years (Score:5, Insightful)

    by zero.kalvin ( 1231372 ) on Thursday November 10, 2011 @11:37AM (#38012762)
    It doesn't matter if in two years we'll have better graphics; just imagine playing a game with PS3-like graphics on something that barely consumes 1 W (or whatever a mobile device should draw), and I'd bet it wouldn't cost that much either.
  • Re:In two years (Score:5, Insightful)

    by tepples ( 727027 ) <tepples.gmail@com> on Thursday November 10, 2011 @11:40AM (#38012802) Homepage Journal

    In two years, PS3-like graphics will be insufficient

    Counterexample: FarmVille.

  • Yea right (Score:5, Insightful)

    by nedlohs ( 1335013 ) on Thursday November 10, 2011 @11:42AM (#38012838)

    In 2 years' time the PS3 will be 7 years old.

    The PS2 was 7 years old in 2007. Were PS2-level graphics acceptable for "high end PCs and consoles" in 2007?

    No? Then why would PS3-level graphics be acceptable in 2013?

  • by Anonymous Coward on Thursday November 10, 2011 @11:43AM (#38012842)

    There was a story on CNN a few weeks ago that said that while PC sales are slowly increasing worldwide, the growth is very lopsided: sales are falling dramatically in the US, Canada, and Europe. The increase is coming from the developing world being able to afford computers as they fall in price.

    The culture shift from desktop computing to mobile is happening in part because mobiles are becoming powerful enough to do most of the tasks that desktops used to do. OK, you'll always get a few neckbeards who say "But the cell phone can't run AutoSuperCadShop 390322.55!" But that misses the point. That's not what 99.9% of consumers DO with their computers. They play some games, including 3D games, they check their Facebooks, they look at some news headlines, and so on. All that works fine on a device that they can fit in their pocket. For those times a keyboard is needed, a Bluetooth keyboard will do just fine. And for those times a larger screen is needed, a wireless connection to the screen will work fine.

    I don't know why people can't see this shift happening right in front of their eyes. Even the sales data bears it out now: mobile computing is on the upswing, and in the western world, PC sales are falling. It's a nice world: instead of needing to lug around a desktop or even a netbook, you'll have the capability of a (say) 2009-vintage desktop in your shirt pocket in 2014. A 2009 desktop is nothing to sneeze at, and meets the needs of 99% of the population just fine. The rest will become a more expensive niche, but will still exist in some way.

    It's a Star Trek Future.

  • Re:Tegra 5 (Score:5, Insightful)

    by abigsmurf ( 919188 ) on Thursday November 10, 2011 @11:50AM (#38012930)
    Doesn't Tegra have major heat issues that stop it from being in anything smaller than tablets?

    Both Sony and Nintendo considered using it for their new consoles but the heat and power usage apparently made them turn away from it.
  • Re:In two years (Score:5, Insightful)

    by trum4n ( 982031 ) on Thursday November 10, 2011 @11:50AM (#38012932)
    Better Counterexample: Minecraft!
  • by tepples ( 727027 ) <tepples.gmail@com> on Thursday November 10, 2011 @12:01PM (#38013088) Homepage Journal

    we went from "ARM can barely handle Nintendo emulation (single core / 500 MHz / 125 MHz GPU)" to "ARM is competing with the PS3 (4 cores, 1.5 GHz, 300+ MHz multicore GPU)". In *3* years.

    Are you comparing emulating an NES to running native games? An emulator has to contend with the entire game engine being written in bytecode, and it has to process graphics a scanline at a time so that games' raster effects (parallax scrolling, fixed position status bars, Pole Position/Street Fighter/NBA Jam floor warping, etc.) still work. A native game running on a 3D GPU doesn't need the bytecode interpreter overhead, and it can render one object at a time because it doesn't need raster effects.
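
    To make the distinction concrete, here is a minimal sketch of the two rendering models described above (not taken from any real emulator; the CPU/PPU/GPU interfaces are hypothetical):

        CYCLES_PER_SCANLINE = 114  # roughly an NTSC NES's CPU cycles per scanline (assumption)

        def emulated_frame(cpu, ppu, framebuffer):
            # Emulation: interpret the game's machine code and render one
            # scanline at a time, so mid-frame register writes (scroll splits,
            # status bars, floor warping) take effect on the correct lines.
            for line in range(240):                  # 240 visible scanlines
                cpu.run(CYCLES_PER_SCANLINE)         # interpreter overhead on every line
                framebuffer[line] = ppu.render_scanline(line)

        def native_frame(gpu, scene):
            # Native 3D title: no interpreter and no per-scanline timing;
            # just submit whole objects and let the GPU rasterize them.
            for mesh in scene.visible_meshes():
                gpu.draw(mesh)
            gpu.present()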

  • Re:In two years (Score:5, Insightful)

    by Anonymous Coward on Thursday November 10, 2011 @12:13PM (#38013218)

    Except there is NO WAY it can be done at 1 W, even at the best rate of computing improvements. Remember, they did not mention power usage in their press release; only the submitter did. While they are taking power into consideration, it seems to me this is more about scaling, where idle usage is extremely low with the cores shut down. This is great news for mobile devices that don't expect full usage most of the time (assuming the scaling is extreme enough that idle draws very little power).

    Remember, ARM has been slowly scaling up in speed while x86 has been scaling down in power usage. It wouldn't be surprising if this new GPU uses more power than ARM is traditionally known for. That said, a lot remains to be seen; press releases and actual performance can be worlds apart. How many times has a company promised "something-like" performance only to not deliver? Hopefully it's true, though.

  • Re:In two years (Score:4, Insightful)

    by somersault ( 912633 ) on Thursday November 10, 2011 @12:17PM (#38013280) Homepage Journal

    Because we're not talking about graphics cards, we're talking about single chips for use in phones, etc., where compactness and power usage are very important?

  • by bill_mcgonigle ( 4333 ) * on Thursday November 10, 2011 @12:23PM (#38013356) Homepage Journal

    mobile computing is about 5-10 years behind desktop computing

    And it always will be, unless somebody devises a way to provide 15 A of power to a mobile device, and a way to dissipate that sort of heat.

    Now, we may eventually reach a state where it just doesn't matter - everybody will have enough computing power on their phone to raytrace a 4K HD stream in real time, and they will reach a natural equilibrium where it just doesn't make sense to make faster chips for desktop computers. Or, we might see such great Internet pervasiveness that everybody just has thin clients and computes on a CPU farm. But until either of those things happens, desktops will be faster than mobile devices.
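
    To put that 15 A figure in perspective, here is a rough comparison of the power budgets involved (all numbers are assumptions chosen for illustration, not measurements):

        # Desktop vs. handset power budgets, very roughly.
        wall_outlet_watts = 15 * 120          # 15 A at 120 V mains ~= 1800 W available
        battery_watt_hours = 1.4 * 3.7        # ~1400 mAh * 3.7 V ~= 5.2 Wh (typical 2011 phone)
        hours_between_charges = 8             # assumed usage per charge
        handset_watts = battery_watt_hours / hours_between_charges   # ~0.65 W average

        print(f"{wall_outlet_watts} W vs ~{handset_watts:.2f} W "
              f"(~{wall_outlet_watts / handset_watts:.0f}x headroom)")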

  • Re:In two years (Score:5, Insightful)

    by Moryath ( 553296 ) on Thursday November 10, 2011 @04:36PM (#38016154)

    I bet a lot of this is fudging due to screen size.

    Think about it. When the PSX came out, your average homeowner's TV screen was a scant 20 inches (4:3 ratio) diagonal. When the PS2 came out, that was a "whopping" 24".

    When the PS3 came out? Yeah. 37" or larger 16:9 widescreens. A lot of them, given the initial price tag, were well above 40".

    Now play a PS2 or PSX game on that humongous screen. Looks like shit, doesn't it? Load that PSX game up instead in the Popstation version on your PSP, or in an emulator on a 13" or even 15" laptop playing with a USB controller from a few feet away. Suddenly it looks a whole hell of a lot better.

    "PS3-level graphics" can be fudged quite a bit when you're dealing in "mobile" devices of a tiny screen and not trying to push massive amounts of AA to get rid of "jaggies" on a bigger screen with bigger pixels.

  • Re:In two years (Score:5, Insightful)

    by Belial6 ( 794905 ) on Thursday November 10, 2011 @04:49PM (#38016338)
    You do realize that the desk isn't why PCs use so much power, right? You also realize that people will still use desks whether they have an x86 PC or not, just as they did before the x86 was invented, right? ARM is absolutely working towards competing with x86. In what way is trying to get people to buy an ARM computing device instead of an x86 computing device not competing?

    The ARM was just as much a desktop CPU as the x86 was. The difference is that ARM got crushed in the desktop market. At the time, the desktop market demanded computing power at any energy cost, and ARM simply couldn't keep up with Intel. With Intel's focus on the desktop, ARM proceeded to pick up Intel's scraps: all of the little markets that Intel decided were too small to worry about.

    Fast forward to the 2000s, and desktop speeds start outpacing most users' needs. The last 5-6 years of desktop speed improvements have basically been a CPU bubble: CPU speeds have increased faster than most people have any use for. We are currently seeing a state where people are realizing that they are vastly overpaying in energy for the CPU processing power they actually use. The bubble is bursting. ARM is way behind in ramping up the processing power of its CPUs, and also way behind in ramping up their power usage. Like many other bubbles, suddenly people realize that what they were chasing isn't worth it, and they would rather have what was available 10 years ago.

    In today's post-CPU-bubble environment, we are seeing a situation where Intel's CPUs are not low-power enough to cover the entire market, and ARM's CPUs are not fast enough. They are both racing to hit the sweet spot that gives them market dominance, but don't be fooled into thinking that they are not racing to the same goal. They are just at opposite ends of the field.

    The question is who will reach the goal first. On the Intel side, you have dominance in the traditional computing environment as well as market mindshare. Most people know who Intel is and that their PC uses an Intel processor, but most could not tell you what kind of processor their ARM device uses. On ARM's side, you have a new market that did not care about Intel compatibility, and settled on ARM.

    Intel will continue to push downward to smaller devices, while ARM will continue to push upward with larger ones.
  • Re:In two years (Score:3, Insightful)

    by Dutch Gun ( 899105 ) on Thursday November 10, 2011 @06:24PM (#38017212)

    I kind of hope for more stagnation in the graphics quality market. Let's just hang out where we are for a while and hopefully the game makers will start competing on interesting story lines, game mechanics, etc. rather than ripples in water in puddles.

    Improved CPU and GPU capabilities and better gameplay are not mutually exclusive. There are physical limitations to, for instance, rendering a huge number of characters on the screen at once. Or the memory is simply not there to hold all the interesting animations needed to support that interesting storyline.

    Look at it this way... better CG technology hasn't necessarily made movies better, but it really expanded the range of what really good filmmakers could do with realistic budgets. Improved game technology is similar. Right now a huge amount of work has to be done optimizing the game engine and assets to reach acceptable quality levels. Once we hit a point where CPUs and GPUs no longer have to struggle to render, say, a nice open world, more gameplay options will naturally emerge as well, as developers won't have to spend quite so much time fighting against the constraints of the hardware.

    Sure, we'll take advantage of the extra power to render extra shiny stuff (or, frankly, just MORE of what we can already do today - draw distance is still a big limitation for complex scenes). But better technology can also allow more gameplay options as well.

    Also, if you haven't seen all the games that have focused heavily on story and/or interesting gameplay mechanics, then I'm not sure what to tell you except that you need to pay more attention. Some examples: Katamari Damacy, Okami, the Half-Life series, the Portal series, the Uncharted series, the Mass Effect series, Shadow of the Colossus, Ico, the Baldur's Gate series, Deus Ex (first or most recent), BioShock, SWTOR, Guild Wars 1 & 2, Skyrim, SW: Knights of the Old Republic, Braid, Limbo, N+... seriously... we've never had it so good as gamers, and I've been a gamer since damn near the beginning of videogames. People who say otherwise are looking through rose-colored glasses.
