Graphics Power Upgrades Hardware Technology

ARM Claims PS3-Like Graphics On Upcoming Mobile GPU

l_bratch writes, quoting the BBC: "'British computer chip designer ARM has unveiled its latest graphics processing unit (GPU) for mobile devices. The Mali-T658 offers up to ten times the performance of its predecessor.' ARM claims that its latest GPU, which will be ready in around two years, will have graphics performance akin to the PlayStation 3's. If this has acceptable power consumption for a mobile device, could we be seeing ultra-low power hardware in high-end PCs and consoles soon?"
This discussion has been archived. No new comments can be posted.

  • In two years (Score:2, Insightful)

    by starmonkey ( 2486412 )
    In two years, PS3-like graphics will be insufficient for the desktop and console market, and we will be in the same situation.
    • Re:In two years (Score:5, Insightful)

      by zero.kalvin ( 1231372 ) on Thursday November 10, 2011 @11:37AM (#38012762)
      It doesn't matter if in two years we'll have better graphics; just imagine playing PS3-like graphics on something that barely consumes 1 W (or whatever a mobile device's budget should be), and I'd bet it wouldn't cost much either.
      • Re:In two years (Score:5, Insightful)

        by Anonymous Coward on Thursday November 10, 2011 @12:13PM (#38013218)

        Except there is NO WAY it can be done at 1 W, even at the best rate of computing improvement. Remember, they did not mention power usage in their press release; only the submitter did. While they are taking power into consideration, it seems to be more about scaling, where idle usage is extremely low with the cores shut down. This is great news for mobile devices that don't expect full usage most of the time (assuming the scaling is aggressive enough that idle power draw is extremely low).

        Remember, ARM has been slowly scaling up in speed while x86 scales down in power usage. It wouldn't be surprising if this new GPU uses more power than is traditional for ARM. That said, a lot remains to be seen; press releases and actual performance can be worlds apart. How many times has a company promised "something-like" performance only to not deliver? Hopefully it's true, though.

        • by Guspaz ( 556486 )

          Consider that the PowerVR SGX543MP products support up to 16 cores, but nobody has shipped one with more than 2 (Sony's PS Vita will be the first with 4). I believe the Mali-400 in the SGS2 is a 4-core part.

          The PS3's 2006 launch to 2013 (when the ARM GPU is supposed to come out) is seven years, so we should see ~4 doublings, or 16x the performance we saw when the PS3 came out.

          If we make the out-of-my-ass assumption that a 4-core Mali-400 uses 2 W of power at full load, and a 16-core T-658 will
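          A minimal sketch of that back-of-the-envelope arithmetic in Python; the 21-month doubling period is just what ~4 doublings in seven years implies, and the wattages are the same guesses as above, not measured figures:

            years = 2013 - 2006                    # PS3 launch to expected Mali-T658 arrival
            doubling_period = 1.75                 # years per performance doubling (assumed)
            speedup = 2 ** (years / doubling_period)
            print(f"~{speedup:.0f}x 2006-era GPU performance")   # ~16x

            assumed_mali400_watts = 2.0            # full-load guess from the comment above
            # If per-core power stayed flat, a 16-core part would draw 4x a 4-core one:
            print(f"Naive 16-core draw: {assumed_mali400_watts * 4:.0f} W")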

          • Re:In two years (Score:5, Insightful)

            by Moryath ( 553296 ) on Thursday November 10, 2011 @04:36PM (#38016154)

            I bet a lot of this is fudging due to screen size.

            Think about it. When the PSX came out, your average homeowner's TV screen was a scant 20 inches (4:3) diagonal. When the PS2 came out, that was a "whopping" 24".

            When the PS3 came out? Yeah. 37" or larger 16:9 widescreens, and given the initial price tag, a lot of them well above 40".

            Now play a PS2 or PSX game on that humongous screen. Looks like shit, doesn't it? Load that PSX game up instead in a Popstation conversion on your PSP, or in an emulator on a 13" or even 15" laptop, playing with a USB controller from a few feet away. Suddenly it looks a whole hell of a lot better.

            "PS3-level graphics" can be fudged quite a bit when you're dealing in "mobile" devices of a tiny screen and not trying to push massive amounts of AA to get rid of "jaggies" on a bigger screen with bigger pixels.

      • by ArcherB ( 796902 )

        It doesn't matter if in two years we'll have better graphics; just imagine playing PS3-like graphics on something that barely consumes 1 W (or whatever a mobile device's budget should be), and I'd bet it wouldn't cost much either.

        I still believe that PS3 graphics will be severely dated in two years, and are probably dated now. However, if this chip is truly low-power and cool-running, why not put 10+ of them on a single card?

        • Re:In two years (Score:4, Insightful)

          by somersault ( 912633 ) on Thursday November 10, 2011 @12:17PM (#38013280) Homepage Journal

          Because we're not talking about graphics cards; we're talking about single chips for use in phones, etc., where compactness and power usage are very important?

          • I don't know much about ARM GPUs, but if these turn out to be significantly lower-powered than their counterparts, couldn't multi-GPU ARM boards be put to great use for GPGPU applications?

            • I don't really think GPU hardware is limited by legacy architecture design to the extent that CPUs are, which means current-generation desktop GPUs should already be quite efficient (despite there being graphics cards that require secondary power connectors). Even x86 is being made more efficient all the time, it seems.

        • I think you misunderstand ARM's market. ARM is not in the desktop market, or even in the laptop market except at the low end. They do, however, completely own the embedded market, right up to the top end of the smartphone and tablet markets. This kind of core will end up in smartphones and tablets. You will be able to run PS2-era graphics on something that fits in your pocket and works from batteries (and probably has Thunderbolt or HDMI output for connecting it to a big screen). It isn't competing wit
          • Re:In two years (Score:5, Insightful)

            by Belial6 ( 794905 ) on Thursday November 10, 2011 @04:49PM (#38016338)
            You do realize that the desk isn't why PCs use so much power, right? You also realize that people will still use desks whether they have an x86 PC or not, just as they did before the x86 was invented, right? ARM is absolutely working towards competing with x86. In what way is trying to get people to buy an ARM computing device instead of an x86 computing device not competing?

            The ARM was just as much a desktop CPU as the x86 was. The difference is that ARM got crushed in the desktop market. At the time, the desktop market demanded computing power at any energy cost, and ARM simply couldn't keep up with Intel. With Intel focused on the desktop, ARM proceeded to pick up Intel's scraps: all the little markets that Intel decided were too small to worry about.

            Fast forward to the 2000s, and desktop speeds started outpacing most users' needs. The last 5-6 years of desktop speed improvements have basically been a CPU bubble: CPU speeds have increased faster than most people have any use for. We are currently seeing people realize that they are vastly overpaying in energy for the CPU processing power they actually use. The bubble is bursting. ARM is way behind in ramping up the processing power of its CPUs, and just as far behind in ramping up their power consumption. Like many other bubbles, suddenly people realize that what they were chasing isn't worth it, and they would rather have what was available 10 years ago.

            In today's post-CPU-bubble environment, we are seeing a situation where Intel's CPUs are not low-power enough to cover the entire market, and ARM's CPUs are not fast enough. They are both racing to hit the sweet spot that gives them market dominance, but don't be fooled into thinking that they are not racing to the same goal. They are just at opposite ends of the field.

            The question is who will reach the goal first. On the Intel side, you have dominance in the traditional computing environment as well as market mindshare. Most people know who Intel is and that their PC uses an Intel processor, but most could not tell you what kind of processor their ARM device uses. On ARM's side, you have a new market that did not care about Intel compatibility, and settled on ARM.

            Intel will continue to push downward to smaller devices, while ARM will continue to push upward with larger ones.
            • by Nutria ( 679911 )

              suddenly people realize that what they were chasing isn't worth it, and they would rather have what was available 10 years ago.

              What an absolute steaming crock of shit. Who here really wants a 32-bit, 1.3 GHz Athlon XP back on their desktop?

    • Re:In two years (Score:5, Insightful)

      by tepples ( 727027 ) <tepplesNO@SPAMgmail.com> on Thursday November 10, 2011 @11:40AM (#38012802) Homepage Journal

      In two years, PS3-like graphics will be insufficient

      Counterexample: FarmVille.

      • Re:In two years (Score:5, Insightful)

        by trum4n ( 982031 ) on Thursday November 10, 2011 @11:50AM (#38012932)
        Better counterexample: Minecraft!
        • 500,000,000 downloads of Angry Birds?

        • Re: (Score:3, Informative)

          by ifrag ( 984323 )

          Although in Minecraft you can get some high-res textures that make the game look a little more modern, and there are also modded shaders that can do some neat stuff as well, even bump mapping.

          I was playing with the default 16x16 for a long time, but I finally got a little sick of it and made the switch up to 32x32.

          • I just went to 256x256 for shits and giggles. It looks pretty damn amazing with all the nice stuff in the texture pack I found. I also installed a mod that enables the HD textures and a bunch of other performance options. :)
        • Even better: Cave Story!

          Much better (and semi-joking): NetHack ;)

      • by kesuki ( 321456 )

        One thing I've noticed since returning to Slashdot is that all the kids are claiming supercomputer specs are running slow on their tablets, etc. Seriously, there was a time when 3 MHz was cutting edge. What happened to break all the 3 MHz codebase? Gone because some idiot flagged it as obsolete?

        The only reason hardware with low specs isn't running right is some form of virus like a rabbit program, or some form of power management that is crippling the hardware's specs. I know Slashdot has always been a honey

        • by tepples ( 727027 )

          What happened to break all the 3 MHz codebase?

          Changes in CPU architecture, for one thing: Z80 bytecode won't run natively on an Athlon. Also, an increase in standards for graphic design and internationalization. It's a lot slower to render bidirectional or ideographic text with stacked diacritics in dozens of writing systems using antialiased scalable fonts with color and shadow than to render a monospace font from a 7- or 8-bit character set in one size and in black and white.

      • Or: the Wii.

        The graphics processing on that is feeble compared to the PS3 / XB360, and yet it outsold both of them for a while. It still sells relatively well now (albeit at 5-year-old-hardware prices).

    • Well, the point is that the PS3 is old in computer terms. While this is an advancement for mobile computing, it still fits the pattern: in terms of performance, mobile computing is about 5-10 years behind desktop computing.

      • by bill_mcgonigle ( 4333 ) * on Thursday November 10, 2011 @12:23PM (#38013356) Homepage Journal

        mobile computing is about 5-10 years behind desktop computing

        And it always will be, unless somebody devises a way to provide 15 A of power to a mobile device, and a way to dissipate that sort of heat.

        Now, we may eventually reach a state where it just doesn't matter - everybody will have enough computing power on their phone to raytrace a 4K HD stream in real time, and we will reach a natural equilibrium where it just doesn't make sense to make faster chips for desktop computers. Or we might see such great Internet pervasiveness that everybody just has thin clients and computes on a CPU farm. But until either of those things happens, desktops will be faster than mobile devices.

        • I'm thinking about advancements in PC displays. I remember 110x100 graphics, then 320x200, then 640x480, 800x600, then 1024x768 (on 14" screens). After that they just made displays larger to handle more pixels; the iPhone 4 was one of the first devices I've seen that offered a higher DPI.
          The same thing happened with color depth: monochrome, CGA (4 colors), EGA (16 colors), VGA (256 colors), and SVGA, which now offers the common 16/24/32-bit color.

          Once we reach our perceptual limit, we don't need to improve an area.

    • Re:In two years (Score:5, Interesting)

      by poetmatt ( 793785 ) on Thursday November 10, 2011 @11:56AM (#38013010) Journal

      Umm, look at the Tegra 3. ARM graphics are catching up to consoles quite easily (consoles were always behind). Remember, it's been 3 years in which we went from "ARM can barely handle Nintendo emulation (single core, 500 MHz, 125 MHz GPU)" to "ARM is competing with the PS3 (4 cores, 1.5 GHz, 300+ MHz multi-core GPU)". In *3* years. All with devices that are more power-efficient than anything Intel can offer. So what do you see for the next 12 months, let alone 3-4 years? Even if the increases slow down, they're basically going to make x86 processors irrelevant.

      • we went from "ARM can barely handle Nintendo emulation (single core, 500 MHz, 125 MHz GPU)" to "ARM is competing with the PS3 (4 cores, 1.5 GHz, 300+ MHz multi-core GPU)". In *3* years.

        Are you comparing emulating an NES to running native games? An emulator has to contend with the entire game engine being written in bytecode, and it has to process graphics a scanline at a time so that games' raster effects (parallax scrolling, fixed-position status bars, Pole Position/Street Fighter/NBA Jam floor warping, etc.) still work. A native game running on a 3D GPU doesn't need the bytecode interpreter overhead, and it can render one object at a time because it doesn't need raster effects.
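        To make the structural difference concrete, here's a hypothetical sketch in Python (stub classes, not code from any real emulator) of why per-scanline interleaving costs more than batching whole objects:

          SCANLINES = 240               # visible scanlines per NES frame
          CYCLES_PER_LINE = 113         # ~113.67 NES CPU cycles per scanline

          class StubCPU:
              def run_cycles(self, n):
                  pass                  # would interpret n cycles of 6502 machine code

          class StubPPU:
              def render_scanline(self, line):
                  pass                  # would draw one line using the registers as-is

          def emulated_frame(cpu, ppu):
              # CPU and video must be interleaved one scanline at a time so that
              # mid-frame register writes (split status bars, parallax) land correctly.
              for line in range(SCANLINES):
                  cpu.run_cycles(CYCLES_PER_LINE)
                  ppu.render_scanline(line)

          def native_frame(draw, objects):
              # A native 3D game has no raster effects to honor, so it can submit
              # whole objects in one pass with no per-line interleaving.
              for obj in objects:
                  draw(obj)

          emulated_frame(StubCPU(), StubPPU())
          native_frame(lambda obj: None, [])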

        • by Hatta ( 162192 )

          Compare emulating an NES on a handheld to emulating it on a PC. FCEU runs well on a 200 MHz Pentium. Shouldn't a 500 MHz ARM do better?

          • by tepples ( 727027 ) <tepplesNO@SPAMgmail.com> on Thursday November 10, 2011 @01:48PM (#38014226) Homepage Journal

            FCEU runs well on a 200 MHz Pentium. Shouldn't a 500 MHz ARM do better?

            Not necessarily. Compatibility demands have increased since the Nesticle days and even since the FCE Ultra 0.98 days, and users are less willing to put up with known emulation glitches in specific games than they used to be. The "new PPU" engine in FCEUX is slower, but its behavior is more accurate to that of the NES than the old PPU, and some games demand this accuracy. For example, the Final Fantasy orb effect, text boxes in Marble Madness, and certain things in Sid Meier's Pirates are all done with cycle-timed mid-scanline writes to the PPU's I/O ports. The English version of Castlevania 3 and later Koei games use an IC called "MMC5" that's almost as complex as the coprocessors used in some Super NES games.

            • by Hatta ( 162192 )

              Compatibility demands have increased since the Nesticle days and even since the FCE Ultra 0.98 days, and users are less willing to put up with known emulation glitches in specific games than they used to be.

              Maybe I'm misremembering, but NesDS seems a lot less compatible than FCEU did back in the day.

              • by tepples ( 727027 )
                For one thing, I wasn't aware nesDS was still being maintained. For another, the DS has a 67 MHz CPU and therefore can't run the whole PPU in software, so it emulates NES video using DS's tiled video mode. This doesn't work for mid-scanline effects.
          • by kesuki ( 321456 )

            Agreed. ARM shouldn't need five 1 GHz cores plus 12 GPU cores. With that much power they could be cracking encryption.

    • by geekoid ( 135745 )

      It's for the mobile market. So: MW3 on your phone.

    • In two years, PS3-like graphics will be insufficient for the desktop and console market, and we will be in the same situation.

      Never underestimate the low end. Imagine a dongle with an HDMI plug on one end that just plugs into a TV set, but inside has a chip that can do PS3-level graphics, WiFi for downloading games, Bluetooth for controllers, and enough flash to cache them.

      Most HDMI ports can provide 150 mA at 5 V, which is marginal for this sort of application, but it's within sight in the next several years.
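      For concreteness, the arithmetic in Python (150 mA is the figure above; the 2 W GPU draw is an assumed, illustrative number for a PS3-class mobile part):

        budget_w = 5.0 * 0.150        # 150 mA at 5 V from the HDMI port
        gpu_w = 2.0                   # assumed full-load draw of the dongle's GPU
        print(f"HDMI budget: {budget_w:.2f} W")                   # 0.75 W
        print(f"Shortfall at full load: {gpu_w - budget_w:.2f} W")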

      • by kesuki ( 321456 )

        This reminds me of Dragon Ball, where they eventually become so strong that they would blow up the Earth trying to fight there.
        The numbers are being faked by someone, and nobody here seems to care that somehow we went from a 3 MHz CPU with a 3 MHz GPU all the way to 64-bit, 6-core, 2.2 GHz CPUs and 512-bit, 800 MHz GPUs with 1408 stream processing units, all in what, 20-25 years?

        It's beyond absurd, and frankly I don't like it. This is why I have a 40 W (60 W during 3D gaming, 3.5 A) computer and a 70 W TV set. If they cras

    • In two years, PS3-like graphics will be insufficient for the desktop and console market, and we will be in the same situation.

      PS3 graphics are a bit dated already. Consoles (and console ports) are seriously limiting the graphics in current games. It's a pity, really. It's good that cell phones will have circa-2006 GPU capabilities soon, though.

  • Resolution! (Score:5, Insightful)

    by jonnythan ( 79727 ) on Thursday November 10, 2011 @11:33AM (#38012698)

    Sure, PS3-like graphics... except the PS3 is doing it at 1280x720 or 1920x1080. This will be pushing probably 20-40% of the pixels... and doing so in 2 years, while the PS3 hardware is already 5 years old (to the day).

    So, no, I don't think that a chipset that will, in 2013, do 20% of the job that 2006 hardware does will be making its way into high-end PCs and consoles soon.
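    The pixel arithmetic behind that 20-40% estimate, assuming a typical 800x480 (WVGA) handset panel of the day:

      phone = 800 * 480             # assumed WVGA handset resolution
      for name, px in (("720p", 1280 * 720), ("1080p", 1920 * 1080)):
          print(f"phone vs {name}: {phone / px:.0%}")
      # phone vs 720p: 42%
      # phone vs 1080p: 19%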

    • I think you're looking at the wrong side of the street. This isn't about the top-end computing power; it's about the efficiencies on the bottom end. So, now you can start churning out laptops and cheap PCs with pedestrian graphics cards that use low power and provide significant performance. No need to take the truck nuts off your Dell, sir.

      • I was addressing the question at the end:

        "could we be seeing ultra-low power hardware in high-end PCs and consoles soon?"

        • could we be seeing ultra-low power hardware in high-end PCs and consoles soon?

          I thought that was the entire point of the Wii. Because the "sensor bar" (IR position reference emitter banks) needed to sit by the TV, the console's case needed to be small. This meant Nintendo couldn't use a CPU and GPU with a high TDP, so it stuck with what is essentially a 50% overclocked GameCube. I guess Nintendo is trying the same tactic with the Wii U: take a roughly Xbox 360-class CPU and GPU and take advantage of six years of process shrinks to get the TDP down so it'll fit in the same size case.

        • Comment removed based on user account deletion
        • I was addressing the question at the end:

          "could we be seeing ultra-low power hardware in high-end PCs and consoles soon?"

          Not soon, but inevitably. The equation is: better power efficiency equates to more stream cores. The number of stream cores tends to increase to compensate, so discrete graphics card power consumption stays about the same, near the maximum of what typical cooling systems can accommodate. This somewhat obscures the ongoing trend toward lower-power designs. However, power consumption per stream unit governs the maximum practical throughput (i.e., heat dissipation) of high-end discrete cards. Therefore it is only a
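          A toy version of that equation, with all numbers invented for illustration: hold card power at the cooling limit, and efficiency gains show up as core count rather than lower total draw:

            cooler_limit_w = 250.0        # roughly what high-end air coolers can handle
            for w_per_core in (0.18, 0.09):        # before/after an assumed die shrink
                cores = int(cooler_limit_w / w_per_core)
                print(f"{w_per_core} W/core -> ~{cores} stream cores")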

    • Re: (Score:3, Informative)

      by Anonymous Coward

      Sure, PS3-like graphics... except the PS3 is doing it at 1280x720 or 1920x1080. This will be pushing probably 20-40% of the pixels... and doing so in 2 years, while the PS3 hardware is already 5 years old (to the day).

      So, no, I don't think that a chipset that will, in 2013, do 20% of the job that 2006 hardware does will be making its way into high-end PCs and consoles soon.

      Except most phones released today have 1080p output via HDMI. So now what?

    • There will still be some hard-core, graphics-intensive games that will require whatever the cutting edge in graphics is at that point.

      However, as old as the PS3 may be, the fact is that for most of us non-hard-core gamers, PS3-quality graphics are more than enough (and will still be in another 5 years' time) for the vast majority of games we'd want to play.

      We're beginning to hit a point of diminishing returns on graphics anyway; you're always going to be limited by what the eye can process, and the ability of the

      • Graphics card companies always try to outperform their competitors. You could compete on price alone, but no one is going to buy a new card that's exactly as powerful as the one they already have, only cheaper. For that reason I expect the current trend to continue.
        • Yes, they'll continue to get better and better, and the hardcore gamers will continue to pay over $200 for a graphics card to get the best of the best.

          However, I think many of us are going to see less and less gain for each extra dollar spent.

          You can get a 1 GB DDR3 video card from Newegg or TigerDirect for $15 after rebate these days.

          Sure, that's not high-tech anymore, but it will run most of the games that non-hardcore gamers are likely to encounter fairly well.

          If I were into first person shooters

    • In 2 years a phone with a 1080p display is a likely reality. We already have phones/tablets running at or near 1280x720, which is nearly half the 1080p pixel count. But to say that it would be acceptable on the high-end PC side is a stretch; in 2 years we will probably have desktop expectations beyond 1080p. The entry-level to mid-market segment could see a benefit, though; that market has been underserved by horrible attempts at "integrated" graphics for years. It will be interesting to see if this GPU compares to the beef

    • Samsung Galaxy Nexus has a 1280x720 screen. And most Android 4.0 devices coming out in the next 12 months will include 1280x720 screens.

      And pretty much every Android device released this year includes a mini-HDMI port for connecting to 720p and 1080p screens.

      IOW, current and future Android phones can already do what you think they can't.

    • The high-end smartphones (Galaxy Nexus) are already at 1280x720. Even if they don't improve at all, in the next two years ARM-based netbooks are sure to offer full 1080p.

  • No (Score:4, Informative)

    by Baloroth ( 2370816 ) on Thursday November 10, 2011 @11:37AM (#38012758)
    The PS3 is 5 years old and based on even older graphics tech. Beating that on mobile is cool, but not surprising. The PS3 was never impressive, graphically, to PC users, who had better-than-HD resolutions for years. Some console games are still limited to 720p. Oh, and people had 3D on the PC like 8 years ago (or more). It sucked then, and it sucks now.
    • Re:No (Score:5, Informative)

      by Nemyst ( 1383049 ) on Thursday November 10, 2011 @12:05PM (#38013128) Homepage

      Some? Make that most. You can count on two hands the 1080p, 60 fps games on the 360 and PS3 combined, and most of those are 2D games that don't need any sort of graphical power to run.

    • by AdamJS ( 2466928 )

      Having the equivalent of a 7600GT in a super-low-power mobile form factor would be great, especially considering the actual demands (resolution/AA) would be lower anyway.

  • Tegra 5 (Score:2, Troll)

    by backslashdot ( 95548 )

    nVidia is committed to releasing a new Tegra chip every year. The Tegra 3, which is already out, is 5x faster than the Tegra 2 (which beats the Mali-400, itself at 1/10th the speed of the GPU ARM just announced). So basically, by the time this ARM GPU is released, Tegra 5 will be out, and going by the roadmap of how fast Tegra 5 will be, it will run at least 5x faster than ARM's chip.

    I hope ARM prices this dirt cheap, so that sub-$200 (off-contract) phones can have it.

    • by Andy Dodd ( 701 )

      The Tegra 2's GPU is NOT that hot.

      Hell, it can't even play H.264 Main/High profile video at 720p. The Mali-400 has no problem with this.

      (I own a Tegra 2 device and an Exynos device with a Mali-400; in almost any workload, the Exynos utterly dominates the Tegra 2, despite its CPU being clocked only 20% higher.)

      • Hell, it can't even play H.264 Main/High profile video at 720p.

        Both my Transformer and my Xoom have been able to play H.264 Main/High profile since Android 3.1 came out. The original problem was a software problem, not hardware. Link [xda-developers.com].

    • Re:Tegra 5 (Score:5, Insightful)

      by abigsmurf ( 919188 ) on Thursday November 10, 2011 @11:50AM (#38012930)
      Doesn't Tegra have major heat issues that stop it from being in anything smaller than tablets?

      Both Sony and Nintendo considered using it for their new consoles, but the heat and power usage apparently made them turn away from it.
      • by tepples ( 727027 ) <tepplesNO@SPAMgmail.com> on Thursday November 10, 2011 @12:15PM (#38013268) Homepage Journal

        Both Sony and Nintendo considered using it for their new consoles, but the heat and power usage apparently made them turn away from it.

        And Nintendo ended up using something just as hot and power-hungry for the 3DS. As I understand it, the reason Nintendo ditched Tegra for the 3DS had everything to do with the fact that Tegra wouldn't work with an ARM9 core (ARMv5), and Nintendo needed something cycle-accurate to the ARM946E in order to play DS and DSi games without glitches.

    • How many companies use ARM's GPUs? nVidia uses its own GeForce. Qualcomm uses Adreno. TI and Apple use PowerVR. Samsung uses PowerVR and ARM, but as far as I know they are the only ones that use ARM's.
  • Yea right (Score:5, Insightful)

    by nedlohs ( 1335013 ) on Thursday November 10, 2011 @11:42AM (#38012838)

    In 2 years' time, the PS3 will be 7 years old.

    The PS2 was 7 years old in 2007. Were PS2 level graphics acceptable for "high end PCs and consoles" in 2007?

    No? Then why would PS3 level be acceptable in 2013?

    • Re:Yea right (Score:5, Interesting)

      by Jeng ( 926980 ) on Thursday November 10, 2011 @11:50AM (#38012934)

      Because we are getting to the point in technology where we humans won't be able to perceive the difference in graphics.

      You can only make something so lifelike; after that you might as well aim at efficiency.

      • Re:Yea right (Score:5, Informative)

        by 0123456 ( 636235 ) on Thursday November 10, 2011 @11:54AM (#38012990)

        Because we are getting to the point in technology where we humans won't be able to perceive the difference in graphics.

        Hollywood is getting close, but they have huge render farms, terabytes of source data and can spend hours rendering a single frame. GPUs are still a long way from producing photo-realistic output.

      • by Pulzar ( 81031 )

        Because we are getting to the point in technology where we humans won't be able to perceive the difference in graphics. You can only make something so lifelike; after that you might as well aim at efficiency.

        Is there a single game out there that's so lifelike that you can't perceive the difference between it and a real video?

        There's plenty more room for improvement; we're not getting anywhere close to that point.

      • by Twinbee ( 767046 )

        A few words for you: Global illumination, path-tracing, trillions of particles, atom worlds, AI.

        See items 5, 3, and 2 on this page:
        http://www.skytopia.com/project/cpu/cpu3.html [skytopia.com]

        Also remember that "lifelike" isn't necessarily an ideal, and that there are things we can see which far exceed the mundane visuals you can get from the relatively dull world we inhabit.

    • by LWATCDR ( 28044 )

      "High end" was a dumb thing to add. PCs in general, yes. If it can pump out 1080p, it will be good enough for 99.7% of current PC users. Are people going to run CAD or high-end video games on it? Probably not.
      Gamers just don't seem to get how small a percentage of PC users they are. PCs will probably be stuck at 1080p for the majority of monitors for a good long time, since TVs will keep the cost of the panels low for a good while.

      • Sure, but "consoles" was also there making "high end" PCs the correct subset to use. Mind you my PC plays games just fine and isn't what I would call high end...

        • by LWATCDR ( 28044 )

          Not really. The Wii was not state of the art when it came out, and it did very well. I don't hear people screaming for better graphics than the PS3 or the 360. Combine that with the rise of casual games, and yes, it could run a console well enough for many users. The high-end market could shrink, and frankly it is shrinking. You can get good video cards, and I do mean good cards, for around $120 now that will run games very well on the average monitor. You only need the high-end cards for 27" high-resolution monitors like t

  • by Anonymous Coward on Thursday November 10, 2011 @11:43AM (#38012842)

    There was a story on CNN a few weeks ago that said that while PC sales are slowly increasing worldwide, the growth is very lopsided: sales are falling dramatically in the US, Canada, and Europe. The increase is coming from the developing world, which can afford computers as they fall in price.

    The culture shift from desktop computing to mobile is happening in part because mobiles are becoming powerful enough to do most of the tasks that desktops used to do. OK, you'll always get a few neckbeards to say "But the cell phone can't run AutoSuperCadShop 390322.55!" But that misses the point. That's not what 99.9% of consumers DO with their computers. They play some games, including 3D games, they check their Facebook, they look at some news headlines, and so on. All of that works fine on a device that fits in their pocket. For those times a keyboard is needed, a Bluetooth keyboard will do just fine. And for those times a larger screen is needed, a wireless connection to the screen will work fine.

    I don't know why people can't see this shift happening right in front of their eyes. Even the sales data bears it out now: mobile computing is on the upswing, and in the Western world PC sales are falling. It's a nice world: instead of needing to lug around a desktop or even a netbook, you'll have the capability of a (say) 2009-vintage desktop in your shirt pocket in 2014. A 2009 desktop is nothing to sneeze at, and meets the needs of 99% of the population just fine. The rest will become a more expensive niche, but will still exist in some way.

    It's a Star Trek Future.

    • by 0123456 ( 636235 )

      The culture shift from desktop computing to mobile is happening in part because mobiles are becoming powerful enough to do most of the tasks that desktops used to do.

      No, they're not. They're becoming powerful enough to check your email and play Farmville, which is all that many people used to do with their PCs; they're not much good for actual productive work.

      Meanwhile, PC gaming has stagnated due to Microsoft concentrating on pushing console games, so there's little reason for the average home user to upgrade. Word won't let you write stuff ten times faster just because you switched from a Pentium 4 to an i7, and when games are limited by being designed for an Xbox and

      • your super-fast GPU will be sitting idle much of the time waiting for something to do.

        One word: Crysis.

      • I think his point, which you missed, is that checking email and playing FarmVille is what the majority of consumers do. Most of them are not playing leet games or rendering animation. In businesses, they might write in Word or crunch a few numbers in Excel. It doesn't take a quad-core Core i7 to do that. The stagnation comes from the fact that a desktop made 5 years ago will handle the majority of their tasks, and mobile computing is approaching the point where it handles a good majority. Also mobile com
        • by kesuki ( 321456 )

          YouTube does movies now, some free, some with a rental price; pay-per-view has hit the Internet... well, actually Hulu Plus did that... but you get the idea.

      • by kesuki ( 321456 )

        http://www.newegg.com/Product/Product.aspx?Item=N82E16814130659 [newegg.com]
        Can you find 1024 stream processors for a console yet? In SLI mode?
        And yet, in a way, I long for the simple, fun games I used to play on whatever console was popular at the time.
        No, I don't like "social gaming"; it's too whiny and spammy, and it seems to exist merely as a reason to go to Facebook.

  • When we have handhelds as powerful as the PS3 (the Vita is getting there), we'll have much more powerful PCs and a new generation of consoles.

    • by Locutus ( 9039 )
      The story isn't about handhelds matching desktops; it's about handhelds getting some very powerful graphics. Besides, the comparison was with consoles, not desktops. Just because consoles in a few years might be driving holographic displays doesn't mean handhelds doing pretty nice 3D graphics on battery power isn't nice too.

      LoB
      • by kesuki ( 321456 )

        Holographic displays tanked when Sega tried them in arcades; they did worse than LaserDisc games. It was a load of special glass and a small playing field. I only ever saw one once, but I know they were invented... oh wait, it was a parabolic screen, http://www.definiteanswers.com/q/What-is-SEGA-s-HOLOGRAPHIC-game-4c120518a7a5b [definiteanswers.com], but as a kid it seemed like a holographic game!

        The thing is, I don't like games like I used to, especially since some hardware is really flaky, like cheap Chinese knockoffs. That and I h

  • Five years ago tomorrow, the PS3 made its debut. Did you think that in the meantime everyone just sat back and basked in the glory of its infinite capabilities? Two years from now (if that pans out) will be 7 years since the commercialization of the Cell chip, so seeing a miniature version that uses dramatically less power is pretty much par for the course. Desktop chips with similar (or more specialized) capabilities are already available in many products. Remember, the first PS3 drew an amazing 200

    • by Locutus ( 9039 )
      Good points, but we are also talking about single-digit power consumption here. I agree, die shrinks and advances in design give lots of power savings. Still, having PS3-like graphics on a handheld will be nice.

      LoB
  • I think this will make a huge difference in mobile gaming because of screen size. Assuming this thing outputs 720p like the Galaxy Nexus, it will be a big deal.

    While PS3 graphics are old and crappy compared to what a modern PC can do, don't forget about screen size. Seeing 720p on a 40-inch screen is a lot different from seeing 720p on a 5-inch screen. The best example of this is fonts: ones that look fine at 5 inches will look like crap expanded to 40 inches. Artifacts and jaggedness on
