
ARM Claims PS3-Like Graphics On Upcoming Mobile GPU 217

Posted by timothy
from the vapor-tastes-like-raspberry-foam dept.
l_bratch writes, quoting from the BBC: "'British computer chip designer ARM has unveiled its latest graphics processing unit (GPU) for mobile devices. The Mali-T658 offers up to ten times the performance of its predecessor.' ARM claims that its latest GPU, which will be ready in around two years, will have graphics performance akin to the PlayStation 3. If this has acceptable power consumption for a mobile device, could we be seeing ultra-low power hardware in high-end PCs and consoles soon?"
This discussion has been archived. No new comments can be posted.


  • In two years (Score:2, Insightful)

    by starmonkey (2486412) on Thursday November 10, 2011 @11:32AM (#38012690)
    In two years, PS3-like graphics will be insufficient for the desktop and console market, and we will be in the same situation.
    • Re:In two years (Score:5, Insightful)

      by zero.kalvin (1231372) on Thursday November 10, 2011 @11:37AM (#38012762)
It doesn't matter if in two years we'd have better graphics; just imagine playing PS3-like graphics on something that barely consumes 1W (or however much a mobile device should draw). I'd bet it wouldn't cost that much either.
      • Re:In two years (Score:5, Insightful)

        by Anonymous Coward on Thursday November 10, 2011 @12:13PM (#38013218)

Except there is NO WAY it can be done at 1W, even at the best rate of computing improvements. Remember, they did not mention power usage in their press release; only the submitter did. While they are taking power into consideration, it seems to be more about scaling, where idle usage is extremely low with the cores shut down. This is great news for mobile devices that don't expect full usage most of the time (assuming the scaling really does make idle power extremely low).

Remember, ARM has been slowly scaling up in speed while x86 has been scaling down in power usage. It wouldn't be surprising if this new GPU uses more power than traditionally known for ARM. That said, a lot remains to be seen. Press releases and actual performance can be worlds apart. How many times has a company promised something-like performance only for it not to deliver? Hopefully it's true, though.

        • by Guspaz (556486) on Thursday November 10, 2011 @03:19PM (#38015184)

          Consider that the PowerVR SGX543MP products support up to 16 cores, but nobody has shipped one with more than 2 (Sony's PS Vita will be the first with 4). I believe the Mali-400 in the SGS2 is a 4-core part.

PS3 -> 2013 (when the ARM GPU is supposed to come out) is seven years, so we should see ~4 doublings, or 16x the performance we saw in 2006 when the PS3 came out.

          If we make the out-of-my-ass assumption that a 4-core mali-400 uses 2W of power at full load, and a 16-core T-658 will use 8W, we get the equivalent of a 128W GPU from 2006.

          Yeah, I'm doing a lot of making up numbers, handwaving, and bullshitting, but I'm just trying to illustrate the point that a laptop-class ARM GPU from 2013 will probably be able to match the performance of the 2006 PS3's GPU. Moore's law and all that.
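For what it's worth, the hand-waving above is easy to sanity-check. Here's a minimal sketch of the same estimate; every input (the doubling period, the 8W full-load figure) is the parent's own guess, not a measured number:

```python
# Sanity check of the parent's back-of-envelope estimate.
# All inputs are the poster's guesses, not measured figures.
years = 2013 - 2006                  # PS3 launch to projected Mali-T658 ships
doublings = round(years / 1.75)      # assume ~21-month perf/W doubling: 4
perf_per_watt_gain = 2 ** doublings  # 16x

mobile_gpu_watts = 8                 # assumed 16-core T658 at full load
equivalent_2006_watts = mobile_gpu_watts * perf_per_watt_gain
print(equivalent_2006_watts)         # 128 -- roughly a 2006 high-end GPU budget
```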

          • Re:In two years (Score:5, Insightful)

            by Moryath (553296) on Thursday November 10, 2011 @04:36PM (#38016154)

            I bet a lot of this is fudging due to size-of-screen.

            Think about it. When the PSX came out, your average homeowner's TV screen was a scant 20 inches (4:3 ratio) diagonal. When the PS2 came out, that was a "whopping" 24".

When the PS3 came out? Yeah. 37" or larger 16:9 widescreens. A lot of them, given the initial price tag, were well above 40".

            Now play a PS2 or PSX game on that humongous screen. Looks like shit, doesn't it? Load that PSX game up instead in the Popstation version on your PSP, or in an emulator on a 13" or even 15" laptop playing with a USB controller from a few feet away. Suddenly it looks a whole hell of a lot better.

"PS3-level graphics" can be fudged quite a bit when you're dealing with "mobile" devices with a tiny screen, not trying to push massive amounts of AA to get rid of "jaggies" on a bigger screen with bigger pixels.

      • by ArcherB (796902) on Thursday November 10, 2011 @12:14PM (#38013240) Journal

It doesn't matter if in two years we'd have better graphics; just imagine playing PS3-like graphics on something that barely consumes 1W (or however much a mobile device should draw). I'd bet it wouldn't cost that much either.

        I still believe that PS3 graphics will be severely dated in two years and is probably dated now. However, if this chip is truly low power and cool running, why not put 10+ of them on a single card?

        • Re:In two years (Score:4, Insightful)

          by somersault (912633) on Thursday November 10, 2011 @12:17PM (#38013280) Homepage Journal

Because we're not talking about graphics cards, we're talking about single chips for use in phones, etc., where compactness and power usage are very important?

        • by TheRaven64 (641858) on Thursday November 10, 2011 @12:55PM (#38013682) Journal
I think you misunderstand ARM's market. ARM is not in the desktop market, or even in the laptop market except at the low end. They do, however, completely own the embedded market right up to the top end of the smartphone and tablet markets. This kind of core will end up in smartphones and tablets. You will be able to run PS3-era graphics on something that fits in your pocket and runs from batteries (and probably has Thunderbolt or HDMI output for connecting it up to a big screen). It isn't competing with some energy-guzzler on the desktop; it's reducing the need for such a thing to exist at all. PS3 graphics aren't as good as they can possibly be, but they're a lot better than is required for a host of applications, and now you can do all of those with a pocket-sized device.
          • Re:In two years (Score:5, Insightful)

            by Belial6 (794905) on Thursday November 10, 2011 @04:49PM (#38016338)
            You do realize that the desk isn't why PCs use so much power right? You also realize that people will still use desks whether they have an x86 PC or not, just as they did before the x86 was invented right? ARM is absolutely working towards competing with x86. In what way is trying to get people to buy an Arm computing device instead of an x86 computing device not competing?

The ARM was just as much a desktop CPU as the x86 was. The difference is that ARM got crushed in the desktop market. At the time, the desktop market demanded computing power at any energy cost, and ARM simply couldn't keep up with Intel. With Intel's focus on the desktop, ARM proceeded to pick up Intel's scraps: all of the little markets that Intel decided were too small to worry about.

Fast forward to the 2000s, and desktop speeds start outpacing most users' needs. The last 5-6 years of desktop speed improvements have basically been a CPU bubble. CPU speeds have increased faster than most people have any use for. We are currently seeing a state where people are realizing that they are vastly overpaying in energy for the CPU processing power they actually use. The bubble is bursting. ARM is way behind in ramping up the processing power of its CPUs, but also way below Intel in power usage. Like many other bubbles, suddenly people realize that what they were chasing isn't worth it, and they would rather have what was available 10 years ago.

            In today's post CPU bubble environment, we are seeing a situation where Intel's CPU's are not low power enough to cover the entire market, and ARM's CPUs are not fast enough. They are both racing to hit the sweet spot that gives them market dominance, but don't be fooled into thinking that they are not racing to the same goal. They are just at opposite ends of the field.

            The question is who will reach the goal first. On the Intel side, you have dominance in the traditional computing environment as well as market mindshare. Most people know who Intel is and that their PC uses an Intel processor, but most could not tell you what kind of processor their ARM device uses. On ARM's side, you have a new market that did not care about Intel compatibility, and settled on ARM.

            Intel will continue to push downward to smaller devices, while ARM will continue to push upward with larger ones.
            • by Nutria (679911) on Thursday November 10, 2011 @05:34PM (#38016774)

              suddenly people realize that what they were chasing isn't worth it, and they would rather have what was available 10 years ago.

              What an absolute steaming crock of shit. Who here really wants a 32 bit 1.3GHz Athlon XP back on their desktop?

    • Re:In two years (Score:5, Insightful)

      by tepples (727027) <tepples@nOSpAM.gmail.com> on Thursday November 10, 2011 @11:40AM (#38012802) Homepage Journal

      In two years, PS3-like graphics will be insufficient

      Counterexample: FarmVille.

    • by jellomizer (103300) on Thursday November 10, 2011 @11:46AM (#38012894)

Well, it is the point that the PS3 is old in computer terms. While this is an advancement for mobile computing, it still fits the pattern: in terms of performance, mobile computing is about 5-10 years behind desktop computing.

      • by bill_mcgonigle (4333) * on Thursday November 10, 2011 @12:23PM (#38013356) Homepage Journal

        mobile computing is about 5-10 years behind desktop computing

And it always will be, unless somebody devises a way to provide 15A of power to a mobile device, and a way to dissipate that sort of heat.

        Now, we may eventually reach a state where it just doesn't matter - everybody will have enough computing power on their phone to raytrace a 4K HD stream in realtime and they will reach a natural equilibrium where it just doesn't make sense to make faster chips for desktop computers. Or, we might see such great Internet pervasiveness that everybody just has thin-clients and computes on a CPU farm, but until either of those things happen, desktops will be faster than mobile devices.

        • by jellomizer (103300) on Thursday November 10, 2011 @01:55PM (#38014284)

I am thinking about advancements in PC displays. I remember 110x100 graphics, then 320x200, then 640x480, 800x600, then 1024x768 (on 14" screens). After that they just made displays larger to handle more pixels. The iPhone 4 was one of the first devices I have seen that offered a higher DPI.
The same thing with color depth: monochrome, CGA (4 colors), EGA (16 colors), VGA (256 colors), then SVGA with the now-common 16-bit/24-bit/32-bit color.

          Once we reach our perception limit we don't need to improve an area.

    • Re:In two years (Score:5, Interesting)

      by poetmatt (793785) on Thursday November 10, 2011 @11:56AM (#38013010) Journal

      umm, look at the tegra 3. ARM graphics are catching up to consoles quite easily (consoles were always behind). Remember, it's been 3 years where we went from "ARM can barely handle nintendo emulation (single core/500mhz/125mhz gpu)" to "ARM is competing with PS3(4 cores, 1.5ghz, 300+mhz multicore gpu)". In *3* years. All with devices that are more efficient with power than anything intel can offer. So what do you see for the next 12 months, let alone 3-4 years? Even if the increases slow down they're basically going to make x86 processors irrelevant.

It's for the mobile market. So, MW3 on your phone.

    • by bill_mcgonigle (4333) * on Thursday November 10, 2011 @12:18PM (#38013296) Homepage Journal

      In two years, PS3-like graphics will be insufficient for the desktop and console market, and we will be in the same situation.

      Never underestimate the low-end. Imagine a dongle with an HDMI plug on one end that just plugs into a TV set, but inside it has a chip that can do PS3-level graphics, WiFi for downloading games, Bluetooth for controllers, and enough flash to cache them.

      Most HDMI ports can provide 150mA at 5V, which is minimal for this sort of application, but within sight in the next several years.
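The budget the parent cites works out like this; note the 2W SoC figure below is a made-up example for comparison, not anything from the article:

```python
# HDMI dongle power budget, using the parent's 150 mA @ 5 V figure.
budget_w = 0.150 * 5.0         # 0.75 W available from the port
soc_draw_w = 2.0               # hypothetical full-load draw of such a SoC
print(budget_w)                # 0.75
print(soc_draw_w / budget_w)   # ~2.7x over budget today, hence "within sight"
```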

      • by kesuki (321456) on Thursday November 10, 2011 @02:21PM (#38014558) Journal

        this reminds me of dragonball where they eventually become so strong that they would blow up the earth trying to fight there.
        the numbers are being faked by someone, and nobody here seems to care that somehow we went from 3 mhz cpu with 3 mhz gpu all the way to 64bit 6 core 2.2 ghz cpus and 512 bit 800MHz 1408 Stream Processing Units gpus all in what 20-25 years?

it's beyond absurd and frankly i don't like it. this is why i have a 40 watt (60 watt under 3d gaming, 3.5 amp) computer and a 70 watt tv set. if they crash and burn i have fallback hardware that isn't so efficient, but at least i know that i don't have the absurd power requirements of some people's setups. the tv gets run very rarely though.

    • by kelemvor4 (1980226) on Thursday November 10, 2011 @12:49PM (#38013618)

      In two years, PS3-like graphics will be insufficient for the desktop and console market, and we will be in the same situation.

      PS3 graphics are a bit dated already. Consoles (and console ports) are seriously limiting the graphics in current run games. It's a pity, really. Good that cell phones will have circa 2006 GPU capabilities soon, though.

  • Resolution! (Score:5, Insightful)

    by jonnythan (79727) on Thursday November 10, 2011 @11:33AM (#38012698) Homepage

Sure, PS3-like graphics... except the PS3 is doing it at 1280x720 or 1920x1080. This will be pushing probably 20-40% of the pixels... and doing so in 2 years, while the PS3 hardware is 5 years old (to the day).

    So, no, I don't think that a chipset that will, in 2013, do 20% of the job that 2006 hardware does will be making its way into high-end PCs and consoles soon.
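The 20-40% figure roughly checks out against 2011 handset resolutions; the 800x480 panel below is an illustrative pick of a common screen of the day, not anything from the article:

```python
# Pixel-count comparison behind the parent's 20-40% estimate.
ps3_720p = 1280 * 720      # 921,600 px
ps3_1080p = 1920 * 1080    # 2,073,600 px
phone = 800 * 480          # 384,000 px, a typical 2011 handset panel

print(phone / ps3_720p)    # ~0.42 -> 42% of the PS3's 720p workload
print(phone / ps3_1080p)   # ~0.19 -> 19% of its 1080p workload
```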

    • by GodfatherofSoul (174979) on Thursday November 10, 2011 @11:38AM (#38012780)

      I think you're looking at the wrong side of the street. This isn't about the top-end computing power; it's about the efficiencies on the bottom end. So, now you can start churning out laptops and cheap PCs with pedestrian graphics cards that use low power and provide significant performance. No need to take the truck nuts off your Dell, sir.

      • by jonnythan (79727) on Thursday November 10, 2011 @11:45AM (#38012882) Homepage

        I was addressing the question at the end:

        "could we be seeing ultra-low power hardware in high-end PCs and consoles soon?"

        • by tepples (727027) <tepples@nOSpAM.gmail.com> on Thursday November 10, 2011 @12:07PM (#38013166) Homepage Journal

          could we be seeing ultra-low power hardware in high-end PCs and consoles soon?

          I thought that was the entire point of the Wii. Because the "sensor bar" (IR position reference emitter banks) needed to sit by the TV, the console's case needed to be small. This meant Nintendo couldn't use a CPU and GPU with a high TDP, so it stuck with what is essentially a 50% overclocked GameCube. I guess Nintendo is trying the same tactic with the Wii U: take a roughly Xbox 360-class CPU and GPU and take advantage of six years of process shrinks to get the TDP down so it'll fit in the same size case.

        • by hairyfeet (841228) <bassbeast1968@@@gmail...com> on Thursday November 10, 2011 @12:45PM (#38013566) Journal

Sure we will, at least on PCs, though it won't cost a lot, and it'll be from AMD and Intel, like with Brazos and Ivy Bridge CULV. I mean, when I can pick up a netbook for $350 with 8GB of RAM that will play L4D and TF2, gets 6 hours on a battery while watching HD movies, outputs 1080p over HDMI, and all in a machine that only weighs 3 pounds and costs less than my craptastic Celeron laptop did 5 years ago? Now THAT is nice!

I think the next advance will be just how far what was once considered "gamer only" levels of graphics performance will spread. I mean, it wasn't that long ago that if you wanted to play anything more than DVDs, or better than SD, you needed a discrete card in your laptop that jumped the price like crazy and made the battery life shit; in desktops IGP was a dirty word, and cell phones had to be dropped to almost comically bad resolutions just to get more than a slideshow. Now you have all these machines, either on the market or coming down the pipe, that get frankly insane levels of graphics for prices so cheap anybody can have one.

Personally I'm ALL for it. I don't know about the ARM side, but on x86 it looks like OpenCL (which is now supported by Nvidia as well as AMD) is gonna make more and more programs GPU-accelerated, power seems to be dropping on the desktop side, and on the mobile side it's getting crazy how much performance per watt some of these things are getting. It's all gonna be smooth video and nice, if not truly Crysis-insane, levels on the gaming front. If ARM can keep current power usage levels and get PS3 graphics? I say great, more avenues to sell them means more games!

        • by Daniel Phillips (238627) on Thursday November 10, 2011 @07:33PM (#38017692)

          I was addressing the question at the end:

          "could we be seeing ultra-low power hardware in high-end PCs and consoles soon?"

          Not soon, but inevitably. The equation is: better power efficiency equates to more stream cores. The number of stream cores tends to increase to compensate, so discrete graphics card power consumption stays about the same, near the maximum of what typical cooling systems can accommodate. This somewhat obscures the ongoing trend to lower power designs. However, power consumption per stream unit governs the maximum practical throughput (aka heat dissipation) of high end discrete cards. Therefore it is only a matter of time before ultra-low power design becomes dominant at the high end as well as low.

    • Re:Resolution! (Score:3, Informative)

      by Anonymous Coward on Thursday November 10, 2011 @11:44AM (#38012868)

Sure, PS3-like graphics... except the PS3 is doing it at 1280x720 or 1920x1080. This will be pushing probably 20-40% of the pixels... and doing so in 2 years, while the PS3 hardware is 5 years old (to the day).

      So, no, I don't think that a chipset that will, in 2013, do 20% of the job that 2006 hardware does will be making its way into high-end PCs and consoles soon.

      Except most phones released today have 1080p output via hdmi. So now what?

    • by Oswald McWeany (2428506) on Thursday November 10, 2011 @11:54AM (#38012984)

      There will still be some hard-core graphics intensive games that will require whatever the cutting edge in graphics is at that point.

However, as old as the PS3 may be, the fact is that, for most of us non-hard-core gamers, PS3-quality graphics are more than enough (and will still be in another 5 years' time) for the vast majority of games we'd want to play.

We're beginning to hit a point of diminishing returns on graphics anyway; you're always going to be limited by what the eye can process, and by the ability of the artists... sure, when 3D goes mainstream and is built into our monitors, all of a sudden graphics cards will need to be more powerful.

      I personally can't think of one game I have played in the last 5 years where going beyond PS3 quality graphics would have improved the game for me.

      Now, I'm not hard-core, and don't play the shoot-em up first person genre which tend to be the most gpu intensive- but people like me make up an increasingly significant portion of the game-market.

Look at how successful simple things like Angry Birds can be.

      • by CubicleView (910143) on Thursday November 10, 2011 @01:59PM (#38014332) Journal
Graphics card companies always try to outperform their competitors. They could compete on price alone, but no one is going to buy a new card that's exactly as powerful as the one they already have, only cheaper. For that reason I suspect the current trend will continue.
        • by Oswald McWeany (2428506) on Thursday November 10, 2011 @02:19PM (#38014538)

          Yes, they'll continue to get better and better- and the hardcore gamers will continue to pay over $200 for a graphics card to get the best of the best.

          However, I think for many of us- we're going to see continually less and less gain for each extra $1 spent.

          You can get a 1gig DDR3 video card from Newegg or TigerDirect for $15 after rebate these days.

          Sure, that's not high-tech anymore- but it will run most games fairly well that non-hardcore gamers are likely to encounter.

          If I were into first person shooters maybe I would need a $100 card.

If I'm playing the newest release of Worms, Civilization, Football Manager or FIFA, the $15 card will work for me.

    • by jeffmeden (135043) on Thursday November 10, 2011 @12:15PM (#38013264) Homepage Journal

In 2 years a phone with a 1080p display is a likely reality. We already have phones/tablets running at/near 1280x720, which is about 44% of the 1080p pixel count. But to say that it would be acceptable on the high-end PC side is a stretch; in 2 years we will probably have desktop expectations beyond 1080p. Entry-level to mid-market could see a benefit though; that market has been underserved by horrible attempts at "integrated" graphics for years. It will be interesting to see how this GPU compares to the beefed-up CPU/GPU integrations coming from AMD. The new frontier (aside from serious game enthusiasts) is smaller and greener.

    • by phoenix_rizzen (256998) on Thursday November 10, 2011 @12:57PM (#38013708)

      Samsung Galaxy Nexus has a 1280x720 screen. And most Android 4.0 devices coming out in the next 12 months will include 1280x720 screens.

      And pretty much every Android device released this year includes a mini-HDMI port for connecting to 720p and 1080p screens.

      IOW, current and future Android phones can already do what you think they can't.

    • by bWareiWare.co.uk (660144) on Thursday November 10, 2011 @02:47PM (#38014842) Homepage

      The high-end smartphones (Galaxy Nexus) are already at 1280x720. Even if they don't improve at all, in the next two years ARM based netbooks are sure to offer full 1080p.

  • No (Score:4, Informative)

    by Baloroth (2370816) on Thursday November 10, 2011 @11:37AM (#38012758)
The PS3 is 5 years old and based on even older graphics tech. Beating that on mobile is cool, but not surprising. The PS3 never was impressive, graphically, to PC users, who had better-than-HD resolutions for years. Some console games are still limited to 720p. Oh, and people had 3D on PC like 8 years ago (or more). Sucked then, sucks now.
  • Tegra 5 (Score:2, Troll)

    by backslashdot (95548) on Thursday November 10, 2011 @11:42AM (#38012830)

nVidia is committed to releasing a new Tegra chip every year. The Tegra 3, which is already out, is 5x faster than Tegra 2 (which beats the Mali-400, which is at 1/10th the speed of the GPU ARM announced). So basically, by the time this ARM GPU is released, Tegra 5 will be out, and going by the roadmap, Tegra 5 will run at least 5x faster than ARM's chip.

I hope ARM prices this dirt cheap, so that sub-$200 (off contract) phones can have it.

  • Yea right (Score:5, Insightful)

    by nedlohs (1335013) on Thursday November 10, 2011 @11:42AM (#38012838)

    In 2 years time the PS3 will be 7 years old.

    The PS2 was 7 years old in 2007. Were PS2 level graphics acceptable for "high end PCs and consoles" in 2007?

    No? Then why would PS3 level be acceptable in 2013?

  • by Anonymous Coward on Thursday November 10, 2011 @11:43AM (#38012842)

There was a story on CNN a few weeks ago that said that while PC sales are slowly increasing worldwide, it's very tilted: they are falling dramatically in the US, Canada, and Europe. The increase is coming from the developing world being able to afford computers as they fall in price.

    The culture shift from desktop computing to mobile is happening in part because mobiles are becoming powerful enough to do most of the tasks that desktops used to do. OK, you'll always get a few neckbeards to say "But the cell phone can't run AutoSuperCadShop 390322.55!" But that misses the point. That's not what 99.9% of consumers DO with their computers. They play some games, including 3D games, they check their facebooks, they look at some news headlines, and so on. All that works fine on a device that they can fit in their pocket. For those times a keyboard is needed, a bluetooth keyboard will do just fine. And for those times a larger screen is needed, a wireless connection to the screen will work fine.

I don't know why people can't see this shift happening right in front of their eyes. Even the sales data bears it out now: mobile computing is on the upswing, and in the western world, PC sales are falling. It's a nice world: instead of needing to lug around a desktop or even a netbook, you'll have the capability of a (say) 2009-vintage desktop in your shirt pocket in 2014. A 2009 desktop is nothing to sneeze at, and meets the needs of 99% of the population just fine. The rest will become a more expensive niche, but will still exist in some way.

    It's a Star Trek Future.

  • by Sockatume (732728) on Thursday November 10, 2011 @11:46AM (#38012890)

    When we have handhelds as powerful as the PS3 (the Vita is getting there), we'll have much more powerful PCs and a new generation of consoles.

    • by Locutus (9039) on Thursday November 10, 2011 @12:04PM (#38013112)
      the story isn't about the handhelds matching desktops, it's about the handhelds getting some very powerful graphics. besides, the reference was with consoles, not desktops. Just because consoles in a few years might be doing holographic displays it doesn't mean handhelds doing pretty nice 3D graphics on battery power isn't nice too.

      LoB
      • by kesuki (321456) on Thursday November 10, 2011 @03:01PM (#38014986) Journal

        holographic displays tanked when sega tried them in arcades. they did worse than laserdisc games. it was a load of special glass and small playing field. i only ever saw one once, but i know they were invented... oh wait it was a parabolic screen, http://www.definiteanswers.com/q/What-is-SEGA-s-HOLOGRAPHIC-game-4c120518a7a5b [definiteanswers.com] but as a kid it seemed like a holographic game!

        the thing is, i don't like games like i used to, especially since some hardware is really flaky like cheap Chinese knockoffs. that and i hate all the remote wand/camera stuff, and i like movies better than games, but i refuse to have multiple media of the same movies.

  • by jeffmeden (135043) on Thursday November 10, 2011 @11:54AM (#38012994) Homepage Journal

Five years ago tomorrow the PS3 made its debut; did you think that in the meantime everyone just sat back and basked in the glory of its infinite capabilities? Two years from now (if that pans out) will be 7 years since the commercialization of the Cell chip, so seeing a miniature version that uses dramatically less power is pretty much par for the course. Desktop chips that have similar (or more specific) capabilities are already available in many products. Remember, the first PS3 drew an amazing 200 watts at full load, and within 2 years that was more than cut in half. This is just more progress, and *promised* progress at that. Hey ARM, why not just say you will have a flying car in 2 years?

  • by inhuman_4 (1294516) on Thursday November 10, 2011 @12:59PM (#38013736)

I think this will make a huge difference in mobile gaming because of screen size. Assuming that this thing outputs 720p like the Galaxy Nexus, I think this will be a big thing.

While PS3 graphics are old and crappy compared to what a modern PC can do, don't forget about screen size. Seeing 720p on a 40-inch screen is a lot different than seeing 720p on a 5-inch screen. The best example of this is fonts: ones that look fine at 5 inches will look like crap expanded to 40 inches. Artifacts and jaggedness on a 40-inch screen are going to be pretty minimal on a 5-inch one. We are talking about shrinking by almost a factor of 10. At some point the quality of the output will exceed our eyes' ability to notice the difference.

    Of course this will do nothing to improve what the chip can render in terms of complex environments, smoke etc. But at 5 inches it is not hard to have too much on the screen.
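The shrink factor the parent describes can be put in numbers. A minimal sketch, assuming both panels show the same 1280x720 image (the 40" and 5" diagonals are the poster's examples):

```python
import math

# Pixel density of the same 1280x720 image on a TV vs. a phone.
def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a panel of the given resolution and diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

tv = ppi(1280, 720, 40)     # ~37 PPI on a 40" set
phone = ppi(1280, 720, 5)   # ~294 PPI on a 5" handset
print(phone / tv)           # 8.0 -- each artifact is 8x smaller, linearly
```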
