
Ion Platform For Atom Tested With Games, HD Video

J. Dzhugashvili writes "Nvidia has already pulled the curtain off its Ion platform, which couples GeForce 9400 integrated graphics with Intel's Atom processor. But how does it perform? The Tech Report has taken the tiny Ion reference system for a spin in games and video decoding to see if the GeForce GPU really helps. The verdict? 1080p playback is actually smooth, and the whole system only draws 25W during playback. Fast-paced action games are another story—Half-Life 2, Quake Wars, and Call of Duty 4 are all choppy with a single Atom core and single-channel RAM, although they do run. TR concludes that Ion is nevertheless a clear improvement over Intel's 945G chipset, especially since Nvidia doesn't expect Ion-based Atom systems to cost significantly more than all-Intel ones." Update: 02/04 09:14 GMT by T: HotHardware is one of several other sites offering performance benchmark numbers on the new chipset.
  • Does the Atom processor make the Internet faster? Because if not, I'm going back to a P4!

  • by Jaysyn ( 203771 )

    Looks like I didn't wait long enough to get the netbook.

    • by Nursie ( 632944 )

      I'm very happy with my eee901. I've debianised it and it's replaced my "big" 13-inch Vaio for casual use. I still use the Vaio for anything CPU intensive, or if I want a bigger screen. Or for typing anything other than the odd email/Slashdot post.

      What happened to the dual core Atom chips?

      • by vux984 ( 928602 )

        I'm very happy with my eee901. I've debianised it and it's replaced my "big" 13-inch Vaio for casual use. I still use the Vaio for anything CPU intensive, or if I want a bigger screen. Or for typing anything other than the odd email/Slashdot post.

        So it's 'replaced' your Vaio for casual use... yet you use the Vaio for anything CPU intensive, when you want a bigger screen, or if you're typing something longer than a /. post. Serious question: what does that leave?

        I've been eyeing netbooks, myself, but am having

        • Re: (Score:3, Interesting)

          by Nursie ( 632944 )

          "Serious question: what does that leave?"

          Email.
          Slashdot.
          Flashing Neo Freerunner with stuff.
          Using it as a terminal into my servers for maintenance tasks.
          Music & Movies (on the plane or sometimes hooked up to a big LCD)

          Err...

          Taking it places I wouldn't take a decent laptop. Or places I wouldn't think to take a normal one, but it's small enough to throw in the bag.
          Seeing something on tv and wanting to look it up on wikipedia NOW and every other computer is out of reach and takes ages to boot...

          I don't know if

        • by Nursie ( 632944 )

          To answer another bit of your post - the niche, for me, is fast boot (and wakeup from suspend is *damn* quick), very portable but also fully featured. It's not a mobile phone screen; it can do most of what a larger and more powerful machine can do. That's it, really.

          It's just another debian device...

          You may wish to take into account that I very nearly bought a Sony Vaio TZ a couple of years back, so *really* small is something that appeals to me.

    • Re:Damn (Score:5, Insightful)

      by Midnight Thunder ( 17205 ) on Tuesday February 03, 2009 @02:14PM (#26713743) Homepage Journal

      Looks like I didn't wait long enough to get the netbook.

      You can always wait longer for the ultimate configuration, but there is only so long you can wait when you need something to use.

      • Re: (Score:3, Informative)

        by Chabo ( 880571 )

        Right. The only way you can really be screwed by new hardware coming out is if you buy right before a price reduction. If you pay attention to the market, you pretty much know when those are going to happen. Athlon X2 price drop when Conroe was released, Penryn price drop when Phenom II was released, etc.

      • That's the fun of netbooks - they're ridiculously cheap for such cool machines. You can buy another one a year or so after buying the first, and you've still spent less than the cost of a low end laptop.
    • Yeah, I have an Atom netbook with the Intel GPU, and it is a little slow for full-screen HD video (hulu). I would love it if an NVIDIA GPU were an option, even for significantly more money.

      On the other hand, I got a netbook so I could escape the distractions of TV, video games, and home; and instead escape to a café where I can actually get work done.

  • nVidia is doomed. (Score:5, Interesting)

    by tjstork ( 137384 ) <todd.bandrowsky@ ... UGARom minus cat> on Tuesday February 03, 2009 @01:50PM (#26713253) Homepage Journal

    I hate to say it because they do good work, but I think nVidia is ultimately doomed as it is today. Everyone rips Intel's integrated 3d graphics but they just keep getting better every year. Although AMD should have bought nVidia instead of ATI, they do own ATI, and so have a pretty good graphics system on their own. Eventually, both AMD and Intel are going to wind up with 3d calculations on the die in some fashion, and that's going to leave nVidia for what?

    • Re:nVidia is doomed. (Score:4, Informative)

      by Bearhouse ( 1034238 ) on Tuesday February 03, 2009 @01:56PM (#26713381)

      Insightful. If one looks at the post here today:

      http://hardware.slashdot.org/article.pl?sid=09/02/02/2344208 [slashdot.org]

      about the new Acer with Intel's highly integrated N280/GN40 chipset, you've got to wonder about the long-term viability of nVidia.

      • by jebrew ( 1101907 )
        I've got an e-machines laptop with that duo (see Woot sometime last week).

        I've got to say, I didn't have terribly high hopes going in (~$300... and e-machines), but it plays 720p content just fine, edits my source code with very little lag, and can actually play back 1080p content well enough (when plugged into my TV... so the 1080 is actually useful).

        Still sucks for games...but I don't play much.

      • by LoRdTAW ( 99712 )

        x86-64 is an open standard. Who is to say they could not jump in with an x86 CPU? Yes, it does sound far-fetched, but VIA does have CPU tech and very poor video. Maybe a future Nvidia/VIA mash-up is what they need. It would be nice to have some decent embedded competition.

        An all-in-one CPU/GPU chip could house a dual-core N-VIA processor, a memory controller, and a GPU core, with a standard I/O interconnect and a compact south bridge for SATA/Ethernet/sound/USB etc. Not very memory-bandwidth oriented, but nonetheless

    • Re: (Score:1, Insightful)

      by Anonymous Coward

      nVidia is in no way hurting, nor will it be in the foreseeable future. Consumer entertainment graphics cards are a big slice of their pie, no doubt, but even if that slice were to go away, nVidia is where professionals turn for high-end data modeling etc.

      http://www.nvidia.com/object/tesla_computing_solutions.html

      We're talking TFLOPS of GPU power in a 1U rackmount.

      • Re: (Score:3, Insightful)

        I agree that Nvidia has a good slice of life left; but do remember: Nvidia (as well as ATI) got into high-performance workstation stuff, and undercut the super-esoteric stuff, in large part because they could amortize much of the R&D cost over huge numbers of consumer parts. There are niches where totally custom esoteric high-end stuff can survive, even prosper; but trying to survive exclusively on high-end stuff is an ugly business.

        The history of computing is littered with the corpses of high end outf
    • Bring Intel's Larrabee into the equation and things get a whole lot more uncomfortable for the Green Team... Maybe the days of discrete graphics solutions are over, but expect Larrabee to give nVidia a jolly good thrashing in the market before on-die becomes the status quo. Ray-tracing, anyone?
      • by Tycho ( 11893 )

        Ray-tracing at 640x480@10fps, anyone?

        Fixed that for you. Those figures would probably not be too far off, and raster graphics can achieve about 90% of the visual quality of ray-tracing while doing only 10% of the computation.

        Also, the still renders using ray-tracing that Intel has put out are often done with global illumination. The scenes rendered are also intentionally assembled in a way to make ray-tracing look dramatically better than the same scene rendered with extremely basic raster ren

        • Re: (Score:3, Informative)

          by ogdenk ( 712300 )

          Raster rendering used to be extremely slow as well, before much R&D money was pumped into making cheap hardware with enough oomph to do it well on affordable computers.

          I had a PowerVR2-based card back in 1996 and it struggled with Quake 2 at 512x384 on a K6 233. Software rendering was almost as fast. The 3DFX Voodoo1 was less than impressive as well.

          Give it time, raytracing hardware will become viable eventually.

        • Thank you for clearing that up for me! Nice to have some technical info on the matter. =) Like ogdenk said, we'll have to wait to see what the future holds for ray-tracing.
    • by Kjella ( 173770 )

      Eventually could take a very long time... Then there are consoles, which are also a big market to fight for. Putting the CPU and GPU, the two biggest power draws in a modern computer, on the same die isn't exactly without drawbacks. Sure, latency would improve, but that means more memory lines, more power lines, and more heat to dissipate from the same area. Given AMD's breathing problems on the CPU side, I'd say the GPU wars are in much better shape than the CPU wars.

    • by chipace ( 671930 )

      ION is about a very targeted market (mobile gamers and HD enthusiasts), who are very willing to pay 2x-3x the profit (not the cost) for a mobile gaming/HD product.

      I see Nvidia as having a good future, as they are listening to their customers, and not trying to predict the market.

      I am not interested in ION at all, but it delivers the goods to those that want it.

      • I'm also not exactly interested in their reference design, although... almost, maybe. What they should add: four more Ethernet ports, plus a WiFi chip and external antenna. Then the thing could be an always-on router and file server that lives in the living room and can play back any media files on the network (gigabit Ethernet is good enough for HD, right?).

        So while you are right that they're aiming at a very targeted market, they don't have to be. They could make of this platform what iTV should have been. And
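
        On the "gigabit Ethernet is good enough for HD" question above, here is a quick back-of-the-envelope check, sketched in C. Both figures are assumed typical values, not measurements: ~940 Mbps of usable GbE throughput, and Blu-ray's 48 Mbps maximum total bitrate as roughly the worst case an HD file will demand.

            /* Rough headroom check: how many worst-case HD streams fit on GbE?
             * Both constants are assumed typical values, not measured ones. */
            #include <stdio.h>

            int main(void)
            {
                const double gbe_usable_mbps = 940.0; /* realistic GbE payload rate */
                const double bluray_max_mbps = 48.0;  /* Blu-ray max total bitrate  */
                printf("Worst-case HD streams per GbE link: %.1f\n",
                       gbe_usable_mbps / bluray_max_mbps); /* about 19.6 */
                return 0;
            }

        So even a worst-case HD stream uses only about 5% of the link; gigabit is comfortably good enough.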

    • by Hogwash McFly ( 678207 ) on Tuesday February 03, 2009 @03:12PM (#26714873)

      and that's going to leave nVidia for what?

      Asking for a bailout?

    • nVidia doesn't just make GPUs for x86. They also have a very advanced SoC incorporating a multi-core ARM CPU and an nVidia GPU. The version aimed at netbook-type systems draws under 4W for the entire SoC.
    • It wouldn't surprise me one bit if you were correct, and nVidia ended up going the same way as 3Com and other vendors whose products ended up being integrated into the MB. But I think it's also a pretty valid argument to say that there will continue to be space at the top of the food chain, selling cards that appeal only to the more hardcore gaming enthusiast. There are just too many gamers out there who have plenty of money and a love of bleeding-edge hardware.
    • by p0tat03 ( 985078 )

      Except that the performance for Intel's integrated graphics is still junk. NVidia's 9400M chipset at least offers decent performance, and IMHO is poised to take a lot of market share from products that are currently Intel's.

      AMD? AMD is a has-been. They bought ATI how long ago? They've been promising a CPU-GPU hybrid for years now, and it's always "just around the corner". As far as I'm concerned I'll believe it when I see it, because AMD doesn't look like they're capable of delivering on that promise.

      The da

      • X3100 is old technology; I have a 4500MHD [intel.com] on my 12" notebook. I haven't tested out the performance much but it's supposed to support Blu-ray and 1080P. I haven't seen it in a netbook yet though.
      • by jedidiah ( 1196 )

        Playing a movie is not a terribly interesting problem. Neither is Photoshop anymore, for that matter.

        Playing a movie only becomes interesting because the number of pixels being pushed around has increased dramatically at the same time as encoding methods have gotten more sophisticated.

        All of the effort spent on speeding up games can probably be easily re-purposed for things like "simple movies" if you're just a little clever about it.

        That's all this is, really.

        Nvidia as usual is just being smarter than th

        • by p0tat03 ( 985078 )

          and it would be nice if a $300 PC could keep up with a $300 (HD/BD)DVD player.

          Your HD/BD player doesn't come with a screen, keyboard, battery, and all the trimmings... this isn't even a fair comparison. At this stage expecting a netbook-level device to handle HD movies is simply ridiculous.

          And that's precisely my point: all the effort speeding up games is easily re-purposed for playing HD movies, and this is in fact *why* NVidia will succeed where Intel fails. Intel has always made poor-performing integrated solutions on the expectation that only gamers need performance. The reality

    • Intel's 3D graphics have been a joke for the longest time. While they might be getting better every year, they still have a lot of catching up to do.
    • I hate to say it because they do good work, but I think nVidia is ultimately doomed as it is today. Everyone rips Intel's integrated 3d graphics but they just keep getting better every year.

      And nVidia's graphics aren't getting any better? A GPU and a CPU stuffed together into the same chip will always be a low-cost/low-power/low-end solution; it can never come close to the capability of a GPU that has the whole die to itself. If Intel/AMD has ~2B transistors in a chip that are divided between a general-purpose CPU and a GPU, can that ever match a 2B-transistor discrete GPU plus a discrete CPU? Unlikely. Plus, CUDA has put forth interesting possibilities for putting the GPU to other uses.

      Although AMD should have bought nVidia instead of ATI, they do own ATI, and so have a pretty good graphics system on their own.

      And they sho

  • by hattig ( 47930 ) on Tuesday February 03, 2009 @01:50PM (#26713259) Journal

    How does the ION chipset compare in power consumption with the mobile 945 used in netbooks (the 6W TDP one, not the 20W+ TDP desktop variant that's a total joke)?

    25W for CPU, Chipset, HD, Memory, motherboard doesn't seem as low as it could be.

    Still, if they can get 8 hours out of a 6 cell battery in a netbook with it, great. It's a far far far more advanced chipset than the Intel crud.
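
    As a rough sanity check on that 8-hour hope, here is a sketch in C. The ~48 Wh capacity is a typical 6-cell pack, assumed rather than taken from the article; 25 W is the review's playback draw, and the lower figures are illustrative netbook draws.

        /* Battery life estimate: hours = pack watt-hours / average draw.
         * 48 Wh is an assumed typical 6-cell pack; draws are illustrative. */
        #include <stdio.h>

        int main(void)
        {
            const double pack_wh = 48.0;
            const double draws_w[] = { 25.0, 11.0, 6.0 };
            for (int i = 0; i < 3; i++)
                printf("%4.1f W -> %3.1f h\n", draws_w[i], pack_wh / draws_w[i]);
            return 0;
        }

    By that arithmetic, 25 W during playback means well under 2 hours on a 6-cell pack; hitting 8 hours requires the whole system to average about 6 W.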

    • 25W for CPU, Chipset, HD, Memory, motherboard doesn't seem as low as it could be.

      That's the power consumption while playing HD video. Even the little Roku box is supposed to peak over 5W, and ALL it does is play video; it has no storage to speak of and no GPU to speak of (though it does have a dedicated video decoder). Running the LCD backlight is probably one of the big loads, but using a general-purpose GPU (it's not just for pushing pixels any more, after all) in this application is necessarily going to hurt power consumption.

      I think it's pretty fantastic for what it is. I think the best

      • Maybe 1080p is really processor intensive, but the OMAP 3530 can decode 720p H.264 with a power draw of under 1.8W for the CPU/GPU/DSP, flash, and RAM. If the WiFi and TFT are drawing less than 20W then this number is not very impressive.
      • by slapys ( 993739 )

        That's the power consumption while playing HD video. Even the little Roku box is supposed to peak over 5W, and ALL it does is play video; it has no storage to speak of and no GPU to speak of (though it does have a dedicated video decoder). Running the LCD backlight is probably one of the big loads, but using a general-purpose GPU (it's not just for pushing pixels any more, after all) in this application is necessarily going to hurt power consumption.

        Can't we get a dedicated chip for the pipeline of network pack

  • Valve games (Score:3, Interesting)

    by Chabo ( 880571 ) on Tuesday February 03, 2009 @01:50PM (#26713265) Homepage Journal

    Well of course Half Life 2 is choppy on the platform -- the Source engine is very CPU-intensive. Almost every system is going to be CPU-bound with Valve games, unless you happen to be running a Core i7 with an entry-level video card, at 1920x1200. As for the other games, you're still running on integrated graphics, and there's only so much you can do before you need a separate card.

    Disclaimer: I work for Intel, but I've been a fan of Valve's games for much longer than that.

    • I've been running HL2 on an Atom 330, with a PCI ATI 2400 (note, old-school PCI, not PCIe), and except for a little stutter when levels start, it's been great.
      Then again, I'm running 1024x768 (monitor max res).

      I do wonder why everyone is so gung-ho about the single-core Atom... isn't the dual-core only about $5 more?
      That's worth it for me... add on an Nvidia chipset (please do CUDA), and it'd be beautiful.

      • by Chabo ( 880571 )

        Plus most hardware review sites tend to say "If you can't play it at full settings, it's not worth it." ;)

        I first played HL2 on a GeForce 4 MX 440, at 640x480, on minimal settings. That was alright, but the real problem was that if you set the game to DX7 (which is all the GF4 was capable of), then the draw distance for models was greatly reduced. That meant the chapter where you had to use your grav gun to get objects to stand on top of the sand was a real pain, cause you couldn't see any boxes that were m

        • That meant the chapter where you had to use your grav gun to get objects to stand on top of the sand was a real pain, cause you couldn't see any boxes that were more than 15 feet away unless you zoomed in,

          BAM! Antlioned.

          and you couldn't use the grav gun while zoomed. Zoom in, find an object, zoom out, pull it, zoom back in to see if you lost it, repeat.

          Sorry, did I interrupt your train of thought?

      • by kenh ( 9056 )

        I agree - the dual-core (four pseudo-cores if you include hyper-threading) Atom CPU is very capable. I'm not a gamer, but I did get the Intel D945GCLF2 MB with the dual-core CPU, Gigabit LAN, and dual SATA ports for a test/trainer 64-bit Windows Server 2008 machine, and it works great. Is it the fastest machine I own? No, but as a build-up/test/tear-down box it works very well, and if I leave it on, the power drain is minimal. I do wish it had something other than a PCI slot (PCIe or even PCI-X wou

      • I do wonder why everyone is so gung-ho about the single-core Atom... isn't the dual-core only about $5 more? That's worth it for me... add on an Nvidia chipset (please do CUDA), and it'd be beautiful.

        Intel is restricting the dual-core Atom to desktops AFAIK. Probably something like 80-90% of Atoms are in netbooks, so the dual-core Atom is not an option for most people, for now at least.

  • This is a shift away from the CPU to the GPU, and Intel will hate it.
    This, or even the plain Atom, is good enough for a very large percentage of users.
    This would work for just about every office PC, average home user, and media center.
    About the only tasks this won't work for are media editing, gaming, or heavy technical use.

    The one problem I see with it is the cost. That extra money is a big percentage of the cost of one of these mini systems.
    I so want one.

    • This is a shift away from the CPU to the GPU, and Intel will hate it.

      Well, they should have seen the writing on the wall. It is much easier to make an efficient specialised processor than an efficient generic one. With OpenCL on the horizon, don't be surprised to see computers with dual graphics chips (one used for OpenGL/DirectX and the other for OpenCL). Note I see OpenCL being beneficial to the physics aspects of games. If Intel has any sense, they will either improve their graphics chips or invest in Nvidia.

      • With OpenCL on the horizon, don't be surprised to see computers with dual graphics chips (one used for OpenGL/DirectX and the other for OpenCL).

        I believe that you are incredibly wrong - thank goodness. All you have to do is look at history to see that computers have been steadily moving towards commoditization. Even the last Cray machines were just bundles of commodity processors with some (very important and significant) glue.

        Intel is simply going to put more of the functionality of GPUs into its CPUs. Meanwhile they are talking about a future with "thousands of cores" in the computers of mainstream users. While that is clearly a long way off if i

        • Re: (Score:3, Informative)

          by forkazoo ( 138186 )

          Intel is simply going to put more of the functionality of GPUs into its CPUs. Meanwhile they are talking about a future with "thousands of cores" in the computers of mainstream users.

          Actually, AMD is out front on putting GPU functionality into CPUs with the "Fusion" platform. Intel is taking the long way around with Larrabee and putting x86 into the GPU. Go figure. Anyhow, the end result will be to reduce chip counts and take advantage of the high number of transistors that can cheaply be put on a singl

          • Actually, AMD is out front on putting GPU functionality into CPUs with the "Fusion" platform.

            Well, I'm not saying who will get there first or do it better, I don't know yet. And haven't done much research either, but the point is that's the way that computing has always gone and there is no reason to believe it will be different. The Amiga was a landmark computer because with all its cheap little custom chips it could kick the shit out of a 486 for multimedia tasks, but then the PCs became completely commoditized and you could do with brute force and ignorance and cheap parts what the Amiga could d

        • by LWATCDR ( 28044 )

          Actually, I am not so sure.
          For one thing, supercomputers often do include co-processors for things like vector ops and such.
          GPUs are becoming more and more important, not less and less. Intel has failed to make a good GPU; why is up for debate. Frankly, I think CPUs have for now reached a good-enough level for most people. Now they want lower power, heat, and size, and at best HD playback.
          As to gaming, take a look at the Xbox 360 and PS3. They are very GPU-heavy and pretty CPU-light. Even the Cell's SPEs are really mo

    • Intel has claimed that they will get rid of the need for GPUs by adding GPU cores to their CPUs. NVidia has claimed that CPUs don't matter much anymore, and that their GPUs are what consumers really need to go forward.

      Time will tell who is right, so I own both INTC and NVDA ;-)

  • A reasonable start (Score:5, Insightful)

    by abigsmurf ( 919188 ) on Tuesday February 03, 2009 @01:59PM (#26713427)

    Games performance isn't really the issue for these. These things aren't designed for games.

    What these are best used for are media centre setups. However, it doesn't play all 1080p content smoothly, which is a major issue. There are plenty of options for this kind of thing: the Popcorn Hour, the WD TV HD box. Those are good to a point but fall down on format support, especially MKV, which doesn't have full subtitle and codec support on either.

    The current best option is an energy-efficient Athlon-based setup. These cost about $75-$100 more than an Atom system and use a bit more power, but they'll play back any video you throw at them without dropping frames.

    Maybe with a dual-core Atom and dual-core-optimised codecs this will reach the goal of never noticing a dropped frame, regardless of format and bitrate, but this Atom solution still isn't the media-center beast it could be.

    • Games performance isn't really the issue for these. These things aren't designed for games.

      Totally agree. Who is going to be playing Half-Life 2 on an 8" or 10" screen? =P

      Maybe some other games, but an FPS? With a trackpad? (Who wants to lug around a mouse with their ultra-portable netbook?)

      • by Midnight Thunder ( 17205 ) on Tuesday February 03, 2009 @02:24PM (#26713969) Homepage Journal

        Totally agree. Who is going to be playing Half-Life 2 on an 8" or 10" screen? =P

        Certainly, but looking towards platforms such as the PSP, Nintendo DS, and the iPhone, we can see that there is a market for games taking advantage of small-format screens. While Half-Life 2 won't be targeted at these platforms, there are already FPS games for some of them, though then again we are more likely to see an Nvidia + ARM combination than an Nvidia + x86 combination, simply because of battery limitations.

        • I might be wrong, but I think any given netbook should run PSP/DS/iPhone-class games fairly well. The computing power, for lack of a better term, of an iPhone vs. an MSI Wind has to be tilted towards the Wind. I don't think Half-Life 2 is going to be running on an iPhone anytime soon :)

      • You've clearly never played an FPS on a tablet. Literally point and click ;)
    • So, what's a good energy-efficient Athlon setup? With dual core? And 64-bit?

      I used to know both Intel and AMD CPUs quite well, but that was years and years ago, when things still made sense. Nowadays I keep putting off buying something because I can't figure out what's what!

    • by mrsalty ( 104200 )

      I have to disagree with you on this point. I replaced an Athlon64 3800+ machine with a Popcorn Hour because the Athlon simply did not have the power to play back 1080p content. It could manage 720p no problem, but would stutter on 1080p enough to make it unwatchable. It was also far harder to set up and use MythTV than it is the PH. Sure, the PH has its flaws, but it works out of the box with a remote and plays nearly any media you stream at it. Also, if you are starting from scratch, it is cheaper.
      -mrsalt

  • ...with using IBM's Sequoia for a graphics processor? Ok, they need to work on the price a little, and maybe it's a little bulky for most portables, oh and the power supply might need upgrading, but aside from that, what have the Romans ever done for us?

  • I'd buy some. (Score:4, Interesting)

    by Big Boss ( 7354 ) on Tuesday February 03, 2009 @02:27PM (#26714035)

    With recent developments in VDPAU, the HD-capable GPU acceleration for Linux, I could use this board. The only thing I would change is to make it wider and move all the ports to the back. Include an LCD or VFD if you want to get fancy, and an IR receiver on the front. Perfect MythTV frontend machine. I would like dual-channel RAM though, to help with 1080i playback.

    Put it in a nice small case like those used for modern DVD players, and they have a winner.

    • Re: (Score:3, Interesting)

      by QuasiEvil ( 74356 )

      My first thought exactly - gimme, gimme, gimme, need a new Myth frontend. Let's see - low power, good Linux-supported decompression acceleration, and has an HDMI port. This is exactly what I've been waiting on.

    • VDPAU still can't play a lot of video.

      Another important thing: VDPAU can't utilize memory allocated by TurboCache, and the 9400M doesn't have enough integrated RAM to decode HD video (not sure about 720p, but it surely can't do 1080p).

  • 25 Watts? (Score:3, Informative)

    by Bruce Perens ( 3872 ) * <bruce@perens.com> on Tuesday February 03, 2009 @02:28PM (#26714065) Homepage Journal
    25 watts during playback? Huh? This is more than twice what my current netbook with Intel graphics uses. It generally runs less than 11 watts, and I can get it to a bit less than 9 with some tweaking. I don't suppose anyone's seriously proposing an Atom platform that pulls 25 watts during playback.

    Bruce

    • But Intel graphics can't do 1080p at 9W or 11W.

      • Re: (Score:3, Insightful)

        by Bruce Perens ( 3872 ) *

        But Intel graphics can't do 1080p

        Perhaps because there isn't optimized MPEG playback code for that chipset?

        Part of the problem here is that on the desktop, Intel's vendors don't want great Intel graphics, they want to be able to sell up to an external display card. So, it's only the laptop platform that could drive an improvement in Intel graphics.

      • But Intel graphics can't do 1080p at 9W or 11W.

        Not that it even matters on the 1024x600 screen on most netbooks (or the 1280x800 screen on a select few). Oh, you were talking about hooking it up to an HDTV monitor. Well, that's what the HTPC or game console is for. At the moment it really seems like the Ion platform is aiming for a niche that barely exists. Now, for a really low-powered HTPC, this might show some promise in another generation or two. But at the moment, I'll pass.
    • by afidel ( 530433 )
      The Intel GMA950 which can do 1080p pulls greater than 20W just for the chipset.
      • For the last time, the chipset you are referencing is the DESKTOP variant, not the MOBILE variant.

        Mobile 945: 7W TDP [intel.com].

        945 Desktop: 22W TDP [intel.com].

        Nobody in their right mind is going to put a desktop chipset in a netbook. So yes, the 25W consumed by this test platform (4W for CPU + 20W for chipset) IS significantly more than most netbooks (4W for CPU + 7W for chipset), and it raises the question of whether this is a viable solution for HD playback.

    • Aye, that seems to be what they are proposing.

      Bruce
    • The atom platform is a joke. It's competing with ARM SoC solutions that draw under 2W and can play 720p H.264 in this power envelope, dropping to around 15mW if all you're doing is playing music. It's an order of magnitude worse than its competitors, but the massive Intel marketing machine keeps screaming 'look at us! We're relevant!' and the tech press believes them.

      Unless you want to run Windows (and surely you don't, Bruce), there's no compelling reason to go with Intel for a portable.

      • Well, there's a MIPS laptop somewhere out there, on fire sale for less than $200. But where would I get an ARM laptop? I'm currently using an Acer Aspire One, and I have two of the 9-cell batteries, and can get 8 hours out of each. This is sufficient for my traveling and public speaking. I'd be happy to try an ARM laptop, if it's physically as nice as the Aspire One, and has a VGA output (because I want to project from the Linux system, not whatever Windows system they have at the venue), and doesn't cost m
        • Not out yet, but a few companies (including the ASUS spin-off Pegatron) are starting to use Freescale's i.MX515 in netbooks. This is a Cortex A8 running at 1GHz with integrated DSP and OpenGL 2 ES GPU. A few pre-production models are floating around and use around 0.5W playing back 720p H.264. A couple are scheduled for release around May.
      • Aha, but where can I as a consumer buy this magic ARM SoC solution laptop to run linux on?

  • While off-the-shelf PC games might not work that great on this combo, I suspect that there could be really good, beautiful looking games created and fine-tuned just for such a platform. Someone could possibly use this as the basis for a portable entertainment system to compete with the Nintendo DS, PSP, etc. XBox Portable anyone? If you consider that Atom-based systems would typically have smaller screens, so that you might be looking at a resolution of maybe 640x480 or 800x600 (or perhaps wide-screen aspec

    • by sznupi ( 719324 )

      I would like for such a standard to go even "lower", towards current netbooks as the base - there are already quite a lot of them, and the Poulsbo/new integrated Atom ones won't be significantly more powerful.

      But... there's either no need for it, or it won't bring anything noteworthy to the table - definitely not any big release, just casual games at most. And you can already find a lot of those that run nicely, and also a lot of really good, older "big" releases.

  • And, maybe AppleTV? I wouldn't imagine a lot of mini owners are gaming on them, and if they can get the price down to reasonable, why not?

    Will stick to my 45W Athlon X2/AMD 740G MythTV box until the AppleTV gets a tuner.
    But I would like to replace that aging G4 that I occasionally use.

  • A review of the ION platform with a dual core Atom 330 is here:

    http://www.pcper.com/article.php?aid=663 [pcper.com]

  • by jhfry ( 829244 ) on Tuesday February 03, 2009 @02:54PM (#26714521)

    I see this being the hot new frontend for MythTV. With VDPAU supported for HD decoding, fanless (or a quiet fan), an Atom processor, a bit of RAM, and an SD card for storage, I could make one hell of a nice tiny frontend.

    I want one now!
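
    For the curious, this is roughly the question a VDPAU-aware frontend asks the driver at startup: can you hardware-decode H.264 High profile, and up to what size? Below is a minimal sketch against the public libvdpau X11 API; it assumes a VDPAU-capable driver (e.g. Nvidia's) is installed, and error handling is trimmed for brevity. Build with: gcc vdpau_check.c -lvdpau -lX11

        /* Query VDPAU H.264 decode capability (sketch; minimal error handling). */
        #include <stdio.h>
        #include <stdint.h>
        #include <X11/Xlib.h>
        #include <vdpau/vdpau.h>
        #include <vdpau/vdpau_x11.h>

        int main(void)
        {
            Display *dpy = XOpenDisplay(NULL);
            if (!dpy) { fprintf(stderr, "no X display\n"); return 1; }

            VdpDevice dev;
            VdpGetProcAddress *get_proc;
            if (vdp_device_create_x11(dpy, DefaultScreen(dpy), &dev, &get_proc)
                    != VDP_STATUS_OK) {
                fprintf(stderr, "no VDPAU driver\n");
                return 1;
            }

            /* Entry points come through a driver-supplied dispatch table. */
            VdpDecoderQueryCapabilities *query;
            get_proc(dev, VDP_FUNC_ID_DECODER_QUERY_CAPABILITIES, (void **)&query);

            VdpBool ok;
            uint32_t max_level, max_mbs, max_w, max_h;
            query(dev, VDP_DECODER_PROFILE_H264_HIGH,
                  &ok, &max_level, &max_mbs, &max_w, &max_h);

            printf("H.264 High: %s, up to %ux%u\n",
                   ok ? "supported" : "not supported",
                   (unsigned)max_w, (unsigned)max_h);
            return 0;
        }

    In practice a player just selects its VDPAU output path and falls back to software decoding when a query like this fails.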

  • An Atom-based mini is a CPU power downgrade, and there is no way it will be a good buy at $600; maybe at $400-$500, but it should use DDR2, not high-cost DDR3 laptop RAM. And don't even think about not including a free mini-DP-to-DVI cable with it - $30-$100 more to use your own display?

    Apple may try to sell a system with this and 1GB of RAM at $600, but that will make the old mini still look good, even more so if it drops to $500 or less, and DDR3 will just make it cost more, as the CPU is so slow that it is better to g

    • Assuming that it plays 1080p correctly, the Ion platform would make for an excellent AppleTV. The only question is whether it will be cheap enough. But the extra horsepower of the Atom could allow the AppleTV to be used for other things.

      There were rumors about Apple using the Ion platform in the mini but I believe those to be false. The AppleTV appears to be a much more likely target.

      • I rather doubt Apple will downgrade the mini CPU-wise. They will add the Nvidia GPU, no doubt, but it will probably be an upgrade like the MacBooks and the Air!
        No Atom. The AppleTV, however, which still has the G4, would be an option - but is the Atom even faster than the G4? I am not sure about it!

    • by ogdenk ( 712300 )

      It would make a great MacBook Nano or low-cost edu box for Apple, to fill the niche the old Mac LC series and eMacs filled.

      I could see a Mac Nano being built. Maybe, just maybe, Apple could build a usable, decent little box to compete with $350 Walmart crap PCs, one that would not be a total POS with crap HW quality. I would pay $400 to get a genuine Apple-branded Atom-based machine that I could run OS X on without disgusting patches and hacks.

      The Atom is fast enough to not be a complete toy and with the Ion pl

  • I bought a WD TV Friday, and it is going back to Fry's tonight. The fact that it couldn't "aggregate" my MP3s or play back two thirds of my MST3K AVI files was a deal breaker.

    It also failed to indicate what the problem was when I (unknowingly) tried to use a non-HDCP-compliant HDMI cable. (It allows selecting up to 720p output on HDMI, but the configuration menu keeps reverting to showing "composite" as the display type.) I figured it out by swapping cables with my Philips (HDMI-equipped) DVD player.

    Remote sens

  • Is Tegra dead, then? It sure was promising... I have been waiting for nVidia to come out with retail items equipped with Tegra since Q3 2008. Atom + Ion seems a direct threat to Tegra. I waited as long as I could for Tegra, but due to need and desire I ended up with an iPhone. All I can find are the original plugs for Tegra technology: http://www.nvidia.com/object/product_tegra_600_us.html [nvidia.com] "The first NVIDIA Tegra 600 Series-based devices are expected to begin shipping in mid-2009".
