Ion Platform For Atom Tested With Games, HD Video 115
J. Dzhugashvili writes "Nvidia has already pulled the curtain off its Ion platform, which couples GeForce 9400 integrated graphics with Intel's Atom processor. But how does it perform? The Tech Report has taken the tiny Ion reference system for a spin in games and video decoding to see if the GeForce GPU really helps. The verdict? 1080p playback is actually smooth, and the whole system only draws 25W during playback. Fast-paced action games are another story—Half-Life 2, Quake Wars, and Call of Duty 4 are all choppy with a single Atom core and single-channel RAM, although they do run. TR concludes that Ion is nevertheless a clear improvement over Intel's 945G chipset, especially since Nvidia doesn't expect Ion-based Atom systems to cost significantly more than all-Intel ones." Update: 02/04 09:14 GMT by T: HotHardware is one of several other sites offering performance benchmark numbers on the new chipset.
But the real question is- (Score:2, Funny)
Does the Atom processor make the Internet faster? Because if not, I'm going back to a P4!
Re: (Score:2)
Does the Atom processor make the Internet faster? Because if not, I'm going back to a P4!
You're thinking of the Pentium !!! [suite101.com], which made the interwebs your bitch.
Damn (Score:1)
Looks like I didn't wait long enough to get the netbook.
Re: (Score:2)
I'm very happy with my eee901. I've debianised it and it's replaced my "big" 13-inch Vaio for casual use. I still use the Vaio for anything CPU-intensive, or if I want a bigger screen. Or for typing anything other than the odd email/slashdot post.
What happened to the dual core Atom chips?
Re: (Score:2)
I'm very happy with my eee901. I've debianised it and it's replaced my "big" 13-inch Vaio for casual use. I still use the Vaio for anything CPU-intensive, or if I want a bigger screen. Or for typing anything other than the odd email/slashdot post.
So it's 'replaced' your Vaio for casual use... yet you use the Vaio for anything CPU-intensive, when you want a bigger screen, or if you're typing something longer than a /. post. Serious question: what does that leave?
I've been eyeing netbooks, myself, but am having
Re: (Score:3, Interesting)
"Serious question: what does that leave?"
Email.
Slashdot.
Flashing Neo Freerunner with stuff.
Using it as a terminal into my servers for maintenance tasks.
Music & Movies (on the plane or sometimes hooked up to a big LCD)
Err...
Taking it places I wouldn't take a decent laptop. Or places I wouldn't think to take a normal one, but it's small enough to throw in the bag.
Seeing something on tv and wanting to look it up on wikipedia NOW and every other computer is out of reach and takes ages to boot...
I don't know if
Re: (Score:2)
To answer another bit of your post - the niche, for me, is fast boot (and wakeup from suspend is *damn* quick), very portable but also fully featured. It's not a mobile phone screen; it can do most of what a larger and more powerful machine can do. That's it, really.
It's just another debian device...
You may wish to take into account that I very nearly bought a Sony Vaio TZ a couple of years back, so *really* small is something that appeals to me.
Re:Damn (Score:5, Insightful)
Looks like I didn't wait long enough to get the netbook.
You can always wait longer for the ultimate configuration, but there is only so long you can wait before you need something to use.
Re: (Score:3, Informative)
Right. The only way you can really be screwed by new hardware coming out is if you buy right before a price reduction. If you pay attention to the market, you pretty much know when those are going to happen. Athlon X2 price drop when Conroe was released, Penryn price drop when Phenom II was released, etc.
Re: (Score:2)
Yeah, I have an Atom netbook with the Intel GPU, and it is a little slow for full-screen HD video (hulu). I would love it if an NVIDIA GPU were an option, even for significantly more money.
On the other hand, I got a netbook so I could escape the distractions of TV, video games, and home; and instead escape to a café where I can actually get work done.
nVidia is doomed. (Score:5, Interesting)
I hate to say it because they do good work, but I think nVidia is ultimately doomed as it is today. Everyone rips Intel's integrated 3D graphics, but they just keep getting better every year. Although AMD should have bought nVidia instead of ATI, they do own ATI, and so have a pretty good graphics system of their own. Eventually, both AMD and Intel are going to wind up with 3D calculations on the die in some fashion, and where is that going to leave nVidia?
Re:nVidia is doomed. (Score:4, Informative)
Insightful. If one looks at the post here today:
http://hardware.slashdot.org/article.pl?sid=09/02/02/2344208 [slashdot.org]
about the new Acer with Intel's highly integrated N280/GN40 chipset, you've got to wonder about the long-term viability of nVidia.
Re: (Score:1)
I've got to say, I didn't have terribly high hopes going in (~$300... and eMachines), but it plays 720p content just fine, edits my source code with very little lag, and can actually play back 1080p content well enough (when plugged into my TV... so the 1080 is actually useful).
Still sucks for games...but I don't play much.
Re: (Score:2)
x86-64 is an open standard. Who is to say they could not jump in with an x86 CPU? Yes, it sounds far-fetched, but VIA does have CPU tech, just very poor video. Maybe a future Nvidia/VIA mash-up is what they need. It would be nice to have some decent embedded competition.
An all-in-one CPU/GPU chip could house a dual-core N-VIA processor, memory controller, and GPU core with a standard IO interconnect, plus a compact south bridge for the SATA/Ethernet/sound/USB etc. Not very memory-bandwidth oriented, but nonetheless
Re: (Score:2)
NVIDIA in Linux - Super easy
Re: (Score:2)
Not any more. nVidia breaks kernels. AMD has a free driver for pretty much all of their GPUs.
nVidia + VIA would be nice, if they weren't morons and hadn't dropped the platform.
Re: (Score:1, Insightful)
nVidia is in no way hurting, nor will it be in the foreseeable future. Consumer entertainment graphics cards are a big slice of their pie, no doubt, but even if that slice were to go away, nVidia is where professionals turn for high-end data modeling etc.
http://www.nvidia.com/object/tesla_computing_solutions.html
we're talking TFLOPS of GPU power in a 1U rackmount.
Re: (Score:3, Insightful)
The history of computing is littered with the corpses of high end outf
Re: (Score:2)
Ray-tracing at 640x480@10fps, anyone?
Fixed that for you. Those figures would probably not be too far off, and raster graphics can look within 90% of the quality that ray-tracing achieves while doing only 10% of the computation.
Also, the still renders using ray-tracing that Intel has put out are often done with global illumination. The scenes are also intentionally assembled in a way that makes ray-tracing look dramatically better than the same scene rendered with extremely basic raster ren
Re: (Score:3, Informative)
Raster rendering used to be extremely slow as well, before a lot of R&D money was pumped into making cheap hardware with enough oomph to do it well on affordable computers.
I had a PowerVR2-based card back in 1996 and it struggled with Quake 2 at 512x384 on a K6 233. Software rendering was almost as fast. The 3DFX Voodoo1 was less than impressive as well.
Give it time, raytracing hardware will become viable eventually.
Re: (Score:2)
Eventually could take a very long time... Then there are consoles, which are also a big market to fight for. Putting the CPU and GPU, the two biggest power draws in a modern computer, on the same die isn't exactly without drawbacks. Sure, latency would improve, but that means more memory lines, more power lines, and more heat to dissipate from the same area. Given AMD's breathing problems on the CPU side, I'd say the GPU wars are in much better shape than the CPU wars.
Re: (Score:1)
ION is about a very targeted market (mobile gamers and HD enthusiasts). They are very willing to pay 2x-3x the profit (not the cost) for a mobile gaming/HD product.
I see Nvidia as having a good future, as they are listening to their customers, and not trying to predict the market.
I am not interested in ION at all, but it delivers the goods to those that want it.
Re: (Score:2)
So while you are right that they're aiming at a very targeted market, they don't have to be. They could make of this platform what iTV should have been. And
Re:nVidia is doomed. (Score:4, Insightful)
nVidia/ATI will end up going the way of Creative. It used to be that to get any sort of decent sound you were required to buy a PCI sound card. I'm out of the hard-core gaming scene, but I don't know anyone that uses anything but integrated sound. When I can get 7.1 sound from my motherboard, why would I consider buying something else?
Creative seriously fucked up the sound card market to try and corner it and wound up destroying audio on the PC. Most of the serious competition got bought up or put out of business by Creative's 'win by any means necessary' plan.
Re: (Score:3, Informative)
According to Anandtech, currently Creative still has the best game compatibility, because the game devs write to their cards, but Asus' Xonar line has better sound quality, and nearly the same level of game compatibility. I know if I were to build a new machine I'd take their advice on that, what with Creative's driver troubles, especially on x86-64.
http://anandtech.com/guides/showdoc.aspx?i=3497&p=5 [anandtech.com]
Based on Valve's stats [steampowered.com], it looks like only about 3.5% of Steam users have an X-Fi card. I do know of a la
Specifically. (Score:2)
Creative's 'win by any means necessary' plan.
We're talking about how Creative went and bought E-mu, then turned around and shut off the flow of chips to Turtle Beach. There was a nice little competition there and Creative just pissed all over it with a pretty sleazy play. I don't feel bad about Intel and Microsoft screwing Creative out of the equation at all.
Re:nVidia is doomed. (Score:4, Funny)
and where is that going to leave nVidia?
Asking for a bailout?
Re: (Score:2)
Except that the performance for Intel's integrated graphics is still junk. NVidia's 9400M chipset at least offers decent performance, and IMHO is poised to take a lot of market share from products that are currently Intel's.
AMD? AMD is a has-been. They bought ATI how long ago? They've been promising a CPU-GPU hybrid for years now, and it's always "just around the corner". As far as I'm concerned I'll believe it when I see it, because AMD doesn't look like they're capable of delivering on that promise.
The da
Re: (Score:2)
Playing a movie is not a terribly interesting problem. Neither is Photoshop anymore, for that matter. Playing a movie only becomes interesting because the number of pixels being pushed around has increased dramatically at the same time encoding methods have gotten more sophisticated.
All of the effort spent on speeding up games can probably be easily re-purposed for things like "simple movies" if you're just a little clever about it. That's all this really is. Nvidia as usual is just being smarter than th
Re: (Score:2)
and it would be nice if a $300 PC could keep up with a $300 (HD/BD)DVD player.
Your HD/BD player doesn't come with a screen, keyboard, battery, and all the trimmings... this isn't even a fair comparison. At this stage, expecting a netbook-level device to handle HD movies is simply ridiculous.
And that's precisely my point: all the effort spent speeding up games is easily re-purposed for playing HD movies, and this is in fact *why* NVidia will succeed where Intel fails. Intel has always made poor-performing integrated solutions on the expectation that only gamers need performance. The reality
Re: (Score:1)
I hate to say it because they do good work, but I think nVidia is ultimately doomed as it is today. Everyone rips Intel's integrated 3D graphics, but they just keep getting better every year.
And nVidia's graphics aren't getting any better? A GPU and a CPU stuffed together into the same chip will always be a low-cost/low-power/low-end solution; it can never come close to the capability of a GPU that has a whole die to itself. If Intel/AMD has ~2B transistors in a chip divided between a general-purpose CPU and a GPU, can that ever match a 2B-transistor discrete GPU plus a discrete CPU? Unlikely. Plus, CUDA has put forth interesting possibilities for putting the GPU to other uses.
Although AMD should have bought nVidia instead of ATI, they do own ATI, and so have a pretty good graphics system of their own.
And they sho
What about power consumption? (Score:4, Insightful)
How does the ION chipset compare in power consumption with the mobile 945 used in netbooks (the 6W TDP one, not the 20W+ TDP desktop variant that's a total joke).
25W for CPU, Chipset, HD, Memory, motherboard doesn't seem as low as it could be.
Still, if they can get 8 hours out of a 6 cell battery in a netbook with it, great. It's a far far far more advanced chipset than the Intel crud.
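For a rough sense of what those wattages mean for battery life, here's a back-of-envelope sketch in Python. The 48Wh pack size is an assumed figure for a typical 6-cell battery, and the 11W netbook number is just the 4W CPU + 7W mobile-chipset TDPs added together, not a measurement:

    # Crude battery-life estimate: hours = pack capacity (Wh) / draw (W).
    battery_wh = 48.0  # assumed 6-cell pack; real packs vary
    systems = {
        "Ion during 1080p playback": 25.0,  # figure from the review
        "945 netbook (4W CPU + 7W chipset)": 11.0,  # assumed sum of TDPs
    }
    for name, watts in systems.items():
        print(f"{name}: {battery_wh / watts:.1f} hours")

By that crude math, hitting 8 hours on a 6-cell pack means the whole system has to average about 6W, so idle draw matters far more than the playback peak.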
Re: (Score:2)
25W for CPU, Chipset, HD, Memory, motherboard doesn't seem as low as it could be.
That's the power consumption while playing HD video. Even the little Roku box is supposed to peak over 5W, and ALL it does is play video; it has no storage to speak of and no GPU to speak of (but it does have a dedicated video decoder). Running the LCD backlight is probably one of the big loads, but using a general-purpose GPU (it's not just for pushing pixels any more, after all) in this application is necessarily going to hurt power consumption.
I think it's pretty fantastic for what it is. I think the best
Re: (Score:1)
Can't we get a dedicated chip for the pipeline of network pack
Valve games (Score:3, Interesting)
Well of course Half Life 2 is choppy on the platform -- the Source engine is very CPU-intensive. Almost every system is going to be CPU-bound with Valve games, unless you happen to be running a Core i7 with an entry-level video card, at 1920x1200. As for the other games, you're still running on integrated graphics, and there's only so much you can do before you need a separate card.
Disclaimer: I work for Intel, but I've been a fan of Valve's games for much longer than that.
Re: (Score:1)
I've been running HL2 on an Atom 330 with a PCI ATI 2400 (note: old-school PCI, not PCIe), and except for a little stutter when levels start, it's been great.
Then again, I'm running 1024x768 (my monitor's max res).
I do wonder why everyone is so gung-ho about the single-core Atom... isn't the dual-core only about $5 more?
That's worth it for me... add an Nvidia chipset on top (please do CUDA) and it'd be beautiful.
Re: (Score:2)
Plus most hardware review sites tend to say "If you can't play it at full settings, it's not worth it." ;)
I first played HL2 on a GeForce 4 MX 440, at 640x480, on minimal settings. That was alright, but the real problem was that if you set the game to DX7 (which is all the GF4 was capable of), then the draw distance for models was greatly reduced. That meant the chapter where you had to use your grav gun to get objects to stand on top of the sand was a real pain, cause you couldn't see any boxes that were more than 15 feet away unless you zoomed in, and you couldn't use the grav gun while zoomed. Zoom in, find an object, zoom out, pull it, zoom back in to see if you lost it, repeat.
Re: (Score:1)
That meant the chapter where you had to use your grav gun to get objects to stand on top of the sand was a real pain, cause you couldn't see any boxes that were more than 15 feet away unless you zoomed in,
BAM! Antlioned.
and you couldn't use the grav gun while zoomed. Zoom in, find an object, zoom out, pull it, zoom back in to see if you lost it, repeat.
Sorry, did I interrupt your train of thought?
Re: (Score:2)
I agree - the dual-core (four pseudo-cores if you include hyper-threading) Atom CPU is a very capable machine. I'm not a gamer, but I did get the Intel D945GCLF2 MB with the dual-core CPU, Gigabit LAN, and dual SATA ports for a test/trainer 64-bit Windows Server 2008 machine and it works great. Is it the fastest machine I own, no, but as a build-up/test/tear-down box it works very well, and if I leave it on, the power drain is minimal. I do wish it had something other than a PCI slot (PCIe or even PCI-X wou
Re: (Score:2)
Intel is restricting the dual-core Atom to desktops AFAIK. Probably something like 80-90% of Atoms are in netbooks, so the dual-core Atom is not an option for most people, for now at least.
Intel will hate it. (Score:2)
This is a shift away from the CPU to the GPU, and Intel will hate it.
This, or even the plain Atom, is good enough for a very large percentage of users.
This would work for just about every office PC, average home user, and media center.
About the only tasks this won't work for are media editing, gaming, or heavy technical use.
The one problem I see with it is the cost. That extra money is a big percentage of the cost of one of these mini systems.
I so want one.
Re: (Score:2)
This is a shift away from the CPU to the GPU, and Intel will hate it.
Well, they should have seen the writing on the wall. It is much easier to make an efficient specialised processor than an efficient generic one. With OpenCL on the horizon, don't be surprised to see computers with dual graphics chips (one used for OpenGL/DirectX and the other for OpenCL). Note I see OpenCL being especially beneficial to the physics side of games. If Intel has any sense they will either improve their graphics chips or invest in Nvidia.
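For anyone who hasn't seen what OpenCL offload looks like, here's a minimal sketch using the pyopencl bindings (assumes an OpenCL driver and pyopencl are installed; the kernel just adds two vectors, the sort of data-parallel work a physics engine would hand to a second chip):

    import numpy as np
    import pyopencl as cl

    a = np.random.rand(100000).astype(np.float32)
    b = np.random.rand(100000).astype(np.float32)

    ctx = cl.create_some_context()   # pick an available GPU (or CPU) device
    queue = cl.CommandQueue(ctx)
    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    # The kernel body runs once per element, in parallel, on the device.
    prg = cl.Program(ctx, """
    __kernel void add(__global const float *a,
                      __global const float *b,
                      __global float *out) {
        int gid = get_global_id(0);
        out[gid] = a[gid] + b[gid];
    }
    """).build()

    prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)
    result = np.empty_like(a)
    cl.enqueue_copy(queue, result, out_buf)  # copy the sums back to the host

The same source runs on whatever device the runtime exposes, which is exactly why a cheap second GPU dedicated to OpenCL work is plausible.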
Re: (Score:2)
With OpenCL on the horizon, don't be surprised to see computers with dual graphics chips (one used for OpenGL/DirectX and the other for OpenCL).
I believe that you are incredibly wrong - thank goodness. All you have to do is look at history to see that computers have been steadily moving towards commoditization. Even the last Cray machines were just bundles of commodity processors with some (very important and significant) glue.
Intel is simply going to put more of the functionality of GPUs into its CPUs. Meanwhile they are talking about a future with "thousands of cores" in the computers of mainstream users. While that is clearly a long way off if i
Re: (Score:3, Informative)
Actually, AMD is out front on putting GPU functionality into CPUs with the "fusion" platform. Intel is taking the long way around with Larrabee and putting x86 into the GPU. Go figure. Anyhow, the end result will be to reduce chip counts and take advantage of the high number of transistors that can cheaply be put on a singl
Re: (Score:2)
Actually, AMD is out front on putting GPU functionality into CPUs with the "fusion" platform.
Well, I'm not saying who will get there first or do it better; I don't know yet, and haven't done much research either. But the point is that's the way computing has always gone, and there is no reason to believe it will be different this time. The Amiga was a landmark computer because, with all its cheap little custom chips, it could kick the shit out of a 486 for multimedia tasks, but then PCs became completely commoditized and you could do with brute force and ignorance and cheap parts what the Amiga could d
Re: (Score:2)
Actually, I am not so sure.
For one thing, supercomputers often do include coprocessors for things like vector ops and such.
GPUs are becoming more and more important, not less and less. Intel has failed to make a good GPU; why is up for debate. Frankly, I think CPUs have for now reached a good-enough level for most people. Now they want lower power, heat, and size, and at best HD playback.
As for gaming, take a look at the Xbox 360 and PS3. They are very GPU-heavy and pretty CPU-light. Even the Cell's SPEs are really mo
Re: (Score:2)
Intel has claimed that they will get rid of the need for GPUs by adding GPU cores to their CPUs. NVidia has claimed that CPUs don't matter much anymore, and that their GPUs are what consumers really need to go forward.
Time will tell who is right, so I own both INTC and NVDA ;-)
A reasonable start (Score:5, Insightful)
Games performance isn't really the issue for these. These things aren't designed for games.
What these are best used for is Media Centre setups. However, it doesn't play all 1080p content smoothly, which is a major issue. There are plenty of options for this kind of thing: the Popcorn Hour, the WD TV box. Those are good to a point but fall down on format support, especially MKV, which doesn't have full subtitle and codec support on either.
The current best option is an energy-efficient Athlon-based setup. These cost about $75-$100 more than an Atom system and use a bit more power, but they'll play back any video you throw at them without dropping frames.
Maybe with a dual-core Atom and dual-core-optimised codecs this will reach the goal of never noticing a dropped frame, regardless of format and bit rate, but this Atom solution still isn't the media center beast it could be.
Re: (Score:2)
Games performance isn't really the issue for these. These things aren't designed for games.
Totally agree. Who is going to be playing Half Life 2 on an 8" or 10" screen? =P
Maybe some other games, but an FPS? With a trackpad? (Who wants to lug around a mouse with their ultra-portable netbook?)
Re:A reasonable start (Score:5, Interesting)
Totally agree. Who is going to be playing Half Life 2 on an 8" or 10" screen? =P
Certainly, but looking at platforms such as the PSP, Nintendo DS, and the iPhone, we can see that there is a market for games taking advantage of small-format screens. While Half Life 2 won't be targeted at these platforms, there are already FPS games for some of them, though then again we are more likely to see an Nvidia + ARM combination than an Nvidia + x86 combination, simply because of battery limitations.
Re: (Score:2)
I might be wrong, but I think any given netbook is going to run PSP/DS/iPhone-class games fairly well. The, for lack of a better term, computing power of an iPhone vs. an MSI Wind has to be tilted towards the Wind. I don't think Half-Life 2 is going to be running on an iPhone anytime soon :)
Re: (Score:2)
So, what's a good energy-efficient Athlon setup? With dual core? And 64-bit?
I used to know both Intel and AMD CPUs quite well, but that was years and years ago, when things still made sense. Nowadays I keep putting off buying something because I can't figure out what's what!
Re: (Score:1)
I have to disagree with you on this point. I replaced an Athlon64 3800+ machine with a Popcorn Hour because the Athlon simply did not have the power to play back 1080p content. It could manage 720p no problem, but would stutter on 1080p enough to make it unwatchable. It was also far harder to set up and use MythTV than it is the PH. Sure, the PH has its flaws, but it works out of the box with a remote and plays nearly any media you stream at it. Also, if you are starting from scratch, it is cheaper.
-mrsalt
What's wrong... (Score:2)
...with using IBM's Sequoia for a graphics processor? Ok, they need to work on the price a little, and maybe it's a little bulky for most portables, oh and the power supply might need upgrading, but aside from that, what have the Romans ever done for us?
I'd buy some. (Score:4, Interesting)
With recent developments in VDPAU, the HD-capable GPU acceleration for Linux, I could use this board. The only thing I would change is to make it wider and move all the ports to the back. Include an LCD or VFD if you want to get fancy, and an IR receiver on the front. Perfect MythTV frontend machine. I would like dual-channel RAM though, to help with 1080i playback.
Put it in a nice small case like those used for modern DVD players, and they have a winner.
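For the curious, the playback side really is that simple once VDPAU is in place. A sketch of what a frontend wrapper might do, assuming an MPlayer build with VDPAU support and NVIDIA's VDPAU-capable driver (the script and file path here are hypothetical):

    import subprocess

    def play_hd(path):
        # Hand H.264/MPEG-2 decode to the GPU through MPlayer's VDPAU output.
        subprocess.call([
            "mplayer",
            "-vo", "vdpau",                      # render via VDPAU
            "-vc", "ffh264vdpau,ffmpeg12vdpau",  # GPU-decoded H.264/MPEG-2
            path,
        ])

    play_hd("/media/video/sample-1080p.mkv")

With decode on the GPU, the single Atom core mostly just shuffles bitstream data, which is what makes a quiet little frontend box plausible.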
Re: (Score:3, Interesting)
My first thought exactly - gimme, gimme, gimme, need a new Myth frontend. Let's see - low power, good Linux-supported decompression acceleration, and has an HDMI port. This is exactly what I've been waiting on.
Re: (Score:1)
VDPAU still can't play a lot of video.
Another important thing: VDPAU can't utilize memory allocated via TurboCache, and the 9400M doesn't have enough integrated RAM to decode HD video (not sure about 720p, but it certainly can't do 1080p).
Re: (Score:1)
Well, I'm waiting for the Asus Eee Box B204 and B206 to arrive. They have an ATI Radeon HD 3450 with 256 MB of DDR2 memory and HDMI output.
And ATI's accelerated HD playback is *not* supported under Linux. They barely manage video playback via XVideo.
25 Watts? (Score:3, Informative)
Bruce
Re: (Score:1)
But Intel graphics can't do 1080p at 9W or 11W.
Re: (Score:3, Insightful)
Perhaps because there isn't optimized MPEG playback code for that chipset?
Part of the problem here is that on the desktop, Intel's vendors don't want great Intel graphics; they want to be able to upsell to an external display card. So it's only the laptop platform that could drive an improvement in Intel graphics.
Re: (Score:2)
Not that it even matters on the 1024x600 screen on most netbooks (or the 1280x800 screen on a select few). Oh, you were talking about hooking it up to an HDTV monitor. Well, that's what the HTPC or game console is for. At the moment it really seems like the Ion platform is aiming for a niche that barely exists. Now, for a really low-powered HTPC, this might show some promise in another generation or two. But at the moment, I'll pass.
Re: (Score:2)
For the last time, the chipset you are referencing is the DESKTOP variant, not the MOBILE variant.
Mobile 945: 7w TDP [intel.com].
945 Desktop: 22w TDP [intel.com].
Nobody in their right mind is going to put a desktop chipset in a netbook. So yes, the 25W consumed by this test platform (4W for CPU + 20W for chipset) IS significantly more than most netbooks (4W for CPU + 7W for chipset), and it raises the question of whether this is a viable solution for HD playback.
Re: (Score:1)
Bruce
Re: (Score:1)
The Atom platform is a joke. It's competing with ARM SoC solutions that draw under 2W and can play 720p H.264 in that power envelope, dropping to around 15mW if all you're doing is playing music. It's an order of magnitude worse than its competitors, but the massive Intel marketing machine keeps screaming 'look at us! We're relevant!' and the tech press believes them.
Unless you want to run Windows (and surely you don't, Bruce), there's no compelling reason to go with Intel for a portable.
Re: (Score:2)
Aha, but where can I, as a consumer, buy this magic ARM SoC laptop to run Linux on?
Game engines targeted at this platform? (Score:2)
While off-the-shelf PC games might not work that well on this combo, I suspect that there could be really good, beautiful-looking games created and fine-tuned just for such a platform. Someone could possibly use this as the basis for a portable entertainment system to compete with the Nintendo DS, PSP, etc. Xbox Portable, anyone? If you consider that Atom-based systems would typically have smaller screens, you might be looking at a resolution of maybe 640x480 or 800x600 (or perhaps wide-screen aspec
Re: (Score:2)
I would like such a standard to go even "lower", towards current netbooks as the base; there are already quite a lot of them, and the Poulsbo/new integrated Atom ones won't be significantly more powerful.
But... there's either no need for it, or it won't bring anything noteworthy to the table; definitely no big releases, just casual games at most. And you can already find a lot of those that run nicely, and also a lot of really good, older "big" releases.
Mac Mini (Score:1)
And maybe AppleTV? I wouldn't imagine a lot of mini owners are gaming on them, and if they can get the price down to something reasonable, why not?
I'll stick with my 45W Athlon X2/AMD 740G MythTV box until AppleTV gets a tuner.
But I would like to replace that aging G4 that I occasionally use.
They only tested the single core CPU...? (Score:2, Insightful)
A review of the ION platform with a dual core Atom 330 is here:
http://www.pcper.com/article.php?aid=663 [pcper.com]
My new mythtv frontends!!!! (Score:4, Interesting)
I see this being the hot new frontend for MythTV. With VDPAU support for HD decoding, a fanless or quiet-fan design, an Atom processor, a bit of RAM, and an SD card for storage, I could make one hell of a nice tiny frontend.
I want one now!
An Atom-based Mac mini is a CPU power downgrade an (Score:2)
An Atom-based mini is a CPU power downgrade, and there is no way it will be a good buy at $600; maybe at $400-$500, but it should use DDR2, not high-cost DDR3 laptop RAM. And don't think about not including a free Mini DisplayPort to DVI cable with it; $30-$100 more just to use your own display??
Apple may try to sell a system with this and 1GB of RAM at $600, but that will make the old mini still look good, even more so if it drops to $500 or less, and DDR3 will just make it cost more, as the CPU is so slow that it is better to g
The Atom + Ion platform better suited for AppleTV (Score:2)
Assuming that it plays 1080p correctly, the Ion platform would make for an excellent AppleTV. The only question is whether it will be cheap enough. But the extra horsepower of the Atom could allow the AppleTV to be used for other things.
There were rumors about Apple using the Ion platform in the mini, but I believe those to be false. The AppleTV appears to be a much more likely target.
Re: (Score:2)
I rather doubt Apple will downgrade the mini CPU-wise. They will add the Nvidia GPU, no doubt, but it will probably be an upgrade like the MacBooks and the Air!
No Atom. The AppleTV, however, which still has the G4, would be an option, but is the Atom even faster than the G4? I am not sure about it!
Re: (Score:2)
It would make a great MacBook Nano or low-cost edu box for Apple, filling the niche the old Mac LC series and eMacs filled.
I could see a Mac Nano being built. Maybe, just maybe, Apple could build a usable, decent little box to compete with $350 Walmart crap PCs that would not be a total POS with crap hardware quality. I would pay $400 to get a genuine Apple-branded Atom-based machine that I could run OS X on without disgusting patches and hacks.
The Atom is fast enough to not be a complete toy and with the Ion pl
WD TV (Score:2)
I bought a WD TV on Friday, and it is going back to Fry's tonight. The fact that it couldn't "aggregate" my MP3s or play back two thirds of my MST3K AVI files was a deal breaker.
It also failed to indicate what the problem was when I (unknowingly) tried to use a non-HDCP-compliant HDMI cable. (It allows selecting up to 720p output on HDMI, but the configuration menu keeps reverting to showing "composite" as the display type.) I figured it out by swapping cables with my Philips (HDMI-equipped) DVD player.
Remote sens
Kate (Score:1)
What of Tegra? (Score:1)