Killer Mobile Graphics — NVIDIA's GeForce 8800M
MojoKid writes "Today NVIDIA unveiled the much-anticipated GeForce 8800M series of mobile graphics processors. The GeForce 8800M is powered by the new G92M GPU which is built on a 65nm manufacturing process and shares a lineage with the desktop-bound G92 GPU on which NVIDIA built their GeForce 8800 GT. The 8800M series will come in two flavors, a GTX and a GTS, with different configurations of stream processors, 64 for the GTS model and 96 for the high-end GTX."
Effects on Battery Life? (Score:1, Insightful)
Although to be fair they're probably targeting 17" "laptops" anyway, and at that size I guess it's fair to say battery life shouldn't be a primary concern (heck if you're carrying it around I'd be more worried about getting a hernia).
Re:Effects on Battery Life? (Score:4, Informative)
Well, it's supposedly more miserly than the 512MB 8600M GT, which I have. With a 1.6 GHz C2D, 2GB RAM, XP Pro, and an 85WHr battery, I can get over 5 hours of battery life, which I think is dang good considering that I can play games quite well. That, of course, is not while gaming, only web browsing and using IM.
Of course, this will vary from laptop to laptop, YMMV.
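For a rough sense of the arithmetic: an 85 WHr battery lasting just over 5 hours works out to about a 17 W average system draw. A back-of-the-envelope sketch in Python (the 50 W gaming load is an assumed figure, not a measurement, and this ignores conversion losses and battery aging):

# Battery runtime math: watt-hours divided by watts gives hours.
def average_draw_watts(capacity_wh, hours):
    return capacity_wh / hours

def runtime_hours(capacity_wh, watts):
    return capacity_wh / watts

print(average_draw_watts(85, 5))  # 17.0 W average draw while browsing
print(runtime_hours(85, 50))      # 1.7 h under a hypothetical 50 W gaming load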
Re:Effects on Battery Life? (Score:5, Insightful)
Re: (Score:2)
Re: (Score:1)
If you're buying a laptop because it has this graphics chip, battery life is secondary to frames per second. The people buying these laptops buy them because they're suited to playing games anywhere while plugged in, not for traveling and off-site work.
I'd like to second this. I thought I would be chiefly concerned about battery life, so I paid extra for the 12-cell battery, which, when I undervolted my CPU, gave me 6.5 hours of laptop time.
Later I realized it's only about once a year that I actually make use of that time. All other times I have an outlet nearby.
Note: the weight of the 12-cell is about the weight of a 6-cell plus a charger.
And I was stuck with a laptop that played WoW on the lowest settings at 8 fps.
Now I would go for a compact-ish laptop (and try to keep it light),
Re:Effects on Battery Life? (Score:5, Informative)
Maybe the NVIDIA Technical Brief will yield some answers: http://www.nvidia.com/object/IO_26269.html [nvidia.com] (Warning, spawns a PDF)
PowerMizer 7.0 Power Management Techniques:
Use of leading edge chip process
CPU load balancing
Intelligent GPU utilization management
Revolutionary performance-per-watt design
PCI Express power management
Aggressive clock scaling
Dedicated power management circuits
Display brightness management
Adaptive performance algorithms
CPU Offload Example (from NVIDIA's Technical Brief)
Figures 3 and 4 (see PDF) show CPU utilization when running a Blu-ray H.264 HD movie using the CPU and GPU, respectively. You can see that with GPU video playback, 30% fewer CPU cycles are being used. This dramatic reduction in CPU usage means less power is consumed by the processor, which reduces system power consumption and results in longer battery life.
Note: Testing was conducted on an Intel Centrino-based platform with a 2 GHz Core 2 Duo processor and a GeForce 8600M GS, running InterVideo WinDVD 8 playing a Casino Royale H.264 Blu-ray disc.
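To put that 30% figure in perspective, here is a hedged sketch of how offloading decode can stretch runtime. None of the wattages below come from NVIDIA's brief; they are assumptions for illustration, and CPU power does not scale perfectly linearly with utilization:

# Hypothetical playback power budget (all wattages assumed, not measured).
BASE_SYSTEM_W = 15.0  # display, chipset, disk, etc.
CPU_DECODE_W = 20.0   # CPU package power during software decode
GPU_DECODE_W = 3.0    # extra GPU power when decoding in hardware

CAPACITY_WH = 85.0

# GPU-assisted path: CPU load (and, roughly, CPU power) drops by 30%.
cpu_path_w = BASE_SYSTEM_W + CPU_DECODE_W
gpu_path_w = BASE_SYSTEM_W + CPU_DECODE_W * 0.7 + GPU_DECODE_W

print(f"CPU decode: {CAPACITY_WH / cpu_path_w:.1f} h")  # ~2.4 h
print(f"GPU decode: {CAPACITY_WH / gpu_path_w:.1f} h")  # ~2.7 h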
Re: (Score:2)
Power usage not a win (Score:2)
The real benefit here is the ability to use the slowest, lowest-powered chip you can find in a media center and still be able to decode HD.
Re: (Score:1, Interesting)
Then why even bother reading articles about high-end graphics cards? Were you hoping that somehow nVidia managed to build in magical transistors that use less power the more of them there are? Everyone knows (or should know) that high-end graphics cards use a lot more energy than low-end ones. Partly because they run at much higher speeds, partly because they have a lot more transistors. This one in particul
Re: (Score:3, Informative)
That effectively decreases the number of processors and of course saves a *lot* of Watts.
Re:Effects on Battery Life? (Score:4, Interesting)
Actually, as time goes on I do less and less live recording, and when I do, I just use one of these extra-cool portable digital recorders and go somewhere quiet. When you work in a home studio, you make adjustments.
But I still perk up whenever I hear about "decreased voltage". I'm all for saving watts.
ATI's new desktop graphics cards do this (Score:3, Informative)
As for PowerPlay, which is usually found in mobile GPUs, AMD has opted to include broader power management support in their desktop GPUs as well. While they aren't able to wholly turn off parts of the chip, clock gating is used, as well as dynamic adjustment of core and memory clock speeds and voltages. The command buffer is monitored to determine when power saving features need to be applied. This means that when applications need the power of the GPU it will run at full speed, but when less is going on (or even when something is CPU limited) we should see power, noise, and heat characteristics improve. One of the cool side effects of PowerPlay is that clock speeds are no longer determined by application state. On previous hardware, 3D clock speeds were only enabled when a fullscreen 3D application started. This means that GPU computing software (like Folding@home) was only run at 2D clock speeds. Since these programs will no doubt fill the command queue, they will now get full performance from the GPU.
Hopefully, Nvidia will follow ATI's lead.
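The description above boils down to a feedback loop: watch how full the command buffer is and pick clocks and voltages to match, rather than keying off "is a fullscreen 3D app running". A toy Python sketch of that idea (this is not AMD's actual algorithm; the performance states and thresholds below are invented for illustration):

# Toy DVFS governor in the spirit of the PowerPlay description above.
# Performance states as (core MHz, memory MHz, core volts) -- invented values.
P_STATES = [
    (200, 400, 0.90),  # idle / 2D
    (400, 700, 1.00),  # light load
    (600, 900, 1.10),  # full 3D / GPGPU
]

def pick_state(queue_fullness):
    """Scale up as the command buffer fills, down as it drains.

    queue_fullness is the fraction (0.0-1.0) of the command buffer in
    use. Because the decision keys off the queue rather than the
    application state, GPGPU work like Folding@home fills the queue
    and gets full clocks too -- the point made above.
    """
    if queue_fullness > 0.60:
        return P_STATES[2]
    if queue_fullness > 0.20:
        return P_STATES[1]
    return P_STATES[0]

core, mem, volts = pick_state(0.85)
print(f"core {core} MHz, mem {mem} MHz, {volts:.2f} V")  # full 3D state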
Re: (Score:2)
Powerful video card? Try an 8800 GT. Want it quiet? Try swapping out the cooler for an Accelero S1 passive heatsink. It will run significantly cooler than the stock active cooling solution... and if you want to get fancy (and really drop your temps), attach a low-speed 120mm (i.e., nice and quiet) fan to it.
If you pay attention to noise when designing your system, you should be fine. I've got a completely
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
Alienware already has two 8800M GTX models (Score:5, Informative)
http://www.alienware.com/intro_pages/m17x_m15x.aspx [alienware.com]
NVIDIA GeForce 8800M Link: http://www.nvidia.com/object/geforce_8M.html [nvidia.com]
Re: (Score:3, Insightful)
Re: (Score:2)
I might, if I had money to burn on a gaming rig, and people were going to see it. They do have that 'look at me, I'm well off' sheen.
Re: (Score:2)
In much the same way Monster cables have that "I have plenty of money and no problem parting with it" air about them. Which is a very strong attractor for certain kinds of chicks.
Re: (Score:1)
Re: (Score:2)
Which is kind of ironic, since those chicks will have no idea what a Monster cable or an Alienware laptop looks like. They generally go by how shiny your car is.
Re: (Score:2)
iMac (Score:3, Informative)
Unlikely. (Score:2, Interesting)
Could be a completely unfounded rumor, so take it with a grain of salt, but it does sound like the Apple we know and love [to hate].
By the way -- this is why I love being a PC/Linux user. I can buy whatever hardware I want; I'm not bound by the moods of His Holy Turtleneckness. The disadvantage is that it has to be something with a Linux driver, but in a pinch, I can
Re: (Score:2, Informative)
My work machine is a brand-new MacBook Pro. It's got an nvidia card in it. Like all MacBook Pros do. [apple.com] So does the Mac Pro. [apple.com]
Re:Unlikely. (Score:5, Informative)
Apple was going 100% ATI, but then ATI leaked to the press how they'd got the contract, and Jobs was furious. He really HATES secrets getting let out (I've no idea why; it seems to be industry-standard practice. But if you ever happen to enter an NDA with Apple, you'd better honour it!)
Anyway, Apple pulled the contract and shifted every Mac they could to nvidia. However, for some reason they didn't shift the iMac despite shifting everything else. I have a vague suspicion it is to force Apple developers to always code in a GPU-independent way (basically to keep nvidia honest), but as an iMac owner, I find it very annoying.
Re: (Score:3, Informative)
You have to go pretty far back in Apple's product line to find a point where there wasn't a pretty even mixture of video card combinations available.
Re: (Score:2)
Nvidia gets the MacBook Pro and the default choice for the Mac Pro -- 2 lines
Intel gets the Mini and the MacBook -- 2 lines
ATI gets the iMac, the Xserve, and a choice for the Mac Pro -- 3 lines
So ATI wins on number of product lines.
But in terms of total units to Apple, I suspect Intel is the winner, followed by Nvidia.
Re: (Score:1, Insightful)
That's awesome! So... how many drivers have you written? Or how many drivers do Linux users write, on average?
Users don't write drivers. The Macintosh presents a complete system, which Just Works without someone having to worry about what component is used and what is not used.
Just works... (Score:2)
Like, say, when you plug in a peripheral that doesn't work.
Dude, I know. I had a PowerBook. I know what the Mac Experience is, and I know why it's attractive.
I also know that the second you want some non-standard hardware, or, really, non-standard anything, it's a crapshoot. And as a Linux user, when I say hardware support is a "crapshoot", it means really, really fucking bad.
But as you may not realize, I actually am a software developer -- so if I really can't find anyone
Re: (Score:2)
Apple isn't stupid. They remember what happened back in the late 1990s - their only graphics vendor was ATI who was able to completely dictate what g
Re: (Score:2, Insightful)
Re: (Score:1)
Re: (Score:1)
Re: (Score:2, Informative)
Re: (Score:2)
Re: (Score:3, Informative)
Yep, that might be true... but to get to 256MB of graphics memory you have to spend $2499.00 US. That's just crazy. (MacBook Pro 15") I'm sorry, but I'll just get an iMac and purchase a cheap PC-based laptop, and
Re: (Score:1)
Example: their current pro portable chip is the 8600M, not the fastest one they could have gone with, and they even underclocked it below its standard speed.
Re: (Score:2)
Looks nice... (Score:1)
Passively cooled desktop cards? (Score:5, Interesting)
Re:Passively cooled desktop cards? (Score:5, Informative)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Eh, there are silent cards already (Score:2)
Passive cards are nothing new, and you can get them with regular desktop GPUs. Just shop around a bit. I got one; granted, I keep it extra cool with a very large (but slow-moving) fan installed on the case that blows on it, but that is just to be safe. It has enough metal strapped to it to cool it with normal airflow in your case.
Wanting a laptop GPU in your desktop is just silly, unless you want to consume less power. Passive cooling has been around for a long time and exists for the CPU as well. Just
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Much better than simply putting the drives on rubber grommets.
If you also combine this with a low-noise or passively cooled PSU, have all remaining fans rubber-mounted, and put some noise-absorbing material on the inside of your chassis, you'll have a nearly silent PC.
Re: (Score:1)
My plan is to wait until these drop below $100, then get another for a silent SLI configuration.
MSI NX8600GTS-T2D256EZ HD GeForce 8600GTS 256MB 128-bit GDDR3 PCI Express x16 HDCP Ready SLI Supported Video
Somewhat unrelated but... (Score:1)
Wait... (Score:3, Funny)
Wait ten days... 8800GTSE2 with pack-in game
Wait ten days... 8800GTSE OVERCLOCKED EXTREME EDITION
Wait ten days... 8850LE
Wait ten days... 8850GTS
Wait six months... driver update makes 8850GTS 25% slower.
Wait ten days... 9800GT
Wait three months... driver update makes 8850GTS 25% slower.
Wait ten days... 9850GTS
Wait three months... driver update makes 9800GT 25% slower.
This is the song that never ends!
Yes it goes on and on my friends...
will it work on existing laptops (Score:1)
Re: (Score:2)
Geez. I think someone got lost on the intertubes here.
Re: (Score:2, Informative)
Don't bring flames if you don't know what you are talking about.
Re: (Score:1)
So, in conclusion I think this will probably fit in the Del
Re: (Score:1)
Re: (Score:2)
Can these be bought without a laptop? (Score:2, Interesting)
Yes, I was dumb enough to buy a lappy from Alienware...definitely catches attention in the airport, though.
Re: (Score:1)
Dell & Alienware use a proprietary form factor which is not generally available elsewhere.
I already paid 300% of what I should have on a GPU upgrade from Dell. Not only was the process thoroughly frustrating and overpriced, but my laptop just barely gets by with the approved Dell upgrade.
I was looking in this thread for vendor recommendations, but sadly I don't see any. So here are mine: (DISCLAIMER: I currently own a Dell. I am not a professional tech writer and d
Re: (Score:1)
Your comment about the Alienware laptops isn't entirely true... my m7700 is a Clevo with a different top on it and uses MXM PCI-e cards. Buying them through Alienware is absolutely crazy, however. Before I voided my warranty by flashing my own BIOS with one from their support site (go figure), I had them send me a replacement card as mine had gone on the fritz. The price for my outdated card (7800GTX Go) was absurd (in the area of $1200USD for just
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
XvMC (Score:2)