Killer Mobile Graphics — NVIDIA's GeForce 8800M

MojoKid writes "Today NVIDIA unveiled the much-anticipated GeForce 8800M series of mobile graphics processors. The GeForce 8800M is powered by the new G92M GPU which is built on a 65nm manufacturing process and shares a lineage with the desktop-bound G92 GPU on which NVIDIA built their GeForce 8800 GT. The 8800M series will come in two flavors, a GTX and a GTS, with different configurations of stream processors, 64 for the GTS model and 96 for the high-end GTX."
  • Nice 3D graphics are great, but if I'm buying a mobile PC I'm more concerned about battery life. The 8800M article mentions "PowerMizer 7.0 Technology," but the lack of numbers backing it up is concerning.

    Although to be fair they're probably targeting 17" "laptops" anyway, and at that size I guess it's fair to say battery life shouldn't be a primary concern (heck if you're carrying it around I'd be more worried about getting a hernia).
    • by gerf ( 532474 ) on Tuesday November 20, 2007 @01:01AM (#21417131) Journal

      Well, it's supposedly more mizerly (?) than the 512MB 8600M GT, which I have. With a 1.6 GHz C2D, 2GB RAM, XP Pro, and an 85 WHr battery, I can get over 5 hours of battery life, which I think is dang good considering that I can play games quite well. That of course is not while gaming, only while web browsing and using IM.

      Of course, this will vary from laptop to laptop, YMMV.

    • by futuresheep ( 531366 ) on Tuesday November 20, 2007 @01:07AM (#21417163) Journal
      If you're buying a laptop because it has this graphics chip, battery life is secondary to frames per second. The people that are buying these laptops buy them because they're suited for playing games anywhere while plugged in, not traveling and off site work.
      • If you're buying a laptop because it has this graphics chip, battery life is secondary to frames per second. The people that are buying these laptops buy them because they're suited for playing games anywhere while plugged in, not traveling and off site work.
        Also lots of people buy laptops as desktop replacements these days, just because they take up less space.
      • If you're buying a laptop because it has this graphics chip, battery life is secondary to frames per second. The people that are buying these laptops buy them because they're suited for playing games anywhere while plugged in, not traveling and off site work.

        I'd like to second this. I thought I would be chiefly concerned about battery life, so I paid extra for the 12-cell battery, which, when I undervolted my CPU, gave me 6.5 hours of laptop time.

        Later I realized it's only about once a year that I actually make use of that time. All other times I have an outlet nearby.

        Note that the weight of the 12-cell is roughly the weight of the 6-cell plus the charger.

        And I was stuck with a laptop that played WoW on the lowest settings at 8 fps.

        Now I would go for a compact-ish laptop (and try to keep it light).

    • by Zymergy ( 803632 ) * on Tuesday November 20, 2007 @01:15AM (#21417191)
      PowerMizer Mobile Technology page: http://www.nvidia.com/object/feature_powermizer.html [nvidia.com]

      Maybe the NVIDIA Technical Brief will yield some answers: http://www.nvidia.com/object/IO_26269.html [nvidia.com] (Warning, spawns a PDF)

      PowerMizer 7.0 Power Management Techniques:
      Use of leading edge chip process
      CPU load balancing
      Intelligent GPU utilization management
      Revolutionary performance-per-watt design
      PCI Express power management
      Aggressive clock scaling
      Dedicated power management circuits
      Display brightness management
      Adaptive performance algorithms

      CPU Offload Example (from NVIDIA's Technical Brief)
      Figures 3 and 4 (see PDF) show CPU utilization when running a Blu-ray H.264 HD movie using the CPU and GPU, respectively. You can see that with GPU video playback, about 30% fewer CPU cycles are used. This dramatic reduction in CPU usage means less power is consumed by the processor, so system power consumption is reduced, resulting in longer battery life.
      Note: Testing was conducted on an Intel Centrino based platform with 2 GHz Core2 Duo processor, and a GeForce 8600M GS, running Intervideo WinDVD8 playing a Casino Royale H.264 Blu-ray disc.
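
      To get a feel for what that roughly 30% drop in CPU utilization could mean for battery life, here is a rough back-of-the-envelope sketch in Python. Every wattage and capacity figure in it is an illustrative assumption, not a measurement; the only input taken from the brief is the ~30% reduction in CPU load.

        # Back-of-the-envelope battery-life estimate for GPU-offloaded playback.
        # All wattages and the battery capacity are assumed values for illustration;
        # only the ~30% CPU-utilization drop comes from NVIDIA's brief.

        battery_wh = 85.0         # assumed large-notebook battery capacity (Wh)
        base_system_w = 20.0      # assumed draw of display, chipset, disk, etc.
        cpu_decode_w = 18.0       # assumed CPU power during software H.264 decode
        cpu_util_drop = 0.30      # ~30% lower CPU utilization with GPU decode
        gpu_decode_extra_w = 3.0  # assumed extra draw of the GPU's decode block

        sw_decode_w = base_system_w + cpu_decode_w
        hw_decode_w = base_system_w + cpu_decode_w * (1 - cpu_util_drop) + gpu_decode_extra_w

        print(f"software decode:     {battery_wh / sw_decode_w:.1f} h")  # ~2.2 h
        print(f"GPU-assisted decode: {battery_wh / hw_decode_w:.1f} h")  # ~2.4 h

      The point is only that the gain scales with how large a slice of total system power the CPU's decode work actually is; the less the rest of the system draws, the more those saved CPU cycles matter.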
      • I can agree that power usage by the CPU is cut since its utilization is not as high. But since the GPU is doing all this decoding, power usage by the GPU will increase. If the GPU is equal in decoding efficiency to the CPU, then power consumption is a wash and there are no power savings.

        The real benefit here is the ability to use the slowest, lowest-powered chip you can find in a media center and still be able to decode HD.
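
        Put as a simple energy comparison (with made-up numbers, purely to illustrate the point above): offload only helps if the GPU's decode block needs fewer joules per frame than the CPU cycles it replaces.

          # Hypothetical joules-per-frame comparison; none of these numbers are measured.
          cpu_j_per_frame = 0.50          # assumed cost of decoding one frame on the CPU
          gpu_equal_j_per_frame = 0.50    # GPU exactly as efficient as the CPU -> a wash
          gpu_fixedfn_j_per_frame = 0.15  # dedicated decode hardware more efficient -> real savings

          fps = 24
          for label, j in [("CPU only", cpu_j_per_frame),
                           ("GPU, equal efficiency", gpu_equal_j_per_frame),
                           ("GPU, dedicated decode logic", gpu_fixedfn_j_per_frame)]:
              print(f"{label}: {j * fps:.1f} W spent on decoding")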
    • Re: (Score:1, Interesting)

      by Hektor_Troy ( 262592 )

      Nice 3D graphics are great, but if I'm buying a mobile PC I'm more concerned about my battery life.

      Then why even bother reading articles about high-end graphics cards? Were you hoping that somehow nVidia managed to build in magical transistors that use less power the more of them there are? Everyone knows (or should know) that high-end graphics cards use a lot more energy than low-end ones. Partly because they run at much higher speeds, partly because they have a lot more transistors. This one in particul

      • Re: (Score:3, Informative)

        by m94mni ( 541438 )
        Actually, one important part of newer PowerMizer designs (>3.0 maybe) is that parts of the GPU are *turned off* when not in use. Other parts run on decreased voltage.

        That effectively decreases the number of active processors and of course saves a *lot* of watts.
        • by PopeRatzo ( 965947 ) * on Tuesday November 20, 2007 @07:56AM (#21419021) Journal

          Actually, one important part of newer PowerMizer designs (>3.0 maybe) is that parts of the GPU are *turned off* when not in use. Other parts run on decreased voltage.
          Sounds good. Why don't they put one of these into a desktop for me? My machine is used for digital music and video editing and post-production. It would be nice to have a powerful video card that doesn't need too much noisy refrigeration. As it is, I've got a constant fight with heat and noise. I've built a quiet enclosure, but it gets warm in there. I've tried liquid cooling, but it's also a little noisy. Right now, I've got liquid cooling inside the enclosure, and cables as long as practical. I'm planning a closet for my gear, but I've got to wait until my kid finishes school and moves out first. I'd rather live with the noise and keep her around.

          Actually, as time goes on I do less and less live recording, and when I do, I just use one of these extra-cool portable digital recorders and go somewhere quiet. When you work in a home studio, you make adjustments.

          But I still perk up whenever I hear about "decreased voltage". I'm all for saving watts.
          • From http://www.anandtech.com/video/showdoc.aspx?i=3151&p=2 [anandtech.com] :

            As for PowerPlay, which is usually found in mobile GPUs, AMD has opted to include broader power management support in their desktop GPUs as well. While they aren't able to wholly turn off parts of the chip, clock gating is used, as well as dynamic adjustment of core and memory clock speeds and voltages. The command buffer is monitored to determine when power saving features need to be applied. This means that when applications need the power of the GPU it will run at full speed, but when less is going on (or even when something is CPU limited) we should see power, noise, and heat characteristics improve. One of the cool side effects of PowerPlay is that clock speeds are no longer determined by application state. On previous hardware, 3D clock speeds were only enabled when a fullscreen 3D application started. This means that GPU computing software (like folding@home) was only run at 2D clock speeds. Since these programs will no doubt fill the command queue, they will get full performance from the GPU now.

            Hopefully, Nvidia will follow AMD's lead.
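
            As a purely illustrative sketch of the mechanism described in the quote above (not any real driver API), the decision logic amounts to watching how full the command buffer is and picking clocks and voltage to match:

              # Toy model of PowerPlay/PowerMizer-style clock scaling. The levels and
              # thresholds are invented for illustration; real drivers expose nothing
              # like this interface.
              PERF_LEVELS = [
                  (200,  400, 0.90),   # core MHz, memory MHz, voltage: idle / 2D
                  (400,  700, 1.00),   # light 3D or video decode
                  (600, 1000, 1.10),   # full 3D
              ]

              def pick_level(cmd_buffer_occupancy: float):
                  """Pick a performance level from command-buffer occupancy (0.0-1.0)."""
                  if cmd_buffer_occupancy < 0.2:
                      return PERF_LEVELS[0]
                  if cmd_buffer_occupancy < 0.7:
                      return PERF_LEVELS[1]
                  return PERF_LEVELS[2]

              # A GPGPU job such as folding@home keeps the command buffer busy, so it
              # gets full clocks even though no fullscreen 3D application is running.
              for occ in (0.05, 0.50, 0.95):
                  core, mem, volt = pick_level(occ)
                  print(f"occupancy {occ:.0%}: {core} MHz core / {mem} MHz mem @ {volt:.2f} V")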

          • by Kamots ( 321174 )
            "It would be nice to have a powerful video card that doesn't need too much noisy refrigeration."

            Powerful video card? Try an 8800 GT. Want it quiet? Try swapping out the cooler for an Accelero S1 passive heatsink. It will run significantly cooler than the stock active cooling solution... and if you want to get fancy (and really drop your temps), attach a low-speed 120mm (i.e., nice and quiet) fan to it.

            If you pay attention to noise when designing your system, you should be fine. I've got a completely
            • I don't know about Linux, but apparently Nero has a DriveSpeed application that will allow you to limit the speed of your CD-ROM drives. Just running them at 16x or 32x instead of 48x or whatever will significantly reduce your optical drive noise. I just tend to rip CD and DVD images to the hard drive, so I don't worry about it one way or the other ;)
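
              For what it's worth, on Linux the usual trick is the eject utility from util-linux, which can ask a drive to cap its read speed. A minimal sketch, assuming the drive shows up as /dev/cdrom and actually honours the request:

                # Cap an optical drive's read speed on Linux to cut spin noise.
                # Assumes util-linux's eject is installed and /dev/cdrom is the drive;
                # not every drive honours the requested speed.
                import subprocess

                def set_cd_speed(device: str = "/dev/cdrom", speed: int = 16) -> None:
                    """Ask the drive to limit itself to the given multiple of 1x."""
                    subprocess.run(["eject", "-x", str(speed), device], check=True)

                set_cd_speed(speed=16)  # 16x is far quieter than letting it spin at 48x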
              • just tend to rip CD and DVD images to the hard drive, so I don't worry about it one way or the other ;)
                Right. I wouldn't run a CD or DVD drive when I was trying to record audio or mix. They're way too noisy. More and more since I started using Alcohol and Daemon Tools, the only time I use a CD or DVD is when I first take it out of the box. After that, it's on my shelf.
    • I love my 17" notebook, but I agree that it is a bit large and power-hungry. I only carry it around when I know I will use it, and for the off-days I always have my trusty HTC Kaiser. Now as far as graphics are concerned, I am a bit jealous of these new cards. I hope to hear news of the 8800M being offered as an MXM card so that I could (potentially) swap out my Go7600 for an 8800M. Come on NVIDIA, that's the whole point of having modular graphics cards!
    • I see no reason whatsoever why this card should draw significantly more battery power in plain old 2D mode, for normal "on the move" usage, than "lesser" chips do. Perhaps it even draws less than some comparable chips, seeing as it's built on a 65nm process. And if you're playing 3D games, I really doubt battery life is your biggest concern.
  • by Zymergy ( 803632 ) * on Tuesday November 20, 2007 @12:58AM (#21417113)
    It appears Alienware will be using the GeForce 8800M GTX in their "m15x" and "m17x" models:
    http://www.alienware.com/intro_pages/m17x_m15x.aspx [alienware.com]
    NVIDIA GeForce 8800M Link: http://www.nvidia.com/object/geforce_8M.html [nvidia.com]
    • Re: (Score:3, Insightful)

      by snl2587 ( 1177409 )
      That is, of course, if you really want to pay for Alienware.
      • That is, of course, if you really want to pay for Alienware.

        I might, if I had money to burn on a gaming rig, and people were going to see it. They do have that 'look at me, I'm well off' sheen.
        • I might, if I had money to burn on a gaming rig, and people were going to see it. They do have that 'look at me, I'm well off' sheen.

          In much the same way Monster cables have that, "I have plenty of money and have no problems parting with it" sort of way. Which is a very strong attractor for certain kinds of chicks.
          • No worries, they'll be repelled by his geek aura easily enough.
          • by mgblst ( 80109 )

            In much the same way Monster cables have that, "I have plenty of money and have no problems parting with it" sort of way. Which is a very strong attractor for certain kinds of chicks.


            Which is kind of ironic, since those chicks will have no idea what a Monster cable or an Alienware laptop looks like. They generally go by how shiny your car is.
    • by dbcad7 ( 771464 )
      Well, those are nice drawings of a laptop... are they actually manufactured yet?
  • iMac (Score:3, Informative)

    by tsa ( 15680 ) on Tuesday November 20, 2007 @01:16AM (#21417195) Homepage
    Let's hope Apple puts this card in the next iMac instead of the crappy ATI they put in it now.
    • Unlikely. (Score:2, Interesting)

      Last I heard, Steve Jobs had some issues with nVidia, thus you get ATI for all new macs, end of story. Unless someone else comes along -- Intel, maybe?

      Could be completely unfounded rumor, so take with a grain of salt, but it does sound like the Apple we know and love [to hate].

      By the way -- this is why I love being a PC/Linux user. I can buy whatever hardware I want; I'm not bound by the moods of His Holy Turtleneckness. The disadvantage is, it has to be something with a Linux driver, but in a pinch, I can write my own.
      • Re: (Score:2, Informative)

        by grahamd0 ( 1129971 )

        My work machine is a brand new MacBook Pro. It's got an nVidia card in it. Like all MacBook Pros do. [apple.com] So does the Mac Pro. [apple.com]

      • Re:Unlikely. (Score:5, Informative)

        by Anonymous Coward on Tuesday November 20, 2007 @04:19AM (#21417921)
        Actually, you got it the wrong way around.

        Apple was going 100% ATI, but then ATI leaked to the press that they'd got the contract, and Jobs was furious. He really HATES secrets getting let out (I've no idea why; it seems to be industry standard practice. But if you ever happen to enter an NDA with Apple, you'd better honour it!)

        Anyway, Apple pulled the contract and shifted every Mac they could to nVidia. However, for some reason they didn't shift the iMac despite shifting everything else. I have a vague suspicion it is to force Apple developers to always code in a GPU-independent way (basically to keep nVidia honest), but as an iMac owner, it is very annoying.
      • Re: (Score:3, Informative)

        Intel graphics are already in the MacBook (non-Pro) and the Mac mini; the low end of Apple's product line hasn't had a dedicated GPU/VRAM setup since the PowerPC days. In fact, there are fewer product lines with ATI chips than with anything else, since ATI is now only present in the iMac line and as a BTO option on the Mac Pro.

        You have to go pretty far back in Apple's product line to find a point where there wasn't a pretty even mixture of video card combinations available.
        • there are fewer product lines with ATI chips than with anything else, since ATI is now only present in the iMac line and as a BTO option on the Mac Pro.
          Nvidia gets the MacBook Pro and the default choice for the Mac Pro -- 2 lines
          Intel gets the Mac mini and the MacBook -- 2 lines
          ATI gets the iMac, the Xserve, and a choice for the Mac Pro -- 3 lines

          So ATI wins on number of product lines.

          But in terms of total units shipped to Apple, I suspect Intel is the winner, followed by Nvidia.
      • Re: (Score:1, Insightful)

        by Anonymous Coward
        ...in a pinch, I can write my own.
         
        That's awesome! So... how many drivers have you written? Or how many drivers do Linux users write, on average?
         
        Users don't write drivers. The Macintosh presents a complete system, which Just Works without someone having to worry about what component is used and what is not used.
        • ...right up until it Just Doesn't.

          Like, say, when you plug in a peripheral that doesn't work.

          Dude, I know. I had a Powerbook. I know what the Mac Experience is, and I know why it's attractive.

          I also know that the second you want some non-standard hardware, or, really, non-standard anything, it's a crapshoot. And as a Linux user, when I say hardware support is a "crapshoot", it means really, really fucking bad.

          But as you may not realize, I actually am a software developer -- so if I really can't find anyone
      • by Calibax ( 151875 ) *
        I would guess this is unfounded. After all, nVidia graphics are in the latest MacBook Pro models as well as being the standard product in the Mac Pro. In fact, the Mac Pro series will support up to 4 graphics adapters, provided they are nVidia cards, as only a single ATI card is supported. nVidia also supplies workstation-class video cards for the Mac Pro.

        Apple isn't stupid. They remember what happened back in the late 1990s - their only graphics vendor was ATI, who was able to completely dictate what g
    • Re: (Score:2, Insightful)

      by Vskye ( 9079 )
      I'd agree here. Personally, I'd love a MacBook, but I will never buy one until it comes with at least 256MB of dedicated video RAM. Are ya reading this, Steve? And yes, Nvidia would be a better option.
      • No, Steve isn't reading this. Steve doesn't care about the six laptops he'd sell worldwide with that configuration.
      • by Gr8Apes ( 679165 )
        Buy a MacBook Pro. Refurbs are getting pretty darn inexpensive, especially one generation back.
    • Re: (Score:2, Informative)

      Well, since we're talking about laptops and mobile graphics, I feel the need to point out that my new MacBook Pro has an nVidia 8600 GT in it. Apple has provided nVidia chips in the MacBook Pro line for a few months now. You can get 128 MB or 256 MB, depending on whether you buy the 15" or 17" model.
      • by tsa ( 15680 )
        Really? That's strange. I have the older MBP, a 15" model with the ATI Radeon X1600 card, and 256 MB.
      • Re: (Score:3, Informative)

        by Vskye ( 9079 )

        Well, since we're talking about laptops and mobile graphics, I feel the need to point out that my new MacBook Pro has an nVidia 8600 GT in it. Apple has provided nVidia chips in the MacBook Pro line for a few months now. You can get 128 MB or 256 MB, depending on whether you buy the 15" or 17" model.

        Yep, that might be true... but to get 256MB of graphics memory you have to spend $2,499.00 US. That's just crazy. (MacBook Pro 15") I'm sorry, but I'll just get an iMac and purchase a cheap PC-based laptop, and

    • Won't happen. Apple never uses the top graphics cards; they always pick from the bottom of the range or a few generations back.

      For example, their current pro portable chip is the 8600M, not the fastest one they could have gone with, and they even underclocked it below its standard speed.
  • by Kris_J ( 10111 ) * on Tuesday November 20, 2007 @01:51AM (#21417329) Homepage Journal
    I need to upgrade an old PC that's built to be quiet, thus doesn't have a fan on the video card. Anyone know if these chips could be used to make a passively cooled desktop video card, and if they're likely to be?
    • by vipz ( 1179205 ) on Tuesday November 20, 2007 @02:25AM (#21417449)
      I believe the 8800GT on the desktop side of things uses the same G92 chip. Sparkle has already announced a passively cooled version of that: Press Release [sparkle.com.tw] Pictures of a passively cooled Gainward card have also been floating around the net.
    • Anyone know if these chips could be used to make a passively cooled desktop video card, and if they're likely to be?
      Dunno about these chips yet, but Asus has their Silent Magic line of passively cooled cards. Their EN8500GT SILENT MAGIC/HTP/512M [asus.co.nz] looks like a reasonable option.
    • Passively cooled desktop cards are not particularly uncommon (so long as you aren't wanting the top-of-the-line uber-gaming rig). For example, in the 30 seconds I spent looking at an online store I found this [centrecom.com.au] and this [centrecom.com.au], both of which are 8600 GT series cards. There didn't seem to be any 8800 series cards, but it wouldn't surprise me to see them later on, once even faster 8xxx series cards come out (8900? I dunno if it's in the works).
    • I have a fanless 7600GT by Gigabyte; it's totally silent, does not run hot, and works well in all games. It's smallish and does not require additional PCI-E power connectors. HTH
    • Passive cards are nothing new and you can get them with regular desktop GPUs. Just shop around a bit. I got one. Granted, I keep it extra cool with a very large (but slow-moving) fan installed on the case that blows on it, but that is just to be safe; it has enough metal strapped to it to cool it with normal airflow in your case.

      Wanting a laptop GPU in your desktop is just silly, unless you want to consume less power. Passive cooling has been around for a long time and exists for the CPU as well. Just

      • The biggest problem with a silent PC (no CPU/GPU fans at least) is the amount of noise that HDDs generate and the fact there's not much you can do about them. I've got a quiet case that has rubber grommets that the HDDs sit on, but it doesn't help with the whine. Admittedly I've got some pretty old drives in my case, so it would probably be a bit quieter if they were replaced. I'm looking forward to the day I can afford a decent-sized solid state drive to boot off and then have a NAS device off in a cupboard som
        • by Junta ( 36770 )
          My experience is that some of the newer drives are pretty decent on noise. But getting away from local disks isn't hard at all, even with pretty run-of-the-mill motherboards, at least not with Linux. You could set up a box with the gobs of storage somewhere, put some Linux on it (CentOS 5.1 might be a good choice) with an iSCSI software target (CentOS 5.1 ought to get that included, since RHEL 5.1 did; otherwise google for iet). CentOS/RHEL support a fairly normal install to an iSCSI software target with the softwar
        • Put in some new, fairly quiet drives, then put them in something like a Quiet drive enclosure [scythe-eu.com].
          Much better than simply putting the drives on rubber grommets.
          If you also combine this with a low-noise or passive-cooling psu, have all remaining fans rubber-mounted and get some noise absorbing material to put on the inside of your chassis, you'll have a nearly silent pc.
    • by ffflala ( 793437 )
      If you are willing to go with a slightly lesser 8600GTS model, here's a passively-cooled card that I put into my quiet machine last month. FWIW, I get decent frame rates on Oblivion (running on WINE) w/ high quality settings. I hesitate to post a product link/shill for a company, but then again you did ask.

      My plan is to wait until these drop below $100, then get another for a silent SLI configuration.

      MSI NX8600GTS-T2D256EZ HD GeForce 8600GTS 256MB 128-bit GDDR3 PCI Express x16 HDCP Ready SLI Supported Video
  • When are we going to see the PS3's RSX shrink from 90nm down to 65nm? I imagine it's already taped out and ready to go.
  • Wait... (Score:3, Funny)

    by Vampyre_Dark ( 630787 ) on Tuesday November 20, 2007 @05:51AM (#21418307)
    Wait ten days... 8800GTSE
    Wait ten days... 8800GTSE2 with pack in game
    Wait ten days... 8800GTSE OVERCLOCKED EXTREME EDITION
    Wait ten days... 8850LE
    Wait ten days... 8850GTS
    Wait six months... driver update makes 8850GTS 25% slower.
    Wait ten days... 9800GT
    Wait three months... driver update makes 8850GTS 25% slower.
    Wait ten days... 9850GTS
    Wait three months... driver update makes 9800GT 25% slower.

    This is the song that never ends!
    Yes it goes on and on my friends...
  • I have a Dell E1505; will this card work on my laptop?
    • lol, this is not a card. You can't "upgrade" your laptop - it is not a desktop.

      Geez. I think someone got lost on the intertubes here.
      • Re: (Score:2, Informative)

        Actually, these cards are replaceable. The module is very easily removed and uses a standard MXM PCI-E interface so that it can be changed out.

        Don't bring flames if you don't know what you are talking about.
      • Actually, some people can swap out video cards in laptops. I know that a Dell 9300 (it's a 17" model) can upgrade from an ATI X300 (middle of the road then) to an Nvidia 6800 Go (high end). Some found out that a 7800 GTX Go could fit in the 9300 too. Sadly, because of BIOS and physical layout issues, that is where it ended for the 9300, but Dell's high-end computers, unless they have integrated graphics, can usually swap video cards at least once.

        So, in conclusion, I think this will probably fit in the Del
      • by Khyber ( 864651 )
        Someone needs to work in a laptop repair depot, sometime.
  • Just wondering...I have an Alienware m7700 that could really use a new card to put some life into it. Does anyone know if there is a place to just buy these cards without having to go through Dell, Alienware or some other company of that sort? Any linkage would be appreciated.

    Yes, I was dumb enough to buy a lappy from Alienware...definitely catches attention in the airport, though.
    • Yes, but not for Dell/Alienware laptops.

      Dell & Alienware use a proprietary form factor which is not generally available elsewhere.

      I already paid 300% of what I should have on a GPU upgrade from Dell. Not only was the process thoroughly frustrating and overpriced, but my laptop just barely gets by with the approved Dell upgrade.

      I was looking in this thread for vendor recommendations, but sadly I don't see any. So here are mine: (DISCLAIMER: I currently own a Dell. I am not a professional tech writer and d
      • Thanks for the links, I will take a look when I get home.

        Your comment about the Alienware laptops isn't entirely true... my m7700 is a Clevo with a different top on it and uses MXM PCI-E cards. Buying them through Alienware is absolutely crazy, however. Before I voided my warranty by flashing my own BIOS with one from their support site (go figure), I had them send me a replacement card, as mine had gone on the fritz. The price for my outdated card (7800 GTX Go) was absurd (in the area of $1200 USD for just
    • Typically, and I find no reason that Alienware would be different, the graphics engine (GPU) is integrated with the motherboard on laptops. Good luck unsoldering it and soldering in a new one.
  • by bh_doc ( 930270 )
    When NVidia finally implements XvMC support in the 8 series [nvnews.net] I might start giving a shit about their products again. But I suspect that by the time that happens, the open source ATI drivers might be a real alternative. So I probably won't give a shit then, either. /angry at nvidia
