
Laptops With Certain NVidia Chips Failing

Eukariote writes "An estimated 18 million laptops with NVidia G84 and G86 graphics chips sold in the past one and a half years are experiencing high failure rates. Various laptop models from multiple manufacturers (Apple, Dell, HP, Lenovo, and others) are affected. NVidia blames it on bad chip packaging causing thermal failure. BIOS updates that turn the laptop fan on more frequently or permanently have been released by Dell and HP. The cynical interpretation is that this is likely to only delay the problem until the warranty has expired."
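(For readers wondering whether their own machine actually runs cooler after one of those BIOS updates, a minimal temperature logger is easy to put together. The sketch below is illustrative only: it assumes a Linux laptop whose graphics driver exposes its sensors through the kernel hwmon interface, for NVidia chips typically the nouveau driver, and the 80 C warning threshold is an arbitrary example rather than a vendor figure.)

    # Minimal GPU/board temperature logger (sketch). Assumes a Linux laptop whose
    # driver exposes its sensors via /sys/class/hwmon (for NVidia chips that
    # typically means nouveau). The 80 C threshold is an arbitrary example.
    import glob
    import time

    WARN_C = 80
    INTERVAL_S = 30

    def read_temps():
        temps = {}
        for path in glob.glob("/sys/class/hwmon/hwmon*/temp*_input"):
            try:
                with open(path) as f:
                    temps[path] = int(f.read().strip()) / 1000.0  # millidegrees -> C
            except (IOError, OSError, ValueError):
                pass
        return temps

    while True:
        for path, celsius in sorted(read_temps().items()):
            flag = "  <-- running hot" if celsius >= WARN_C else ""
            print("%s %s: %.1f C%s" % (time.strftime("%H:%M:%S"), path, celsius, flag))
        time.sleep(INTERVAL_S)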
  • by DogDude ( 805747 ) on Thursday July 31, 2008 @11:06PM (#24427725)
    Having to have my laptop fan on all of the time to account for a bad chip is an unacceptable fix. It's loud, it takes more electricity to run, and it shortens the life of the fan, and possibly of the whole computer as a result.
  • Today's fun fact (Score:5, Insightful)

    by mu11ing1t0ver ( 1175051 ) on Thursday July 31, 2008 @11:09PM (#24427753)
    "The power of accurate observation is frequently called cynicism by those who don't have it." - George Bernard Shaw
  • So, is it not fair (Score:2, Insightful)

    by hubdawg ( 1148477 ) on Thursday July 31, 2008 @11:18PM (#24427831)
    to say that possibly the manufacturers packaged and shipped these chips with inadequate cooling? The best chip from any manufacturer is susceptible to heat failure. Why is it all Nvidia's fault? It seems to me it should be a shared responsibility. They need to come up with a viable solution and compensate the people who may be affected.
  • by Manip ( 656104 ) on Thursday July 31, 2008 @11:38PM (#24427989)

    But it is Nvidia's fault because they signed off on these cooling units.

    That is like saying it isn't your car maker's fault if they put brakes designed for a lawnmower in your car, and that it is instead the fault of the people making those lawnmower brakes for not making sure they can stop a much heavier car...

    From what I'm reading, the issue isn't fans failing to perform as expected. The issue is that at the speeds Nvidia had them running at, they simply didn't do the job needed, resulting in the GPU overheating and destroying itself.

    It is entirely, 100% Nvidia's fault. If you put in substandard parts you get a substandard result.

  • by Brain_Recall ( 868040 ) <brain_recall&yahoo,com> on Thursday July 31, 2008 @11:46PM (#24428053)
    No, Xbox 360s use an ATi chip.

    Although RoHS probably contributed to the RRoD, mostly it was an improper thermal solution. There was an article a while back where it was discovered that Microsoft engineers decided to cut costs by designing the heatsink system themselves. Insufficient cooling and an improper mounting system allowed the board to warp more than the RoHS solder could handle. Newer 360s have lots of extra epoxy around the package to keep it from pulling too far away from the motherboard.

  • by Chuck Chunder ( 21021 ) on Thursday July 31, 2008 @11:49PM (#24428079) Journal

    Sounds like you're drawing a long bow to me.

    The problem here sounds like it's inside the chips themselves.

    I'm no metallurgist or hardware expert, but I'd have thought solder is used when mounting the chips to the board, not inside the chips themselves.

  • by MachDelta ( 704883 ) on Thursday July 31, 2008 @11:52PM (#24428115)
    Agreed. Most reference coolers (and even a lot of 3rd party ones) aren't worth the cheap plastic used to make them. When I pulled the ref cooler off my 8800GT last year I was shocked to find that the fan didn't even sit completely atop the core, and that there was a LOT of excess thermal paste and stupidly thick thermal pads. It's little surprise the card was heat-soaking to 90C after a few hours of Bioshock and crashing itself! I can only cringe in horror when I imagine something like that stuffed into a freaking laptop. Fortunately I had already planned on replacing the stock cooler (just a big heatpipe/heatsink with a 120mm fan ziptied to it) and lo and behold my card now has trouble hitting the low 40s even after hours of flogging.

    Long story short, all manufacturers should be held accountable for the idiotic shortcuts they take when it comes to cooling their electronics. It's kind of an important aspect of electronics, no? Why not spend a buck or two more on something that actually does the job? Till then, the first thing I do with any graphics card (or CPU for that matter) is still going to be to chuck the stock cooler into my parts bin and look for something bigger or better.
  • by fermion ( 181285 ) on Friday August 01, 2008 @12:07AM (#24428243) Homepage Journal
    The lack of lead in solder is a technological issue, and as such is solved by more advanced technology. Surely there are few people here who are opposed to higher technology?

    Sure, we can whine about the extra work we are forced to do, or the fact that we have to pay for higher technology, but what good does that do? As technologically savvy people we live for the chance to advance the technology. We see these opportunities all over the place. Smaller cars require innovative means to increase safety and power. Smaller computers require more power-efficient components and better batteries. Having one type of plastic go away just opens up space for innovative new plastics. This is what makes the world exciting.

    So, if some company can't keep up, then they just suck as technologists and need to go away. A car company can't make technologically advanced cars? Screw them. A video card manufacturer can't keep up with the trends and make a reliable video card? Screw them too. I have been involved in a number of situations where the process had to be rethought. Someone whines that a baby might be born with a defect and we can't use this chemical. Someone complains that the dust will give them cancer and we must use a hood. Someone complains that we can't reliably dispose of an agent, and we must switch agents. Sure, we could say who cares if some worker dies. So what? But in each case the change was made, and technology gave us an equal or better solution.

    It is always easier to blame failure on external forces than to take responsibility for a personal lack of creativity. This change in solder is not the first scapegoat used by those who lack innovative solutions, and it won't be the last. There will always be firms that say a problem can't be solved, and they will generally be overthrown by those who then find the solution. I think any number of lazy American firms are discovering that right now, while others are riding the wave of can-do innovation.

  • by Anonymous Coward on Friday August 01, 2008 @12:09AM (#24428253)

    Note that NVIDIA's customers are the OEMs putting chips in laptops and desktops, not the users buying systems. Users will have to deal with the computer manufacturers.

    Of course, Dell, HP, and company will understandably try to minimize their expenditure on warranty repairs.

  • by QuoteMstr ( 55051 ) <dan.colascione@gmail.com> on Friday August 01, 2008 @12:42AM (#24428495)

    Your comment assumes that higher technology is always better.

    Sometimes what you need is a hammer, not a jackhammer. I'm not convinced the massive failures all over the place that result from using lead-free solder are worth the incremental environmental benefit.

  • by nawcom ( 941663 ) on Friday August 01, 2008 @01:03AM (#24428635) Homepage

    Unfortunately, people think there's a difference between a MacBook logic board (Intel, *coughs*) and an Intel motherboard. Though I'm a fan of OS X, Apple needs to give up on putting their Apple logo stickers over the original 3rd-party vendors' hardware. It's a fucking PC/laptop with EFI.

  • by MRAB54 ( 1109973 ) on Friday August 01, 2008 @01:08AM (#24428667)

    Exactly. An overclocked PC can run for days on end, completely stable, in a room at ~75 degrees F. But if you put that desktop in an oven and get the air temp up around 150 F, something is gonna burn up. It really should be the OEM's responsibility to say, "Hey, your card gives out more heat than our laptop design can dissipate. We can't deploy these."
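    (The arithmetic behind that is simple: junction temperature is roughly ambient temperature plus dissipated power times the junction-to-ambient thermal resistance, so every extra degree of case air shows up directly at the die. The numbers in the sketch below are purely illustrative assumptions, not NVidia specs.)

        # Back-of-the-envelope junction temperature: T_j = T_ambient + P * theta_ja
        # All numbers are illustrative assumptions, not NVidia specifications.
        power_w = 20.0    # assumed GPU power dissipation, watts
        theta_ja = 2.5    # assumed junction-to-ambient thermal resistance, C/W

        for ambient_c in (24, 45, 66):  # roughly 75 F room, warm chassis air, 150 F "oven"
            t_junction = ambient_c + power_w * theta_ja
            print("ambient %3d C -> junction ~%.0f C" % (ambient_c, t_junction))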

  • by Anonymous Coward on Friday August 01, 2008 @02:04AM (#24429009)

    I'm sitting on a Lenovo laptop with an NVidia chip, and the fan doesn't come on under normal conditions until the laptop is what I would consider too hot (~65 Celsius). It's very quiet too.
    These laptops need their fan to have a performance mode. While I'm playing WoW, the bottom of the computer has literally burned my leg.
    I'm about to go buy a laptop cooling pad just to help lengthen the life of my laptop.

    While gaming, and using a program to force the fan on max, I sit around 70-74 Celsius.
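    (On Linux ThinkPads, the kind of "program to force the fan on max" mentioned above can be a few lines of script driving the thinkpad_acpi interface. The following is only a rough sketch under stated assumptions: the module was loaded with fan_control=1, the script runs as root, and the 65 C / 55 C thresholds are example comfort limits, not anything NVidia or Lenovo recommend.)

        # Force a ThinkPad fan to full speed when the hottest sensor crosses a
        # threshold, and hand control back to the firmware when it cools down.
        # Assumes Linux with "modprobe thinkpad_acpi fan_control=1" and root.
        import time

        FAN = "/proc/acpi/ibm/fan"
        THERMAL = "/proc/acpi/ibm/thermal"
        HOT_C, COOL_C = 65, 55            # illustrative thresholds

        def hottest_sensor():
            # /proc/acpi/ibm/thermal looks like "temperatures: 47 42 39 ...";
            # unused sensors report -128, so ignore those.
            with open(THERMAL) as f:
                values = [int(v) for v in f.read().split()[1:]]
            return max(v for v in values if v > -100)

        def set_fan(level):
            with open(FAN, "w") as f:
                f.write("level %s\n" % level)   # e.g. "level 7" or "level auto"

        forced = False
        while True:
            temp = hottest_sensor()
            if temp >= HOT_C and not forced:
                set_fan("7")
                forced = True
            elif temp <= COOL_C and forced:
                set_fan("auto")
                forced = False
            time.sleep(10)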

  • by mrsteveman1 ( 1010381 ) on Friday August 01, 2008 @02:29AM (#24429131)

    Logic board at least makes sense. Motherboard sounds ridiculous.

  • by mysidia ( 191772 ) on Friday August 01, 2008 @02:31AM (#24429143)

    If you choose to take the cynical interpretation, why not ignore the update and hope it fails within the warranty period? Of course, if you do that and it fails not long after the warranty expires, then you'll have only yourself to blame for being a cynical bastard.

    You could also do the opposite: leave your machine on often.

    And stress the GPU for abnormally long periods of time (i.e., leave a hardware-accelerated graphics process running non-stop in order to max out the GPU for a few thousand hours).

    Presumably, if your GPU is faulty, it should fail during the extreme system stress testing.

    And if it survives you can apply the BIOS patch afterwards if you see fit, confident that the GPU is fine.
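    (A rough sketch of that stress-then-verify idea, under loud assumptions: a Linux machine with the proprietary NVidia driver and its nvidia-settings tool on the PATH, and glxgears standing in as a placeholder load; a real burn-in would use a far heavier 3D workload and run much longer than the 12 hours shown.)

        # Burn-in sketch: run a GPU-heavy placeholder workload for a while and
        # record the peak core temperature reported by the driver.
        # Assumes Linux, the proprietary NVidia driver, nvidia-settings on PATH,
        # and glxgears purely as a stand-in load.
        import subprocess
        import time

        HOURS = 12        # illustrative; the comment above suggests far longer
        POLL_S = 60

        def gpu_core_temp():
            # "nvidia-settings -q GPUCoreTemp -t" prints just the value(s).
            out = subprocess.check_output(
                ["nvidia-settings", "-q", "GPUCoreTemp", "-t"])
            return int(out.decode().strip().splitlines()[0])

        load = subprocess.Popen(["glxgears"])     # placeholder GPU load
        peak = 0
        deadline = time.time() + HOURS * 3600
        try:
            while time.time() < deadline:
                peak = max(peak, gpu_core_temp())
                time.sleep(POLL_S)
        finally:
            load.terminate()
        print("Peak GPU core temperature over %d h: %d C" % (HOURS, peak))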

  • Re:Model numbers (Score:2, Insightful)

    by dinisio ( 760164 ) on Friday August 01, 2008 @05:19AM (#24429913)

    Not all Dell XPS M1330s come with an NVidia chip.
    I own one, and mine has an Intel GM965/GL960.
    You had to pay $100 more for the pleasure of having a burning NVidia chip ;)

  • by vegiVamp ( 518171 ) on Friday August 01, 2008 @06:28AM (#24430249) Homepage

    Yes, because serial version numbering is so very very confusing to the average luser.

  • by v1 ( 525388 ) on Friday August 01, 2008 @07:34AM (#24430609) Homepage Journal

    The key, of course, is not having to click 5-6 random alerts, notices, and popups before managing to get an application launched.

    Attention! Your mouse has moved! Would you like to go to the Security Center to see if someone has owned your computer?

    You have just clicked Cancel. Cancel or Allow?

  • by TheDarkMaster ( 1292526 ) on Friday August 01, 2008 @08:46AM (#24431433)
    I agree. Maybe Charlie blames nVidia too much, but does that mean he has to be wrong about this case? I guess not. You, me, anyone might not like the messenger, but that doesn't automatically make the message a lie.
  • by Hairy Heron ( 1296923 ) on Friday August 01, 2008 @02:52PM (#24438175)

    "massive failures all over the place that result from using lead-free solder"

    Citation please.

"The one charm of marriage is that it makes a life of deception a neccessity." - Oscar Wilde

Working...