
Nvidia build quality?

  • 04-12-2010 8:23pm
    #1
    Registered Users Posts: 1,146 ✭✭✭aaronm13


    I have a 9800GX2 which has been baked once and has just gone on me again after about 6 months. Can't be bothered going through that again for another few months of use, so I reckon I'm going to treat myself to a new card. I swore I would never go Nvidia again after everything I read about baking the cards being a common problem, but just wondering: is this still a problem with their newer cards? I've been looking around and the GTX 470 is getting great reviews, so I'm very tempted, but I don't want to be baking this one again down the line as I only got about 18 months out of my 9800GX2. Are Nvidia drivers still very dodgy, especially for this card? Or would I be better off going down the ATI route with something from the HD 6800 series or an HD 5870, which are all in the same price range?


Comments

  • Moderators, Technology & Internet Moderators Posts: 18,381 Mod ✭✭✭✭Solitaire


    HD5870 is the value king right now if you have enough money and PSU capacity (although I expect it would take less to run than a 9800GX2 anyway! :D). As for the underfill issue... I'm really not sure :o It may still be present in GF100 (GTX470) but might have been finally fixed in GF104 (GTX460) as it was the GF100 that finally showed nVidia how bad the issue actually was - at 40nm it went from a "slowly dying card" problem to a "silicon not even being usable at time of manufacture" one :eek:


  • Registered Users Posts: 1,146 ✭✭✭aaronm13


    Thanks Solitaire, I have a 1000W PSU so I'm covered there. I actually found the GTX 470 SOC a bit cheaper than the HD5870, so that's why I'm veering towards the Nvidia card, but I've never owned an ATi card, hence my hesitation. I just hope Nvidia (granted, it was a BFG make) have got their act together, because when I took the GX2 apart the thermal paste job on it was a joke and I knew baking it was only a short-term fix.


  • Registered Users Posts: 86,729 ✭✭✭✭Overheal


    Hmm... read Sol's GTX 460 thread. I can only imagine what surprises people will uncover once they've owned their 470s for long enough. Remember that it took almost a year to realise what was happening to my 8400. I say go for the AMD HD5970.


  • Moderators, Technology & Internet Moderators Posts: 18,381 Mod ✭✭✭✭Solitaire


    Yeah, it started with Apple. Because most PCs have Intel or VIA/Mirage integrated graphics, and most high-profile nVidia solutions for PCs are blingin' gaming cards, the issues with nVidia were blamed on "them pesky kids" overclocking cards too much and swept under the carpet. Did you know that until recently nVidia GPUs couldn't be run at over 70°C? Because at 80°C the underfill (the glue-like goop you can see around the edges of the die) turns to jelly. Now, what did every nVidia GPU from that generation do? :pac: Now add Power Saving features aggressively switching mobile GPUs on and off to conserve power, making them heat up, cool down, heat up again etc., until the repeated expansion cycles caused the solder joints (bumps), weakened by the lack of support from the jellified underfill, to bust. (There's a quick temperature-watch sketch at the end of this post if you want to keep an eye on your own card.)

    Then every G5 (IIRC) Apple started to fail, especially the poorly-cooled mobile chipsets inside the MacBooks, as the vast majority of graphics adaptors in Apple products at the time were nVidia. This led to the discovery of the underfill problem in Macs; only then did someone take a look at all nVidia graphics solutions and find that any such chip could fail the same way regardless of platform :eek:

    Then a lot of laptops - especially HPs - started to fail. nVidia screamed long and loud that this was because TSMC were a useless bunch of gob****es (even though it was only nVidia silicon that was dying), that HP and Apple were a useless bunch of gob****es (even though it was only nVidia-equipped units dying) and that the users were a bunch of- yeah, you get the picture. On another note, guess who has lost an awful lot of contracts lately? OEMs don't like having their rep smeared by an incompetent hyena, nVidia. And if you wondered why the early Fermis were dying clean off the wafer due to the high urine content... maybe it wasn't clever pulling that smear job on TSMC either :o:pac:

    Eventually desktop solutions died too from similar issues (less aggressive power saving, thus fewer heat cycles). These parts are the late 7000-series, the entire 8000-series and the earlier 9000-series chips, and the trick of causing the bumps' solder to reflow via a quick trip to somewhere very warm gave rise to the iconic Oven Trick.

    At this point we get to the really funny stuff. nVidia sorted the underfill issue by using a firmer, high-temperature type. But this stuff can destroy the GPU through vibrational stresses unless additional protection is added. Here is a gruesomely amusing page showing the vital part that a whole load of recent nVidia GPUs needed for survival yet never got added to the die :eek: Oops... These things couldn't really be sorted with the Oven Trick as the dies were more likely to shatter than bust their bumps :eek: The good news is that nVidia did sort out the problem afterwards, and the majority of the affected parts were binned and quietly replaced before many got out into the retail channel.

    The only real problem with Fermi is that it's a joke and made by a company with all the moral fibre of a jellyfish :P
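
    If you want to keep an eye on those temperatures yourself, here's a rough sketch (Python, assuming a driver recent enough to ship nvidia-smi with --query-gpu support) that just polls the core temperature and flags anything at or above the ~70°C comfort limit mentioned above. The threshold is the number from this thread, not anything from an official spec sheet, so treat it as illustrative rather than gospel.

        # Rough temperature watch: polls nvidia-smi and warns when a GPU reaches
        # the ~70 C comfort limit discussed in this post. The 70 C figure comes
        # from the thread, not from any official nVidia spec.
        import subprocess
        import time

        WARN_AT_C = 70      # comfort limit mentioned above
        POLL_SECONDS = 30   # how often to check

        def gpu_temps_c():
            """Return the current core temperature (Celsius) for each detected GPU."""
            out = subprocess.run(
                ["nvidia-smi", "--query-gpu=temperature.gpu",
                 "--format=csv,noheader,nounits"],
                capture_output=True, text=True, check=True,
            ).stdout
            return [int(line) for line in out.splitlines() if line.strip()]

        if __name__ == "__main__":
            while True:
                for idx, temp in enumerate(gpu_temps_c()):
                    status = "over the comfort limit!" if temp >= WARN_AT_C else "fine"
                    print(f"GPU {idx}: {temp} C - {status}")
                time.sleep(POLL_SECONDS)

    If your card is regularly sitting above that line under load, better cooling (or a different card) is probably the real fix; a script can only tell you it's happening.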


  • Registered Users Posts: 86,729 ✭✭✭✭Overheal


    And somehow ATI is the one that got its brand absorbed :D but last I heard, yes, Apple no longer deals in nVidia chipsets.


  • Moderators, Technology & Internet Moderators Posts: 18,381 Mod ✭✭✭✭Solitaire


    ATi did it voluntarily. It knew it was going to absolutely haemorrhage money thanks to its own Fermi, the HD2000 (which, as we all know, came good in the end, although it took two die shrinks to get there!), and it knew that integrated CGPU chips and eventually System-on-Chip processors were ultimately the future even in the desktop space. Why do you think nVidia started smearing their ex-best-friend Intel and trying to create GPGPU products? ;) A brand new product is often prototyped up to two years before it hits the shelves, and for up to three years before that it's being carefully planned. AMD were the first to see that the mainstream future was CPU and GPU dies on the same chip, so they stole a march on Intel, forcing both Intel and nVidia to try and create their own despite each lacking half the requisite experience.

