
Nvidia RTX Discussion


Comments

  • Registered Users, Registered Users 2 Posts: 7,682 ✭✭✭frozenfrozen


    [photo attachment: photo_2022-11-05_08-54-14-smaller.jpg]

    I didn't need an extreme bend at all to keep it clear of the side of the case. The perspective is a bit confusing there, but the cables aren't touching the side panel.



  • Registered Users, Registered Users 2 Posts: 14,011 ✭✭✭✭Cuddlesworth


    I see a lot of theories, but nobody's been able to recreate it so far in a lab setting, not for want of trying. I saw a few posts today of people checking native cables and finding connectors with melting plastic, so it's not an Nvidia adapter problem in itself, although the adapters have been the worst offenders in terms of damage.

    It's pretty obvious it's a heat problem between the pin and socket, so that means one of two things: too much power for the spec, or a bad connection caused by a manufacturing flaw or tolerance issue. Weirdest thing, though: the people with native cables say the equivalent socket and connector on the power supply end is fine on the same cables where the GPU end is melting. And those cables are allegedly interchangeable, i.e. it doesn't matter which end goes in which device. Nvidia had better hope it's a driver problem, because they have a history of not bothering with recalls on faulty products.
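
    A rough back-of-the-envelope sketch of the pin-and-socket heat theory, in Python. Every number here is an illustrative assumption rather than a measurement: a 600W draw, six current-carrying 12V pins, and made-up contact resistances for a good crimp versus a degraded one.

    ```python
    # Back-of-the-envelope: heat generated at a single pin contact.
    # All figures are assumptions for illustration, not measured values.

    TOTAL_POWER_W = 600.0    # assumed full load
    RAIL_VOLTAGE_V = 12.0
    POWER_PINS = 6           # current-carrying 12V pins sharing the load

    total_current = TOTAL_POWER_W / RAIL_VOLTAGE_V   # 50 A
    per_pin_current = total_current / POWER_PINS     # ~8.3 A per pin

    # Heat dissipated inside the contact itself: P = I^2 * R
    for label, r_contact in [("good contact, 5 mOhm", 0.005),
                             ("degraded contact, 50 mOhm", 0.050)]:
        heat_w = per_pin_current ** 2 * r_contact
        print(f"{label}: {heat_w:.2f} W dissipated in one pin")
    ```

    With those made-up resistances a degraded contact dissipates roughly 3.5W in a pin-sized spot versus about 0.35W for a good one, which fits the bad-connection branch of the theory.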



  • Registered Users, Registered Users 2 Posts: 4,185 ✭✭✭_CreeD_


    I'm presuming the new connector was developed and tested in relative isolation, where its tolerance for the 600W load is reasonable. But in this implementation it's sitting right beside a massive heat generator and its attached heatsink, with an airflow system blowing waste heat right over it. Between heat conduction across the circuit and convection from the cooler, that may be pushing the connectors past that tolerance (remember, it's not just the heat those mechanisms add directly; the hot air also reduces the rate at which the power junction/lead can passively cool itself). The fact that the same interchangeable cable remains fine on the PSU end points to an environmental issue on the GPU end.



  • Registered Users, Registered Users 2 Posts: 14,011 ✭✭✭✭Cuddlesworth


    Pretty sure that type of plastic melts above 250 degrees Celsius, so that doesn't seem likely. So far today I have seen people inspecting brand new cables and finding plastic inside the connectors (manufacturing flaw), connectors shorter than spec (manufacturing flaw), and cables not plugged in all the way (user/design flaw). Still nothing official from Nvidia at this point.



  • Registered Users, Registered Users 2 Posts: 4,185 ✭✭✭_CreeD_


    My point, though, was not that the card is generating enough heat on its own; neither is the cable. The heat from the card will affect the cable's ability to dissipate heat within its expected tolerance. If the threshold is already near, that can push it over the edge (something poor build quality will exacerbate). The temperature difference governs the rate of heat exchange: less difference, lower rate. Heat will also increase resistance, and if the card attempts to draw more power to compensate, that could also contribute.

    All conjecture for now.
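
    A minimal steady-state sketch of those two effects. Copper's ~0.393%/°C temperature coefficient is a standard figure; the per-pin current, contact resistances and convection coefficient are assumed purely for illustration.

    ```python
    # Two effects from the post above, with assumed numbers:
    # 1) Copper resistance rises with temperature:
    #      R(T) = R0 * (1 + alpha * (T - T0))
    # 2) Passive cooling is roughly proportional to the gap between
    #    the connector and the surrounding air (Newton's cooling), so
    #    at steady state:  I^2 * R(T) = k * (T - T_air)

    ALPHA = 0.00393     # copper temperature coefficient, per degC
    CURRENT_A = 8.3     # assumed per-pin current
    K_W_PER_C = 0.1     # assumed lumped convection coefficient

    def steady_state_temp(t_air, r0, t0=20.0, iters=50):
        t = t_air
        for _ in range(iters):   # fixed-point iteration: R depends on T
            r = r0 * (1 + ALPHA * (t - t0))
            t = t_air + CURRENT_A ** 2 * r / K_W_PER_C
        return t

    for t_air in (25.0, 60.0):   # cool intake air vs GPU exhaust
        for label, r0 in (("good", 0.005), ("bad", 0.050)):
            print(f"air {t_air:.0f}C, {label} contact: "
                  f"settles near {steady_state_temp(t_air, r0):.0f}C")
    ```

    The made-up coefficients only show the direction of the effect: the same bad contact that settles around 65°C in cool intake air ends up past 100°C when the air washing over it is GPU exhaust, which is exactly the compounding described above.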



  • Registered Users, Registered Users 2 Posts: 6,569 ✭✭✭Cordell


    What I don't understand is why they went with this style instead of a beefy 2- or 4-contact connector, something like this:

    [image: example of a high-current automotive connector]




  • Registered Users, Registered Users 2 Posts: 767 ✭✭✭ricimaki


    You'd need to run some very heavy-gauge wires, which are very difficult to bend. Multiple smaller wires are easier to bend and route around inside a PC case. There's probably a cost element too, as those connectors can be relatively expensive, along with the tooling, crimps etc. required to actually use them.
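
    The flexibility point can be made quantitative: the bending stiffness of a round conductor scales with the fourth power of its radius, so splitting the same total copper cross-section into n wires leaves the bundle roughly n times easier to bend. A quick sketch, idealised to ignore insulation and stranding:

    ```python
    import math

    # Bending stiffness of a round conductor is proportional to its
    # second moment of area, I = pi * r^4 / 4. Splitting one conductor
    # of area A into n wires of area A/n makes each wire n^2 times less
    # stiff, so the n wires together are still n times easier to bend.

    def second_moment(area_mm2):
        r = math.sqrt(area_mm2 / math.pi)
        return math.pi * r ** 4 / 4

    TOTAL_AREA_MM2 = 13.3   # roughly one 6 AWG conductor, for illustration

    single = second_moment(TOTAL_AREA_MM2)
    for n in (1, 2, 6, 12):
        bundle = n * second_moment(TOTAL_AREA_MM2 / n)
        print(f"{n:>2} wire(s): relative stiffness {bundle / single:.2f}")
    ```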



  • Registered Users, Registered Users 2 Posts: 14,011 ✭✭✭✭Cuddlesworth


    It's a 12V feed in a PC, not a 240V mains connector.



  • Registered Users, Registered Users 2 Posts: 6,569 ✭✭✭Cordell


    Those are automotive connectors rated for 70A, so they would be perfect for the job, if I'm not missing something. As for the cables, nothing prevents having a bundle of wires tied together at the connector end, just as they are in this problematic new connector.



  • Registered Users, Registered Users 2 Posts: 14,011 ✭✭✭✭Cuddlesworth


    The connector isn't problematic; if it was, it would have been immediately noticeable in testing by countless techs and companies.



  • Registered Users, Registered Users 2 Posts: 767 ✭✭✭ricimaki


    Those are designed for charging electric vehicles, are IP67 rated, and are also rated for sub-zero temperatures. Each connector would take at least 4 PCI slots in vertical height. Ordered in large quantities they'd be €15+ per connector, which is ridiculous compared to €0.39 for an 8-pin PCIe connector. They absolutely could be used to power an RTX 4090, but to say they are not optimal for PCs would be an understatement...



  • Registered Users, Registered Users 2 Posts: 8,448 ✭✭✭Mr Crispy


    Reviews out today for the 4080? Still priced terribly.



  • Registered Users, Registered Users 2 Posts: 6,425 ✭✭✭jonski


    Yeah, kinda disappointed. I'll probably follow GN's advice and get a 3080.



  • Registered Users, Registered Users 2 Posts: 31,275 ✭✭✭✭Lumen


    I finally got round to buying an LG C2 and am gaming away on MW II at a solid 120fps in 4k on my 3080 10GB. An amazing experience.

    So... what about the 4080 or 4090? Who are they for?

    If I was into ultra-high-refresh desktop gaming I'd get something like an AW3423DW, but that doesn't have enough pixels to stress a 4080, let alone a 4090.

    The fast-paced shooters are optimised to give decent FPS on normal hardware, and the slower paced walking simulators don't need ultra-high FPS.

    I still sort of want a 40xx for the engineering (particularly the efficiency, I'd probably power limit it for the silence), but I can't rationalise the purchase.

    It's a shame the 3080s haven't come down more in price, they still seem to be about €800. I think that's still above the launch price in 2020.



  • Registered Users, Registered Users 2 Posts: 7,682 ✭✭✭frozenfrozen


    I still haven't played a game with mine, so I presume it's mostly whales and people who use it for work.

    Mine scores 13.4k in the Blender benchmark when I tried it briefly with the power limit at +8%. A 3090 scores something like 5.7k in the same benchmark.



  • Registered Users, Registered Users 2 Posts: 4,185 ✭✭✭_CreeD_


    I mentioned this in the AMD forum for the 7900 XTX. You're right that the high-end 30x0 cards with DLSS are already fine for 4K/60+ on current titles; with a 3090 I'll usually see no lower than 80-90 (with ultra bells and whistles)... IF it's not using ray tracing, with which that goes downhill quickly. So to me the 4090 is the first true 4K/60+ ray-tracing solution. If RT is not your thing, then the new AMD cards or high-end last-gen Nvidias will probably be fine for the next few years (though the memory limits on anything below a 3090 will mean turning down more graphics options over time).



  • Registered Users, Registered Users 2 Posts: 14,011 ✭✭✭✭Cuddlesworth


    I used to think cards like the Titans/4090 were picked up by people who would actually need and use them; now I think they're bought by people who have to have the latest. Latest card, latest iPhone, latest fashion, etc. I don't even think they need to have money anymore, just living paycheck to paycheck surrounded by things.

    In the meantime, I will continue to buy mid-market, turn down graphics settings and enjoy playing games with my friends, because releases like the 4080/4090 don't concern me at €1k+ pricing.



  • Registered Users, Registered Users 2 Posts: 4,185 ✭✭✭_CreeD_


    I disagree. If you love high-fidelity, high-frame-rate gaming then it is of use, and even top-end previous-gen cards don't do RT justice, so there's a gap in a niche but emerging visual enhancement that the new cards can fill. The lengths anyone will go to in pursuit of what they perceive as excellence in something they love shouldn't be written off as rabid consumerism and fashion-whoring just because someone else doesn't see the same value in that area. You see it all around us in cars, audio, TVs, watches... pretty much anything. It's not like someone paying the Apple tax just so they can sit in a coffee shop with the logo displayed prominently on a laptop they barely know how to use and whose power they don't remotely need. I think there is a place for the 4090 for the reasons I mentioned. Granted, it's still a steep price-to-benefit ramp, but it's wrong to say it has no real value to those willing to buy it.

    Personally I will be buying one, because I like RT, subtle as it is in today's offerings, and I want to support its further development by investing in the technology. What we are seeing today are only partial implementations. I primarily game on an LG C1 with a 120Hz target, so why would I not spend close to the same money on the device that feeds it? (Though of course I wish I didn't have to as well :) )



  • Registered Users, Registered Users 2 Posts: 7,531 ✭✭✭Icyseanfitz


    Big meh really, isn't it. A bog-standard xx80 card at an MSRP of $1,199. I think GN really hit it on the head with his review.

    I was debating moving on from my old 1080 Ti build last year, but seeing these mad prices, why bother. Maybe I'll build a PC from cheaper second-hand parts, or AMD's offerings.



  • Moderators, Computer Games Moderators, Technology & Internet Moderators, Help & Feedback Category Moderators Posts: 26,063 CMod ✭✭✭✭Spear


    Gamers Nexus claim to have recreated the melting adapter.




  • Registered Users, Registered Users 2 Posts: 18,969 ✭✭✭✭K.O.Kiki


    I'd definitely wait on RX 7900 XT/XTX or even RX 7800 series at this point.

    Nvidia have stagnated completely; the 4080/4090 are simply more money = more performance (and the 4080 fails even at that).



  • Registered Users, Registered Users 2 Posts: 7,531 ✭✭✭Icyseanfitz


    Yeah, I'm really interested in the 7800 XT/XTX. If it performs around 4070-4080 level for €700 I might get it.



  • Registered Users, Registered Users 2 Posts: 6,569 ✭✭✭Cordell


    Or, for the same money (MSRP) as a 3080 Ti, it offers about a 30% improvement in performance with RT and 30% more VRAM.



  • Registered Users, Registered Users 2 Posts: 613 ✭✭✭slipperyox


    12V is worse.

    A 100 watt bulb in a car: 100/12 = 8.3 amps.

    A 100 watt bulb in a house: 100/230 = 0.43 amps.

    The cable required to carry the current in the car has to be much thicker.

    And those connectors look like low-current designs, built for a dust/moisture IP rating.
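
    Extending the same I = P / V arithmetic to the card shows why the rail voltage is the root constraint. The 600W figure matches the connector's rated maximum; the 24V and 48V rows are hypothetical rails for comparison, not anything in the ATX spec.

    ```python
    # Same arithmetic as the bulb example: I = P / V.
    # Lower rail voltage means more amps for the same wattage,
    # hence thicker copper and hotter contacts.

    POWER_W = 600

    for volts in (12, 24, 48, 230):
        amps = POWER_W / volts
        print(f"{POWER_W} W at {volts:>3} V -> {amps:5.1f} A")
    ```

    At 12V the connector has to move 50A, which is why the load gets split across multiple pins in the first place.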



  • Registered Users, Registered Users 2 Posts: 6,569 ✭✭✭Cordell


    They are 70A automotive connectors; I posted them just as an example of a better choice (IMHO) than the multi-pin connector currently used.

    https://www.farnell.com/datasheets/2143843.pdf



  • Registered Users, Registered Users 2 Posts: 613 ✭✭✭slipperyox


    I think the 70A might be a surge current, with 7.5A continuous?



  • Moderators, Computer Games Moderators Posts: 14,782 Mod ✭✭✭✭Dcully


    I'm also still rocking a 1080 Ti for 1440p gaming.

    I'm not sure which is more in need of replacement, the GPU or my 8600K @ 4.5GHz, but to be fair both are holding up well.

    It looks like the 40 series is not for me, so the 3080 is one I'll keep an eye on while I see how the AMD cards work out.



  • Registered Users, Registered Users 2 Posts: 6,569 ✭✭✭Cordell


    I'd say that CPU is already a bottleneck even when paired with a 1080ti, so that must be the first upgrade.


