
Nvidia RTX Discussion


Comments

  • Registered Users, Registered Users 2 Posts: 7,682 ✭✭✭frozenfrozen


    Nice one, I just bought one of those Inno3D ones that maxburns had. Hadn't heard of that site before.



  • Registered Users, Registered Users 2 Posts: 55,651 ✭✭✭✭Headshot


    Holy hell my card could be getting delivered tomorrow lol

    Cannot wait



  • Registered Users, Registered Users 2 Posts: 7,531 ✭✭✭Icyseanfitz


    These are actually selling out; I despair for the future 😭 Although I hope everyone here enjoys them, lucky feckers.



  • Registered Users, Registered Users 2 Posts: 2,391 ✭✭✭rob808


    Nice, enjoy! Upload some photos when you get it.

    I'll probably try to get a 4080 16GB or 7800 XT sometime next year.



  • Registered Users, Registered Users 2 Posts: 18,969 ✭✭✭✭K.O.Kiki


    Low stock & early adopters. Of course they're "selling out".

    I'll only despair if they end up consistently out-selling other cards 3+ months from now.

    ---

    IMHO:

    **** card from Nvidia.

    Big performance gains but an even bigger price increase compared to the RTX 3080 10GB.

    It's not a real "upgrade" from RTX 3000 if the cost-per-frame gets substantially worse.
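    The cost-per-frame point above can be sketched as a trivial calculation. All prices and fps figures below are made-up placeholders for illustration, not benchmark data:

    ```python
    # Cost per frame = price / average fps. For a new card to be a real
    # "upgrade" in value terms, this number should go down, not up.

    def cost_per_frame(price_eur: float, avg_fps: float) -> float:
        """Euro paid per frame of average performance (lower is better)."""
        return price_eur / avg_fps

    # Hypothetical placeholder figures (not real benchmarks):
    old_card = cost_per_frame(price_eur=719, avg_fps=100)   # last-gen card
    new_card = cost_per_frame(price_eur=1469, avg_fps=160)  # new-gen card

    print(f"old: {old_card:.2f} EUR/frame, new: {new_card:.2f} EUR/frame")
    # With these numbers the new card is faster in absolute terms,
    # yet its EUR-per-frame is worse -- exactly the complaint above.
    ```

    The point is that "big performance gains" and "worse value" can both be true at once; the ratio is what decides whether it's an upgrade.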



  • Registered Users, Registered Users 2 Posts: 6,569 ✭✭✭Cordell


    When computing price/performance or cost per frame, RT and DLSS need to be factored in, since they account for a significant portion of the silicon and of the cost. Comparing raw performance while ignoring the advancements of the last three generations is pointless.



  • Registered Users, Registered Users 2 Posts: 7,531 ✭✭✭Icyseanfitz


    DLSS is even worse considering version 3 is just software that Nvidia has artificially locked to their new cards.



  • Registered Users, Registered Users 2 Posts: 18,969 ✭✭✭✭K.O.Kiki


    It's still 2x cost = 2x performance, unless I'm forgetting something major.



  • Registered Users, Registered Users 2 Posts: 31,275 ✭✭✭✭Lumen


    The launch MSRP of the 4090 is (I think) only $100 higher than the launch price of the 3090, and it completely destroys it in terms of outright performance, efficiency and cooling. And it supports DLSS 3.

    I don't see the problem.



  • Registered Users, Registered Users 2 Posts: 7,531 ✭✭✭Icyseanfitz


    Tbh, as long as you think of it as a Titan card the cost is acceptable; they were always 1k-2k. It's the 4080s that are more egregious, particularly the one that's really a 4070 pretending to be a 4080, at a more expensive MSRP than a 3080.



  • Registered Users, Registered Users 2 Posts: 31,275 ✭✭✭✭Lumen


    Agreed. And aside from pricing and nomenclature, my biggest gripe right now (apart from lack of FE availability) is that both 4080 cards are the same size as the 4090, despite the 4090 having plenty of cooling capacity. It's pretty much killed high-performance builds in genuinely small form factors, at least until cases are redesigned around them.



  • Registered Users, Registered Users 2 Posts: 7,531 ✭✭✭Icyseanfitz


    The dies don't even seem that big in the cards; from what I've seen, the coolers extend around 30% or more past the end of the die. The AIO versions, or sticking them on a loop, have a genuine use case now other than bling.



  • Registered Users, Registered Users 2 Posts: 6,569 ✭✭✭Cordell


    This is not true; there is a new fixed-function silicon block that is part of the DLSS 3 pipeline.



  • Registered Users, Registered Users 2 Posts: 1,011 ✭✭✭harmless


    The 4080 will have 450W+ of cooling too? That's crazy; 250W will probably be plenty to get good performance.



  • Registered Users, Registered Users 2 Posts: 6,569 ✭✭✭Cordell


    More cooling = more better :) as it leaves room for better boost, overclocking or silent running.



  • Registered Users, Registered Users 2 Posts: 18,969 ✭✭✭✭K.O.Kiki


    I'm not comparing to the 3090 - that card was overpriced from Day 1.

    Compare it to the 3080.




  • Registered Users, Registered Users 2 Posts: 1,011 ✭✭✭harmless


    Less noise, yes, but pumping loads of power into the 4090 does not seem to do much. Many will be paying a premium for a large heatsink that may not fit in their PC. It's nice to have as an option on the high-end AIBs, but I don't think it should be standard.



  • Registered Users, Registered Users 2 Posts: 7,531 ✭✭✭Icyseanfitz


    If that's the case, how have modders gotten it to work on a 2070? 🤷‍♂️ Surely that doesn't have the special hardware. No doubt Nvidia will do something with drivers to stop people from doing this in the future.


    https://appuals.com/dlss-3-0-works-with-turing-ampere/



  • Registered Users, Registered Users 2 Posts: 6,569 ✭✭✭Cordell


    It "appears" to be working with no proof and no latency and no qualitative analysis. Maybe it enabled the software path but without the input from the new hw block resulting is something that doesn't match the quality of 4000 series. So probably the inserted frames were just duplicates.



  • Registered Users, Registered Users 2 Posts: 7,531 ✭✭✭Icyseanfitz


    I mean, if you want to believe Nvidia wouldn't pretend it's HW-locked (aka shady ****) in order to sell cards, that's up to you, but IMO it's just bullshit meant to make people upgrade from 2xxx and 3xxx series GPUs. If a 2080 or 3080 could use DLSS 3, it would make the new cards a whole lot less interesting.



  • Registered Users, Registered Users 2 Posts: 6,569 ✭✭✭Cordell


    The reality is that the DLSS 3 pipeline has a dedicated HW block; it's not a matter of believing it is there, because it is. Pure software AI-based frame generation does exist, but in order to match DLSS 3 quality and performance you need dedicated HW, otherwise we would have seen it in AMD cards. So the only **** that's shady is that claim, with no proof, that you can run DLSS 3 on the 2000 series.



  • Registered Users, Registered Users 2 Posts: 7,531 ✭✭✭Icyseanfitz


    Cool, I'm not going to argue with you. I'm not a PC hardware/software engineer and I haven't deep-dived into these new cards. Maybe you're right 🤷‍♂️ All I know is, when it comes to being upfront and consumer-focused, I don't trust Nvidia in the slightest anymore.



  • Registered Users, Registered Users 2 Posts: 18,969 ✭✭✭✭K.O.Kiki


    This is a similar argument to "well, why can't the GTX 1080 Ti run RTX" or "the 2080 is basically the same as a 3080 with DLSS".

    Nvidia does add dedicated hardware for these functions to each subsequent generation.



  • Registered Users, Registered Users 2 Posts: 7,531 ✭✭✭Icyseanfitz


    The argument with the 1xxx series was that it didn't have the tensor cores the 2xxx series brought. Granted, the 4xxx series has updated tensor cores, but they're still tensor cores. It will be interesting to see if modders get it working well on older GPUs.



  • Registered Users, Registered Users 2 Posts: 6,569 ✭✭✭Cordell


    They tried to run RT on the 1000 series; it worked, but at something like 3 fps. Anything that is hardware-accelerated can be run in software, but with unusable performance.



  • Registered Users, Registered Users 2 Posts: 7,531 ✭✭✭Icyseanfitz


    Even if it only gave a 10% increase over DLSS 2, it should be usable by the consumer.



  • Registered Users, Registered Users 2 Posts: 31,275 ✭✭✭✭Lumen


    Yeah, but you can get a 10% FPS increase in almost any setup by fiddling with quality/performance settings.

    I've heard rumours that Nvidia know DLSS 3 isn't really ready yet but ran out of time before the 40-series launch. Probably hoping that the insane frame rates from the 4090 would disguise the complete garbage coming out in the interpolated frames.



  • Registered Users, Registered Users 2 Posts: 55,651 ✭✭✭✭Headshot


    Holy Moly this is a big big card



  • Registered Users, Registered Users 2 Posts: 4,185 ✭✭✭_CreeD_


    My $.02 on power vs heat vs the monstrous AIB cooling solutions: the card was originally going to run at 600W max, and a few factors caused Nvidia to pull back. The obvious one is the glut of 30-series cards, which is why they are releasing the 4090 first, since it is so massively ahead in performance, whereas the 4080/4070 won't have as big a differentiator when matched directly against the current market; if they boosted the power even higher, the lower 40-series cards would look even worse. Another is the lack of ATX 3.0 supplies and the prevalence of 850W-and-below PSUs even in most high-end gaming rigs, so there's an associated cost above the card itself if that's a hard requirement. Lastly, the energy crisis will make the TCO higher and more prevalent in people's minds.

    AIBs did not spend all that extra money on those coolers, knowing they simply may not fit in end users' cases and thus prohibit even more sales, thinking they wouldn't be needed. I think by the end of this year, or early next, we'll see updated BIOSes and perhaps base card designs that unlock higher voltages (a key limiter der8auer saw), and those insane coolers will start to make more sense. For now, even if it ships voltage-locked, I would go with a card that openly claims to be tuned for up to 600W, as the underlying power delivery is more likely to just be soft-locked (and upgradable later) if this turns out to be true.

    For myself, I'm waiting for a good AIO-cooled card. I have a large, well-ventilated case with a 420 AIO on the CPU and bottom-intake/top-exhaust fans purely for GPU airflow, and that handles a 3090 adequately, but it'd be nice not to have to worry about dumping 450-600W of heat into the case, and to just pipe it directly out. With the increased heat output of basic motherboard components, and SSD thermal throttling being a thing (set to become more of a factor with PCIe 5 drives), it should help with overall system performance regardless of any actual OC improvement on the GPU.


