Nvidia RTX Discussion


Comments

  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    4 ray traced features. I can't see the 20 series handling that too well unless you want to play at < 30 FPS.


  • Registered Users, Registered Users 2 Posts: 18,731 ✭✭✭✭K.O.Kiki


    BloodBath wrote: »
    4 ray traced features. I can't see the 20 series handling that too well unless you want to play at < 30 FPS.

    The only upside to WFH is that I've been able to save up enough for a new GPU if I wanted one :pac:


  • Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭EoinHef


    Would that not be why they're touting DLSS?

    That should help the 20XX series. I'm sure it will be well tuned for Cyberpunk as well, given the size of the game launch and how much anticipation there is around it.


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    According to Laymen Gaming, who played the PC version for 5 hours, it ran around 60 FPS at 1080p with a 2080 Ti with all the bells and whistles turned on. It wasn't locked to 60 either, and went up at times, especially when driving, they said. I assume some of the RT stuff is disabled while driving.

    1080p/60 with a 1k GPU is not exactly great.


  • Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭EoinHef


    BloodBath wrote: »
    According to Laymen Gaming, who played the PC version for 5 hours, it ran around 60 FPS at 1080p with a 2080 Ti with all the bells and whistles turned on. It wasn't locked to 60 either, and went up at times, especially when driving, they said. I assume some of the RT stuff is disabled while driving.

    1080p/60 with a 1k GPU is not exactly great.

    No, that doesn't sound great. Who are Laymen Gaming though?

    Are they not a game review outlet rather than a hardware one? Not sure I'd trust game journos to know what they're doing.


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    Sorry, I misquoted them. They said a minimum of 60 fps, going up as high as 100 while driving, so not as bad, but still not great.

    I don't think the 2060/2070 class cards will fare too well. This game will be used to push the 3000 series cards.


  • Registered Users, Registered Users 2 Posts: 6,135 ✭✭✭Cordell


    Probably this: https://www.youtube.com/watch?v=WarYN1tRS1o
    It's not clear if it was 1080p with DLSS, meaning a lower internal resolution, or if 1080p was the render resolution.
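
    To illustrate the distinction: "1080p with DLSS" can mean very different internal render resolutions depending on the preset. A minimal sketch, using the per-axis scale factors commonly cited for DLSS 2.0's modes (the exact factors used in this demo are an assumption, not confirmed):

    ```python
    # Output resolution vs. internal render resolution under DLSS.
    # Scale factors are the commonly cited per-axis ratios for DLSS 2.0
    # presets -- assumptions for illustration, not confirmed specs.
    DLSS_SCALE = {
        "Quality": 2 / 3,       # ~2/3 per axis
        "Balanced": 0.58,
        "Performance": 0.5,     # half per axis -> a quarter of the pixels
    }

    def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
        """Internal render resolution for a given output size and preset."""
        s = DLSS_SCALE[mode]
        return round(out_w * s), round(out_h * s)

    for mode in DLSS_SCALE:
        w, h = internal_resolution(1920, 1080, mode)
        print(f"1080p output, {mode}: renders internally at {w}x{h}")
    ```

    So "1080p with DLSS" could mean as few as 960x540 rendered pixels, while "1080p render resolution" means the full 1920x1080.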


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    Around 6:45 into the video specifically.


  • Registered Users, Registered Users 2 Posts: 18,731 ✭✭✭✭K.O.Kiki


    BloodBath wrote: »

    1080p/60 with a 1k GPU is not exactly great.

    On the contrary, I think this is superb, because if the 2080 Ti struggles that much, Nvidia must have made some great advancements over the last 2 years if they're ready to implement all this RT on their next-gen cards.

    Also don't forget that Cyberpunk is an open-world game, so it would be a fair bit more taxing than, say, CONTROL.


  • Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭EoinHef


    There could be a menu of options in-game allowing a choice of which RT effects to use. That would be a nice middle ground.

    Given the delays to the game, I'm not sure we can draw any solid performance conclusions either. Maybe they're indicative, but for all we know there could still be a load of optimisation to be done; they have said the game is done and it's the bug fixing, polish etc. that's causing the delay.


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    K.O.Kiki wrote: »
    On the contrary, I think this is superb, because if the 2080 Ti struggles that much, Nvidia must have made some great advancements over the last 2 years if they're ready to implement all this RT on their next-gen cards.

    Also don't forget that Cyberpunk is an open-world game, so it would be a fair bit more taxing than, say, CONTROL.

    I'm not ****ting on RT. It doesn't change the fact that the 2000 series cards were a beta test for it. We're looking at a 4x improvement on the 3000 series.

    Taking the load of some rendering tasks off the normal pipeline also frees up resources to actually improve performance in that area as well. They just need to get the balance right.


  • Registered Users, Registered Users 2 Posts: 18,731 ✭✭✭✭K.O.Kiki


    We have no benchmarks to support a 4x improvement.


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    Not yet, but let's assume it's true. It's an area where they could easily achieve a 4x improvement, and one they will dedicate more and more GPU die space to over time.

    The current 2000 lineup is badly bottlenecked by it. It would take a 4x improvement to bring it more in line without costing many frames.


  • Registered Users Posts: 655 ✭✭✭L


    Cordell wrote: »
    No, it means they will split the GPU in 2 chips, one generic non RT GPU and one tensor cores / RT co-processor. Which makes a lot of sense if they can interconnect them with a fast bus.


    So, I said this back at the RTX launch: is there any technical reason that RT should be implemented on the same card rather than as an add-on card?

    I get the cynical sales reasons, but surely two chips means they're not a million miles away if they can bridge the cards adequately.


  • Registered Users, Registered Users 2 Posts: 462 ✭✭tazzzZ


    L wrote: »
    So, I said this back at the RTX launch: is there any technical reason that RT should be implemented on the same card rather than as an add-on card?

    I get the cynical sales reasons, but surely two chips means they're not a million miles away if they can bridge the cards adequately.


    I believe they can't get a connection quick enough, or with the desired latency, without having it on the same PCB. Again, just a rumour I heard, and maybe your method is perfectly doable.


  • Registered Users, Registered Users 2 Posts: 6,135 ✭✭✭Cordell


    L wrote: »
    So, I said this back at the RTX launch: is there any technical reason that RT should be implemented on the same card rather than as an add-on card?

    I get the cynical sales reasons, but surely two chips means they're not a million miles away if they can bridge the cards adequately.

    If I were to speculate, it could probably be done, but with significant compromises. Even if they used some sort of NVLink to bridge the cards, it wouldn't be the same as having the chip(let)s very close together. Also, the end user would need a system that supports this arrangement (think SLI-ready motherboards). So there is probably no market for such a solution.


  • Registered Users, Registered Users 2 Posts: 7,180 ✭✭✭Serephucus


    I don't think NVIDIA would ever go this route, tbh.

    It's what they had when they bought AGEIA back in the day: to take advantage of PhysX, people needed those specific cards, so no one bothered buying the games. Once NVIDIA integrated PhysX into their GPUs, that was no longer an issue.

    tl;dr - It could be done, but it would ruin already struggling adoption rates.


  • Registered Users Posts: 655 ✭✭✭L


    Serephucus wrote: »
    I don't think NVIDIA would ever go this route, tbh.

    It's what they had when they bought AGEIA back in the day: to take advantage of PhysX, people needed those specific cards, so no one bothered buying the games. Once NVIDIA integrated PhysX into their GPUs, that was no longer an issue.

    tl;dr - It could be done, but it would ruin already struggling adoption rates.

    That's more or less what I figured: by adding them to the main GPU, they can sell "new cards" with "new features", whether or not they're actually desirable on their own merit, or whether the new card really has much to offer in raw performance.


  • Registered Users, Registered Users 2 Posts: 13,997 ✭✭✭✭Cuddlesworth


    If they'd sold a GTX 2080 Ti alongside the RTX 2080 Ti for 200 quid cheaper, I'd guess the GTX card would have heavily outsold the RTX one.


  • Registered Users, Registered Users 2 Posts: 18,731 ✭✭✭✭K.O.Kiki


    It also would've stunted the growth of ray tracing development, which will be important in coming years.


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    More sources are saying the demo was running at 1080p with DLSS 2.0 enabled, meaning it was actually rendering at half of 1080p resolution and still only achieving around 60 fps in places. With a 2080 Ti. That was with 3 of the 4 RT features enabled.
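
    To put numbers on that, assuming "half of 1080p" means half per axis, i.e. DLSS 2.0's Performance preset (an assumption, not a confirmed detail of the demo):

    ```python
    # Raw pixel counts: native 1080p vs. half-per-axis internal rendering.
    # Assumes "half of 1080p" = 960x540 (DLSS 2.0 Performance preset);
    # an illustration, not a confirmed detail of the demo's settings.
    native = 1920 * 1080        # 2,073,600 pixels
    internal = 960 * 540        #   518,400 pixels
    print(f"Native 1080p:     {native:,} px")
    print(f"960x540 internal: {internal:,} px ({internal / native:.0%} of native)")
    # -> the GPU is shading only 25% of the pixels and still hovering near 60 fps
    ```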

    The 2000 series is not cut out for this at all.


  • Registered Users, Registered Users 2 Posts: 18,731 ✭✭✭✭K.O.Kiki


    30fps is an option - and will probably be the default on consoles :pac:


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    Looking forward to this one. AMD are gonna be beaten to the punch yet again.

    Big RT improvements and DLSS 2.0 should be a game changer.

    Maybe 4K 144Hz will finally be doable, even on mid-range cards. Granted, not native 4K, but if it's close, who cares.


  • Registered Users, Registered Users 2 Posts: 28,352 ✭✭✭✭TitianGerm


    BloodBath wrote: »
    Looking forward to this one. AMD are gonna be beaten to the punch yet again.

    Big RT improvements and DLSS 2.0 should be a game changer.

    Maybe 4K 144Hz will finally be doable, even on mid-range cards. Granted, not native 4K, but if it's close, who cares.

    I nearly bought a 2070 Super for £400 this morning. Kinda glad I held off now.


  • Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭EoinHef


    I really hope AMD will get some performance figures out there for RDNA2 around a similar time, so we can at least weigh both up then.

    I've been sitting on a 1080 long enough that I can wait a month or so if it's worth it.


  • Registered Users, Registered Users 2 Posts: 7,810 ✭✭✭Calibos


    Nov announcement and Dec launch for RDNA2, I heard.

    For my new VR build I'm waiting for Zen 3 on the CPU side of things, because that will probably finally topple Intel in gaming, and drivers are not an issue there. But drivers are always an issue with AMD GPUs, aren't they? While AMD will likely finally reach parity with Nvidia in GPU hardware performance/specs, and potentially even pull ahead slightly this gen, the fact remains that Nvidia's driver division is bigger than the whole of AMD's GPU division, and I can well imagine the following scenario playing out after waiting: "Get the Big Navi! It'll be faster than the 3080 Ti when they sort out their drivers!!" ...and the drivers finally manage to unleash RDNA2's latent potential on the eve of the Nvidia Hopper and AMD RDNA3 releases... LOL.

    Nah, I need every frame I can get for MSFS2020 in VR on a 2160x2160-per-eye (4320x2160 @ 90Hz) HP Reverb VR headset.
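
    Rough pixel-throughput arithmetic behind that (resolutions as quoted above; the 4K/60 comparison point and the omission of VR supersampling are simplifications of mine):

    ```python
    # Pixels per second: HP Reverb in VR vs. a 4K/60 monitor.
    # Uses the resolutions quoted above; ignores the render-target
    # oversampling that VR runtimes typically add on top.
    reverb_px = 4320 * 2160            # both eyes combined
    reverb_rate = reverb_px * 90       # 90 Hz refresh

    uhd_px = 3840 * 2160
    uhd_rate = uhd_px * 60             # a 4K/60 monitor, for comparison

    print(f"Reverb @ 90Hz: {reverb_rate / 1e9:.2f} Gpx/s")
    print(f"4K @ 60Hz:     {uhd_rate / 1e9:.2f} Gpx/s")
    print(f"Ratio:         {reverb_rate / uhd_rate:.2f}x")   # ~1.69x
    ```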

    3080 Ti/3090 for me this time round.


  • Registered Users, Registered Users 2 Posts: 8,000 ✭✭✭Mr Crispy


    Lots of rumours this morning that AIB cards are ready to go, and will be launched alongside the Founders cards. Pinch of salt and all that.

    And Nvidia have tweeted again. Anyone here fluent in dial-up? :pac:

    https://twitter.com/NVIDIAGeForce/status/1293090205054902273?s=20


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    EoinHef wrote: »
    I really hope AMD will get some performance figures out there for RDNA2 around a similar time, so we can at least weigh both up then.

    I've been sitting on a 1080 long enough that I can wait a month or so if it's worth it.

    Sucks that they won't be launching closer together. It gives Nvidia an excuse to start their prices high and drop them when AMD's cards arrive, if they're competitive.


  • Registered Users Posts: 655 ✭✭✭L


    Well, this is interesting. Looks like Nvidia's getting raytracing support set up for some big-name games, including WoW.


  • Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭EoinHef


    The rumours going round about the potential price of a 3090 are eye-watering...


  • Registered Users Posts: 321 ✭✭Mucashinto


    It's really f'd up the way GPUs are going. Spend €500 just to be in the budget range 🙁. Consoles could really claw some popularity back if they play their cards right this generation, I think.

    Such diminishing returns at the high end of GPUs as well. Spend all that extra and get what, exactly?


  • Posts: 0 [Deleted User]


    But budget-range GPUs are not €500. Most people who don't have high-tier cards are using around €300 cards in their systems.

    And while PC gaming is doing really well these days, consoles are still more popular. Nearly all AAA games are developed for consoles first and foremost and then ported to PC.


  • Registered Users, Registered Users 2 Posts: 4,434 ✭✭✭Homelander


    What we might consider 'budget' range GPUs are the £200-250 class cards that are the most common for 1080p setups... but they're not budget, they're more lower mainstream.

    For example, recently the most common would've been the GTX 1060 or the RX 470/480 and RX 570/580/590 cards, and they were reasonably priced; they probably always will be, due to the genuine competition in those brackets.

    True budget cards are the GTX 1650 or whatever: cards that pack a decent punch for small money. The predecessor was the 1050; before that, the 750 Ti. They tend to perform very well in their day, but two or three years later they struggle with 1080p low settings.

    Upper mainstream would be your RTX 2060s or RX 5700s, but that's still mainstream territory.


  • Registered Users, Registered Users 2 Posts: 6,135 ✭✭✭Cordell


    And while PC gaming is doing really well these days, consoles are still more popular. Nearly all AAA games are developed for consoles first and foremost and then ported to PC.

    But things are much better now. The most popular engines are truly multiplatform and mature enough, and the architecture is much more alike between consoles and PCs. I would say the PC port is not really a thing anymore; games are developed and released as multiplatform from day 1.


  • Registered Users, Registered Users 2 Posts: 8,000 ✭✭✭Mr Crispy


    This is allegedly a pic of the 3090 FE.

    [image: NVIDIA-GeForce-RTX-3090-1-1.jpg]

    link

    You're gonna need a bigger boat case.


  • Registered Users, Registered Users 2 Posts: 28,352 ✭✭✭✭TitianGerm


    Mr Crispy wrote: »
    This is allegedly a pic of the 3090 FE.

    [image: NVIDIA-GeForce-RTX-3090-1-1.jpg]

    link

    You're gonna need a bigger boat case.

    That's a tiny electric heater.


  • Registered Users, Registered Users 2 Posts: 18,731 ✭✭✭✭K.O.Kiki


    TBF there are MSI/ASUS cards that are as big.

    *edit*

    Someone on Videocardz forum fixed it.

    [images: 76OBZdp.jpg, VjI5CFI.png]


  • Registered Users, Registered Users 2 Posts: 462 ✭✭tazzzZ


    Looks like I'll have to add a new case to the cart when I buy one of these!!! For the price, they should include one for free though!


  • Registered Users, Registered Users 2 Posts: 7,180 ✭✭✭Serephucus


    Anyone else thinking this might be another Fermi round?
    (not that it won't have the perf to back it up, mind)


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    The top end will suck juice, but the coolers seem way better now, so it shouldn't be a problem.

    Any confirmation or not of whether the RT and Tensor stuff is separated from the main die? That would make things a lot easier to cool.


  • Registered Users, Registered Users 2 Posts: 8,000 ✭✭✭Mr Crispy


    "PSU Makers Ready 12-Pin Micro-Fit Connectors For NVIDIA’s GeForce RTX 30 Series Ampere Graphics Cards" - WCCFTech (I know).

    [image: OeAjRMK.jpg]


  • Registered Users, Registered Users 2 Posts: 7,180 ✭✭✭Serephucus


    BloodBath wrote: »
    Any confirmation or not of whether the RT and Tensor stuff is separated from the main die? That would make things a lot easier to cool.

    If I remember right, the PCB photo from a while ago showed two separate die areas.


  • Registered Users, Registered Users 2 Posts: 740 ✭✭✭z0oT


    Serephucus wrote: »
    Anyone else thinking this might be another Fermi round? (not that it won't have the perf to back it up, mind)


    I couldn't help but remember the memes of the GTX 480 when I saw that alleged 3090 reference card.


    [image: 523982.png]


  • Registered Users, Registered Users 2 Posts: 7,755 ✭✭✭Inviere


    Mr Crispy wrote: »
    "PSU Makers Ready 12-Pin Micro-Fit Connectors For NVIDIA’s GeForce RTX 30 Series Ampere Graphics Cards"

    850W or higher recommended? :eek:


  • Registered Users Posts: 1,016 ✭✭✭Ultrflat


    Mucashinto wrote: »
    It's really f'd up the way GPUs are going. Spend €500 just to be in the budget range 🙁. Consoles could really claw some popularity back if they play their cards right this generation, I think.

    Such diminishing returns at the high end of GPUs as well. Spend all that extra and get what, exactly?

    I dunno who the hell has the money to buy €1,300 cards during a pandemic; I reckon they won't be able to shift them or turn a profit. If anything, I reckon cards will be cheaper. Like you said, there's the added competition of the new consoles. Times are changing.


  • Registered Users, Registered Users 2 Posts: 5,417 ✭✭✭.G.


    Inviere wrote: »
    850W or higher recommended? :eek:

    Thought the same! Even more expense on top of the pricey GPU.


  • Registered Users, Registered Users 2 Posts: 7,180 ✭✭✭Serephucus


    Ultrflat wrote: »
    I dunno who the hell has the money to buy €1,300 cards during a pandemic; I reckon they won't be able to shift them or turn a profit. If anything, I reckon cards will be cheaper. Like you said, there's the added competition of the new consoles. Times are changing.

    I dunno. Most companies in general, but NVIDIA in particular, it seems: if they could legally get away with charging your firstborn for their product, they would.

    The 20 series didn't sell fantastically well in comparison to the 10 series, but then there wasn't the same perf. jump from 10 to 20 as there was from 9 to 10. It's looking like this may be another 10-series-style jump performance-wise.

    So, we have a company with products it knows will be good, customers it knows really want them, and a competitor that may or may not have something it should worry about, but not right away. If you're Jensen, what would you do, other than price them as high as you could get away with to start?


  • Registered Users, Registered Users 2 Posts: 6,135 ✭✭✭Cordell


    I think we're looking at this the wrong way: the prices do in fact align with the 2000 series, but the 3090 is not the next-gen 2080, it's a completely new tier, and the rest of the lineup seems to match the 2000 series MSRPs.

