
Nvidia RTX Discussion


Comments

  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    The part you're not getting is that we're not talking about a single card, we're talking about an entire generation of radically inflated prices.

    As previously noted, the high-end, radically disproportionately priced cards have always existed in every generation without any complaint or issue. As you rightly say, for most of those who were happy to pay those prices, the cost was irrelevant.

    Since AMD stopped being competitive (or mildly competitive on poor footing, such as radically increased power demands), Nvidia pricing has rocketed in virtually every class, maybe bar the 750Ti/1050 level.

    It's not much different than if Microsoft pulled out of the console race and next thing, the PS5 is suddenly €800. Would that be OK, just because a lot of people would buy it anyway?

    The RTX2080 is around the same, maybe a little faster than a 1080Ti. In previous generations, this would basically mean the RTX2080 would be around £500, if the 1080ti was £650 at launch or whatever. Doesn't really matter about the new features, this has happened countless times in past generations also. Like GTX480 - GTX580 - GTX680 - GTX780 - GTX980. These were all priced pretty similarly on release, and each offered huge gains on the previous generation.

    Now have a look at what happened to pricing from the GTX970 - GTX1070 - RTX2070 when AMD stopped being competitive.

    Or the GTX980 - GTX1080 - RTX2080. In fact, you could chart either line back the whole way with very few hiccups until the 1070 generation.
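    For a rough sense of the jump being described, here's a quick sketch comparing approximate US launch MSRPs across those lineups (figures are ballpark from memory, and Founders Edition prices were higher still; treat them as illustrative, not official):

```python
# Approximate US launch MSRPs in dollars (ballpark, not official figures;
# Founders Edition cards launched higher still).
x70_class = [("GTX 970", 329), ("GTX 1070", 379), ("RTX 2070", 499)]
x80_class = [("GTX 980", 549), ("GTX 1080", 599), ("RTX 2080", 699)]

def gen_jumps(lineup):
    """Percent price change between successive generations."""
    return {f"{a} -> {b}": round(100 * (p2 - p1) / p1, 1)
            for (a, p1), (b, p2) in zip(lineup, lineup[1:])}

print(gen_jumps(x70_class))  # the 1070 -> 2070 step dwarfs the 970 -> 1070 one
print(gen_jumps(x80_class))
```

    On these rough numbers the x70 class jumped about twice as hard in the Pascal-to-Turing transition as in the one before it, which is the pattern being complained about.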

    With AMD more or less helpless in this space, Nvidia just slap massive premiums on every class card just because they can. So instead of paying the same to get increased performance, you're either paying the same to get the same performance years later, or else paying a huge, crazy premium to get an upgrade....years later.

    You can defend or make excuses for it, but it's a load of crap. I don't get why anyone would try to defend it, regardless of whether or not they can afford an RTX. I'm sure plenty of people here can, but they've had enough of taking it over the table from Nvidia.

    I think you're oversimplifying it and completely ignoring inflation and the tech in the cards. It's not a simple die shrink. You're looking at more memory, which has also increased in price a lot, and the extra cores.

    Are they still overpriced? Yes. Will people buy them regardless? Yes.

    The reviews are overlooking DLSS as well. Granted, games have to support it, but it's basically free AA that's far superior to the likes of TAA.

    I would say offloading a single task to the ray tracing cores should increase performance as well. Rather than going all-out ray tracing and having to render the game at 1080p, you could just use them for ambient occlusion or some other single task.

    Again, games would have to support that kind of fine tuning, and they don't yet, so it's not that appealing for early adopters. But at the end of the day they will still be at least 30% faster than previous gen for normal rasterised tasks. Not the jump people wanted, and sure, not the jump that justifies the price, but if you have a 1080ti already you really don't need an upgrade. Skip a gen and see how things pan out.

    Once games properly implement DLSS and RT options you will see your performance go up at least 50% from previous gen. Utilising those cores is the key. Benchmarks that completely ignore them don't really tell the whole story.


  • Registered Users Posts: 6,984 ✭✭✭Venom


    BloodBath wrote: »
    Once games properly implement DLSS and RT options you will see your performance go up at least 50% from previous gen. Utilising those cores is the key. Benchmarks that completely ignore them don't really tell the whole story.


    Once games fully implement DLSS and RTX, and if it works 100% like Nvidia claim, you will just be running at around the same FPS as current non-RTX cards, only with all the new bells and whistles turned on, give or take a couple of frames. I don't believe for a second that Nvidia would have gone so far out of their way to avoid talking about performance numbers prior to the RTX launch if there was anything close to a 50% boost in performance when using RTX and DLSS tech.


  • Posts: 0 [Deleted User]


    Digital Foundry have gone over the DLSS in the demos and it does look very promising for performance if it's implemented. Supposed to be easy to do too.


  • Closed Accounts Posts: 9,538 ✭✭✭btkm8unsl0w5r4


    My card is shipped arriving Monday so ahead of schedule on the behind schedule new schedule ;)


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Venom wrote: »
    Once games fully implement DLSS and RTX, and if it works 100% like Nvidia claim, you will just be running at around the same FPS as current non-RTX cards, only with all the new bells and whistles turned on, give or take a couple of frames. I don't believe for a second that Nvidia would have gone so far out of their way to avoid talking about performance numbers prior to the RTX launch if there was anything close to a 50% boost in performance when using RTX and DLSS tech.

    They didn't exactly have benchmarks to show, did they? It's brand new tech on early drivers. The demos we did get were knocked together in 2 weeks.

    We have benchmarks now from others that show a 40-50% increase with DLSS alone, with little to no loss of visual fidelity, so yeah, it does work.

    I was just wondering if the RT core could be used for improving performance by doing a single reasonably demanding task like ambient occlusion rather than trying to do multiple things with performance suffering as a result.

    Here's Digital Foundry's analysis of DLSS. This looks really promising: 4K-like results while only rendering at 1440p. Hopefully it's just as good rendering at 1080p and upscaling to 1440p.
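    A rough sketch of why that upscaling approach is such a big win, assuming shading cost scales roughly with pixel count (a simplification; DLSS itself adds some tensor-core overhead on top):

```python
def pixels(width, height):
    return width * height

native_4k     = pixels(3840, 2160)  # full 4K output
dlss_internal = pixels(2560, 1440)  # internal render resolution, upscaled to 4K

# Each frame shades only 1/2.25 of the pixels a native 4K render would.
savings = native_4k / dlss_internal
print(savings)  # 2.25
```

    Shading well under half the pixels per frame is where the reported 40-50% gains plausibly come from, before the upscale cost is paid.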



  • Registered Users Posts: 18,704 ✭✭✭✭K.O.Kiki


    BloodBath wrote: »
    They didn't exactly have benchmarks to show, did they? It's brand new tech on early drivers. The demos we did get were knocked together in 2 weeks.

    We have benchmarks now from others that show a 40-50% increase with DLSS alone, with little to no loss of visual fidelity, so yeah, it does work.

    I was just wondering if the RT core could be used for improving performance by doing a single reasonably demanding task like ambient occlusion rather than trying to do multiple things with performance suffering as a result.

    Here's Digital Foundry's analysis of DLSS. This looks really promising: 4K-like results while only rendering at 1440p. Hopefully it's just as good rendering at 1080p and upscaling to 1440p.

    Wait a second - that's CONSOLE TECH!!
    *boo hiss*


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Yeah, there's a bit of PC snobbery to combat first with this, but the results speak for themselves.

    This enables high-refresh 4K monitors with a single GPU. Or you could stick at 1440p and use DLSS 2X for a superior AA solution.


  • Registered Users Posts: 6,984 ✭✭✭Venom


    K.O.Kiki wrote: »
    Wait a second - that's CONSOLE TECH!!
    *boo hiss*


    The big problem I see going forward is that it isn't console tech. When it comes to games being released on PC, far too many developers just don't put in the extra work needed to enhance a game's graphics to the level that makes use of the extra power a PC's graphics card has over any of the hardware found in consoles. Taking into account that both the current and the next generation of mainstream consoles use AMD GPUs, will game developers bother to take the extra time to use this tech, considering they treat the PC as the red-headed stepchild of gaming as it is?


  • Closed Accounts Posts: 9,538 ✭✭✭btkm8unsl0w5r4


    I think it's always important to consider what resolutions people actually game at. The Steam hardware survey puts the resolutions as follows.

    Resolution     Share    Change
    1024 x 768     0.65%    +0.01%
    1280 x 720     0.42%    +0.04%
    1280 x 800     0.85%    +0.02%
    1280 x 1024    2.19%    +0.21%
    1360 x 768     1.97%    +0.18%
    1366 x 768    14.18%    +0.85%
    1440 x 900     3.62%    +0.25%
    1536 x 864     0.31%    +0.02%
    1600 x 900     3.68%    +0.13%
    1680 x 1050    2.63%    +0.17%
    1920 x 1080   60.66%    -3.06%
    1920 x 1200    0.95%    +0.11%
    2560 x 1080    0.95%    +0.11%
    2560 x 1440    3.62%    +0.50%
    3440 x 1440    0.42%    +0.06%
    3840 x 2160    1.33%    +0.19%
    Other          1.56%    +0.18%

    So most people game at 1080p... fair enough, RTX is not for this, but only 1.33% of people game at 4K. There's a difference of roughly 3.3 million pixels between ultrawide 1440p, the next-highest resolution, and 4K.
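    That pixel gap is easy to sanity-check from the resolutions in the table:

```python
uw_1440 = 3440 * 1440   # ultrawide 1440p: 4,953,600 pixels
uhd_4k  = 3840 * 2160   # 4K UHD:          8,294,400 pixels

gap = uhd_4k - uw_1440
print(f"{gap:,}")  # 3,340,800 -> roughly 3.3 million extra pixels per frame
```

    So 4K asks the GPU to shade about two-thirds more pixels again than the next step down the survey.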

    I tried gaming at 4K and the problem was that 4K sucks. Using Windows in 4K is terrible, scaling sucks, and the pixel density is too great on a small monitor. Response times and refresh rates on TVs are crap too. The cost on the GPU side has always been excessive as well, and never worth it. I work in ultrawide, a much better form factor for a desktop PC. All this 4K talk is nice, but not very practical.

    This leaves a lot of headroom for new features etc, so RTX on with DLSS at UHD would be a reasonable experience FPS-wise. Also, some chap at the launch said Tomb Raider ran at 30fps in 1080p with RTX and everyone jumped on that... the Metro devs say 60fps @ 1080p minimum, and that's on early drivers and an early Metro build.

    PC gaming is making a big resurgence, mostly due to indie games, the older age of a lot of gamers, Steam sales, and the incredible value of low to medium end gaming PCs, which blow consoles out of the water.


  • Registered Users Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    3440 x 14400 - wow that's some tall image :pac:


  • Closed Accounts Posts: 14,983 ✭✭✭✭tuxy



    So most people game at 1080p... fair enough, RTX is not for this, but only 1.33% of people game at 4K. There's a difference of roughly 3.3 million pixels between ultrawide 1440p, the next-highest resolution, and 4K.

    Most people game at 1080p as it is the most affordable, most advertised and most supported resolution.
    Did you look at the other hardware stats? Like the most popular video cards?
    The GTX 1060 and 1050ti are by far the most popular cards. If RTX takes off, it will be when people who buy at these price points can afford it.
    I don't think many here are saying RTX has no potential, just that it has no place for a few more generations.


  • Closed Accounts Posts: 9,538 ✭✭✭btkm8unsl0w5r4


    Most popular card is actually a 1060, but the 1080ti is the 10th most popular, the 1070 the 5th and the 1080 the 8th, so the high end is not as uncommon as you would think.

    If everyone took the view that RTX had no place, Tuxy, then there would be total stagnation; progress is only possible with early adopters willing to pay for the R&D so it can trickle down to the masses later. The 2080 Ti is still the most powerful card available today and trounces a Titan V for half the price; that's its place right there. I understand people's disappointment that RTX is not the iterative die-shrink performance bump we always got, but for the next working-class hero to emerge you first have to create the work.....


  • Closed Accounts Posts: 14,983 ✭✭✭✭tuxy


    Most popular card is actually a 1060, but the 1080ti is the 10th most popular, the 1070 the 5th and the 1080 the 8th, so the high end is not as uncommon as you would think.

    I said cards, and talked about the top two, as more people own the top two cards than the next 10 cards down the list combined. That shows the price point that interests PC gamers the most (€150-€300).
    If the 1070 and 1080 are high end (release price €450-€700), what is the 2080ti?

    Only 1.45% of Steam users were prepared to go to the 1080ti price point. You can add a few .xx% for Titan owners. Are you saying this is what is funding new tech at Nvidia, and that the big money is not in 1050 - 1060 sales?

    The die on the 2080 and 2080 ti is massive. They could have added more performance in a more traditional way with more CUDA cores.

    Then when ray tracing is actually ready release it on the full range of cards. Perhaps also have RTX titan cards released a year in advance so there are actually games people can play when RTX cards drop.


  • Registered Users Posts: 7,582 ✭✭✭Inviere


    Thread title :o Poor Fitz. At least he’ll have the mother of all gpu’s to console him


  • Closed Accounts Posts: 14,983 ✭✭✭✭tuxy


    Inviere wrote: »
    Thread title :o Poor Fitz. At least he’ll have the mother of all gpu’s to console him

    In fairness, as he said, he's not too bothered, and if I had the money to throw away I'd get one too.
    But I'd also be buying lots of stuff that's of no use if I had more money.
    He's also admitted to playing devil's advocate, which is fine by me. It keeps the discussion going.

    He's already said he will get a Turing Titan as soon as it comes out, no matter the price or performance.


  • Closed Accounts Posts: 9,538 ✭✭✭btkm8unsl0w5r4


    I agree, great value in the mid and low end. The 2080 Ti is ultra-enthusiast level; that's how most class it. I am selling a 980 + 2700k machine at the moment (Adverts name is the same) and it's got cracking performance for very few shekels. I am amazed that 1.5% of gamers are dropping €700-1000 on 1080 Tis; that's a lot of people. Of course most people worldwide will be at the low end.

    Yes, every 2080 and 2080ti sold out from Nvidia and its board partners. In the next round of production every one will sell out too. So this launch from Nvidia has been a success. They cannot produce them fast enough, so yes, it's paying for the R&D and production of this first next-gen card.

    SLI figures are looking very very good indeed even on older titles, may need to really bug the **** out of all ye filthy casuals and get another. ;);)


  • Registered Users Posts: 7,582 ✭✭✭Inviere


    tuxy wrote: »
    In fairness...

    I know, as I said in the thread earlier, more power to him. I agree with Terror that the pricing structure has gone to farce levels, but if I had the cash, I’d still buy one. It’s an enthusiast buying an enthusiast product, there’s much worse you could be doing with your money :)


  • Registered Users Posts: 2,915 ✭✭✭cursai


    Truth bombs!!


    I also think RTX is another gimmick, like 4K or 3D, that most people don't notice or care about.


  • Registered Users Posts: 4,241 ✭✭✭god's toy


    Or NVIDIA PhysX...



    R.I.P


  • Closed Accounts Posts: 9,538 ✭✭✭btkm8unsl0w5r4


    god's toy wrote: »
    Or NVIDIA PhysX...



    R.I.P


    People seem to care a lot about 4k performance too, I dont know why but they do.


  • Registered Users Posts: 7,410 ✭✭✭Icyseanfitz


    To be fair, 4K is not a gimmick, it's just super hard to run well.


  • Registered Users Posts: 7,806 ✭✭✭Calibos


    I'd like to apologise to my non-VR PC brethren: because AMD are so far behind, nVidia took the opportunity presented by this generational breathing space to implement some rendering architecture changes required for VR to really spread its wings soon, but at the expense of larger improvements in rasterisation speeds this gen.

    There is a keystone technology for VR called foveated rendering that leverages the physiological fact that we only see in 20/20 vision in a tiny area of our vision called the foveal region; visual acuity rapidly drops off outside this small area, which is about the size of a €2 coin held at arm's length. A GPU need only render that small area in full detail and full resolution, and can render the vast majority of the image in graduated rings of lesser detail and resolution, and the eye/brain won't notice the difference.

    Maximise this video window and note that all the multicoloured cogs on screen are actually turning, but only in the foveal region of your vision can you see them turning.

    With fast enough eye tracking and certain rendering pipeline changes, the reduction in required GPU processing power is an order of magnitude, i.e. ~10x. VR goes from being the most GPU-intensive type of game, as it is in Gen 1, to being the least GPU-intensive, where low-tier GPUs from now on will be able to drive the massive resolution of future VR headsets, like the rumoured 4000 x 4000 pixels per eye of the Rift 2. With performance still to spare, it's VR where you are going to see the likes of in-game graphics from the Star Wars Stormtrooper/Phasma ray-tracing demo first.
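    A back-of-the-envelope model of that saving, using the rumoured 4000 x 4000-per-eye panel mentioned above (the ring sizes and shading rates below are purely illustrative assumptions, not published figures):

```python
# Hypothetical per-eye panel from the rumour quoted above.
full_pixels = 4000 * 4000

# Illustrative foveated split: a small fovea shaded at full rate,
# a mid ring at quarter rate, and the periphery at 1/16 rate.
fovea     = 0.05 * full_pixels * 1.0
mid_ring  = 0.20 * full_pixels * 0.25
periphery = 0.75 * full_pixels * 0.0625

shaded = fovea + mid_ring + periphery
print(round(full_pixels / shaded, 1))  # ~6.8x fewer shaded pixels
```

    Even this conservative split cuts shading work by roughly 7x; a smaller fovea fraction or more aggressive peripheral rates push toward a full order of magnitude.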

    For us VR-philes, this commitment by nVidia to the future of VR and AR is thrilling to see. Mark my words, in as little as 5 years, it'll be as strange to see a PC or Console gamer without a VR Headset as it is to see someone playing on a 720p Monitor or TV now.

    That's not signalling the death-knell of 'flat' gaming. It's just that we'll be playing those games on a small, comfortable, high-res, high-FoV, tetherless, inside/out-tracked wireless VR headset, on a 4K-equivalent virtual screen of any size we want, while virtually relaxing on a tropical island beach or on the surface of Saturn's moon Titan if we want. Who would not want to be able to play their flat games on a virtual 20ft-wide, 4K 144hz Gsync OLED-equivalent virtual 'monitor/tv' for the princely sum of about €400??

    This to me will be the 'killer app' that drives mainstream PC and console acceptance of VR. Then these people find that actual VR games are a whole new level... and cream themselves when they see VR porn. In a decade, when we have high-res, fully positional 3D video enabling pay-per-view best-seat-in-the-house court/ring/pitchside views of sporting events and concerts like you were actually there, and AR mode on the eventual combo VR/AR sunglasses replacing every energy and resource guzzling physical TV, projector, monitor and tablet in the house, that's when everyone and their literal granny will own and use a VR/AR device.


  • Registered Users Posts: 3,495 ✭✭✭Lu Tze


    Calibos wrote: »
    4k equivelent Virtual Screen of any size we want while vrtually relaxing on a tropical island beach or on the surface of Saturns Moon Titan if we want

    Unfortunately this coincided with the introduction of ray tracing and your virtual screen will have terrible glare off it from the tropical sun


  • Closed Accounts Posts: 9,538 ✭✭✭btkm8unsl0w5r4


    Calibos, that's very interesting indeed. I found with VR (I was an early Vive adopter, since cashed out on that) that the issue was movement. The teleporting around was not great, plus I got motion sick even with high frame rates. I think the buzz around VR has died down quite a bit and games developers are not seeing the interest there. Games were mostly designed to keep you static, and while it was cool for a few minutes, the thought of a long gaming session or spending 50 hours in a game was impractical... it's really the other side of the argument...

    The anti-RTX crew say games are about the storytelling and the extra visual fidelity is unnecessary... the VR people see visual fidelity as the goal but forget gameplay and storytelling. The truth lies somewhere in the middle, I think. Until haptic feedback, foot tracking and movement are sorted, VR will remain an interesting tech demo.


  • Registered Users Posts: 5,572 ✭✭✭EoinHef


    Is anybody really anti RTX though?

    Most people seem to be anti rip off nvidia pricing. While some have just taken the marketing bait.


  • Closed Accounts Posts: 9,538 ✭✭✭btkm8unsl0w5r4


    With a lot of youtubers needing to push 1 video a day, the narrative that RTX is not recommended seems to prevail. In this age of embargoes and deadlines, you have to push a story, and the nuance of the matter is lost.

    I don't get it... say you are in the market for a 1080ti... are you really going to buy that expensive card, or push a little more euro, save another month, and get a 2080 which is a little faster today, potentially a lot faster tomorrow, and supports all the new technology?

    There is an inconsistent logic in the reviews, and no acceptance that the 1080ti was already fast enough for anything, and that the answer to progression was not incremental speed increases, which came but were not seen as enough, even though fast enough is fast enough already.

    Looking at the thermals, the cards, the memory speed, and the die size and complexity, it's amazing that they were not more expensive. People don't think much of spending 1500 euro on a new 4K TV with OLED and all that stuff to watch the same TV shows, but charge that money for a vastly more complex piece of technology and people all go communist....

    My card arrived, so an interesting evening ahead; will post some photos and benchmarks. The packaging is in another league, if you're interested in that part of it. Won't open it yet, will share that with you guys.


  • Registered Users Posts: 3,711 ✭✭✭Praetorian


    EoinHef wrote: »
    Is anybody really anti RTX though?

    Most people seem to be anti rip off nvidia pricing. While some have just taken the marketing bait.

    I'm not anti rtx. Progress is good.

    The launch has been terrible though. Nvidia should have ensured the rasterisation gains people expected, i.e. the 2080 should have well beaten the 1080ti. This is 2 years later. Also, it was a rushed launch for some reason. They should have delayed it until some games were ready. Apparently, some game companies had 72 hours with the hardware... and some had that much time with only the SDK.

    The pricing is really horrific. I'd normally buy a Nvidia card every generation. Pascal lasted so long I actually had 3, 1070 to 1080 to 1080ti. I will certainly not be going rtx unless I see a huge discount at some stage. Probably I'll just wait for 7nm as I expect a huge bump in performance then. Also, it could be argued it's never a good idea to buy the first of anything.

    Imagine the rasterisation speeds we could be enjoying if Nvidia had dedicated all that silicon to traditional rendering :)

    The graphics card industry is not in a great place right now. AMD is not competing on the high end, letting Nvidia do what they want. There are rumours that AMD's next card will not be aiming for the high end either. God knows when they will be able to implement ray tracing.

    Hopefully, the future will be brighter. I'm hoping for huge benefits from the 7nm node. I'm hoping Intel will really compete with Raja in charge of graphics there. I hope AMD can continue their huge comeback with a monster gpu in the next couple of years.


    I expect ray tracing to cause huge perf hits.


  • Registered Users Posts: 655 ✭✭✭L


    I don't get it... say you are in the market for a 1080ti... are you really going to buy that expensive card, or push a little more euro, save another month, and get a 2080 which is a little faster today, potentially a lot faster tomorrow, and supports all the new technology?

    I'm exactly that case though, Fitz. Being able to use EVGA Step-Up to get a 2080/2080 TI was a big part of my logic for ordering a 1080 TI.

    The benchmarks really don't support spending another ~€250 or so. It's literally 40% again of what I paid for the 1080 TI. That's a pretty hard sell for "potentially". :P

    What I was expecting was a moderate bump in performance for the 2080 (at roughly price parity with my discounted 1080 TI), and a very significant bump for the 2080 TI (at somewhere in and around the full price point, so a net cost of maybe a hundred or two hundred to me).


  • Registered Users Posts: 13,983 ✭✭✭✭Cuddlesworth


    With a lot of youtubers needing to push 1 video a day, the narrative that RTX is not recommended seems to prevail. In this age of embargoes and deadlines, you have to push a story, and the nuance of the matter is lost.

    When the 680 was released, did you see a lot of people talking about buying the 580 new at RRP?

    When the 780 was released, did you see a lot of people talking about buying the 680 new at RRP?

    When the 980 was released, did you see a lot of people talking about buying the 780ti new at rrp?

    When the 1080 was released, did you see a lot of people talking about buying the 980ti new at rrp?

    So why is it that, with the release of the 2080 and 2080ti, the fact that the rest of Nvidia's product stack is still relevant in terms of pricing (above RRP) and performance is not something to report on?

    They have released a better product and have created new pricing tiers. This is what happens when you lose competition.


  • Registered Users Posts: 7,582 ✭✭✭Inviere


    I dont get it....say you are in the market for a 1080ti....are you really going to buy that expensive card or push a little more euro, save another month and get a 2080 which is a little faster today, potentially a lot faster tomorrow, and supports all the new technology?

    Well said, and that's all she wrote.

