
Nvidia RTX Discussion


Comments

  • Registered Users Posts: 18,703 ✭✭✭✭K.O.Kiki


    Venom wrote: »
    Nvidia launched multiple versions of the 1060 card with different amounts of RAM and RAM speeds, with some versions performing worse than the 1050. There's also the fact that uptake of RTX cards is nowhere close to the levels of the 10-series GPUs, so I don't think it's inconceivable that we might see GTX 1160, 1170 and 1180 cards, along with Ti versions, later in the year, especially if AMD have anything decent GPU-wise coming out this year.

    Unless you're talking about mining, there's not a single 1060 card outperformed by a 1050 card.


  • Registered Users Posts: 13,981 ✭✭✭✭Cuddlesworth


    tuxy wrote: »
    Yes, but what about the releases from 2018 and before that are listed as having RTX and DLSS features in the future? Will these be patched, or will it only be new games? I can see many of these games being obsolete by the time they get the features, if they ever do.
    Getting DLSS support for games on the list seems to be particularly challenging.
    How many months does it usually take Nvidia to do their side of things on the Saturn V cluster?

    For a huge code base like a completed AAA game, it can be a pretty big timesink to go back and retrofit features like DLSS and ray tracing without breaking the game.

    I would guess most of those listed games' companies put a person or two on the project, who went slowly insane and then quietly dropped it.

    Nobody is going to put serious money down on older games for features that only a tiny subset of the market can use and that tank performance.
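
    (A toy sketch of why that retrofit is invasive, in Python; every name here is hypothetical and it only mimics the shape of the problem: a raster-era engine keeps no world-space scene structure, so a ray-traced pass forces new per-asset work across the whole content library.)

```python
# Toy illustration: retrofitting ray tracing into a raster-built engine.
# All classes/functions are hypothetical, not any real engine's API.

class Mesh:
    def __init__(self, name, triangles):
        self.name = name
        self.triangles = triangles  # world-space triangles
        self.bvh = None             # the raster path never needed this

def rasterize(scene):
    # Legacy path: independent per-mesh draw calls, lighting resolved in
    # screen space; no global scene structure is ever required.
    for mesh in scene:
        pass  # submit draw call

def build_bvh(triangles):
    # Stand-in for acceleration-structure builds (DXR/Vulkan BLAS/TLAS in
    # practice); the cost scales with every asset the game ever shipped.
    return sorted(triangles, key=lambda tri: min(v[0] for v in tri))

def retrofit_ray_tracing(scene):
    # The retrofit touches *every* mesh, and (not shown) every material,
    # the streaming system and LOD selection - hence the timesink.
    for mesh in scene:
        mesh.bvh = build_bvh(mesh.triangles)

scene = [Mesh("crate", [((0, 0, 0), (1, 0, 0), (0, 1, 0))]),
         Mesh("wall",  [((2, 0, 0), (3, 0, 0), (2, 1, 0))])]
rasterize(scene)             # worked for years without a BVH
retrofit_ray_tracing(scene)  # new global requirement on old content
```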


  • Closed Accounts Posts: 14,983 ✭✭✭✭tuxy


    What about Hitman 2, which was released without DLSS?
    It just seems like many developers have decided to drop support.
    Isn't that something to worry about?
    I honestly can't see ray tracing becoming popular until consoles support it. That's several generations of GPUs away.


  • Registered Users Posts: 13,981 ✭✭✭✭Cuddlesworth


    tuxy wrote: »
    What about Hitman 2, which was released without DLSS?
    It just seems like many developers have decided to drop support.
    Isn't that something to worry about?
    I honestly can't see ray tracing becoming popular until consoles support it. That's several generations of GPUs away.

    AAA games take years to develop. Ray tracing in Windows and the RTX feature set have really only been around for a few months.


  • Registered Users Posts: 18,703 ✭✭✭✭K.O.Kiki


    tuxy wrote: »
    What about Hitman 2, which was released without DLSS?
    It just seems like many developers have decided to drop support.
    Isn't that something to worry about?
    I honestly can't see ray tracing becoming popular until consoles support it. That's several generations of GPUs away.

    Considering Xbone/PS4 are AMD, adoption of Nvidia's raytracing is unlikely.


  • Closed Accounts Posts: 14,983 ✭✭✭✭tuxy


    K.O.Kiki wrote: »
    Considering Xbone/PS4 are AMD, adoption of Nvidia's raytracing is unlikely.

    Yeah, I was just thinking that, since they will still be using AMD hardware with future consoles AFAIK.
    Perhaps ray tracing will be adopted in the generation of consoles after that, so 10+ years for it to become mainstream with AAA developers.


  • Registered Users Posts: 6,984 ✭✭✭Venom


    K.O.Kiki wrote: »
    Unless you're talking about mining, there's not a single 1060 card outperformed by a 1050 card.


    Linus did a video on some China-focused 1060 that had really dodgy VRAM, and it was slower.

    Cuddlesworth wrote: »
    For a huge code base like a completed AAA game, it can be a pretty big timesink to go back and retrofit features like DLSS and ray tracing without breaking the game.

    I would guess most of those listed games' companies put a person or two on the project, who went slowly insane and then quietly dropped it.

    Nobody is going to put serious money down on older games for features that only a tiny subset of the market can use and that tank performance.


    I think for some of the big MMOs such as WoW, or ongoing cash-shop-driven games like Destiny 2, there might be some RTX tech added, but for the most part I just can't see dev studios going back and dealing with all the hassle unless RTX-edition games become a thing.

    K.O.Kiki wrote: »
    Considering Xbone/PS4 are AMD, adoption of Nvidia's raytracing is unlikely.


    Agreed. While Nvidia does hold a huge percentage of the PC market when it comes to GPUs, I just don't see developers putting in the extra effort just for the PC version.

    tuxy wrote: »
    Yeah, I was just thinking that, since they will still be using AMD hardware with future consoles AFAIK.
    Perhaps ray tracing will be adopted in the generation of consoles after that, so 10+ years for it to become mainstream with AAA developers.



    Even if AMD has amazing GPU tech dropping this year with their own version of RTX, which they have admitted they're working on, it's probably way too late in the PS5 and next-Xbox development cycle for it to be included at this stage. Saying that, both Sony and MS had hardware refreshes of their consoles this generation, so maybe in around 3-5 years we could see ray-tracing tech show up on consoles?


  • Registered Users Posts: 18,703 ✭✭✭✭K.O.Kiki


    Venom wrote: »
    Linus did a video on some China-focused 1060 that had really dodgy VRAM, and it was slower.

    The GTX 1060 5GB is still faster than the GTX 1050.



  • Registered Users Posts: 18,703 ✭✭✭✭K.O.Kiki


    New AdoredTV video.



  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    The 2060 is a bad joke.

    Only 6GB of memory when devs are using 8-12GB as standard, and RTX features that will never work because they are so cut down.

    Why not cut the bloody RTX features from these cards and give us more memory?

    The performance is quite good, but 6GB of memory kills it for me at that price.


  • Closed Accounts Posts: 14,983 ✭✭✭✭tuxy


    How much VRAM do high-end games need at 1080p?
    And how much at 1440p?

    Just want an idea of how much is needed at these resolutions, as they're probably what this card is aimed at.
    My limited research shows demanding titles using anywhere up to 5GB of VRAM with AA enabled at 1440p!
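
    (As a back-of-envelope check, here's the render-target side of that question in Python; the 8 bytes/pixel figure assumes one RGBA8 colour target plus a 32-bit depth buffer, which is a simplification of what real engines allocate.)

```python
# Rough render-target footprint per resolution: one RGBA8 colour target
# plus a 32-bit depth target = 8 bytes per pixel (illustrative only).
BYTES_PER_PIXEL = 4 + 4

def target_mb(width, height, msaa=1):
    return width * height * BYTES_PER_PIXEL * msaa / 2**20

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: {target_mb(w, h):6.1f} MB plain, "
          f"{target_mb(w, h, msaa=4):6.1f} MB with 4x MSAA")

# 1080p:   15.8 MB plain,   63.3 MB with 4x MSAA
# 1440p:   28.1 MB plain,  112.5 MB with 4x MSAA
# 4K:      63.3 MB plain,  253.1 MB with 4x MSAA
```

    On numbers like these the targets themselves are tens to hundreds of megabytes, so multi-gigabyte usage has to be coming mostly from assets.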


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Lots of new games are using well over 5GB. The likes of BF5 and Far Cry 5 are only using around 5-5.5GB, sure, but PUBG can use around 7GB, Blackout uses 10.5GB or more on a 1080 Ti, and Rainbow Six Siege with the ultra texture pack uses around 8GB.

    Have a look at memory usage in some new AAA games on YouTube, where people benchmark high-end cards.

    Not sure how much AA settings are impacting this though. Need a site to do a proper test on this.

    A lot of current games were designed around the PS4 and Xbox One limitation of around 5.5GB of VRAM. The PS4 Pro and Xbox One X have around 8GB available. The next gen will have even more. There's only one way VRAM usage is going, and that's up.


    Some references.





  • Closed Accounts Posts: 14,983 ✭✭✭✭tuxy


    Any reference for 1440p?
    I don't think anyone is trying to claim that the RTX 2060 is a 4K card.

    I've searched YouTube and have yet to find a game that exceeds 6GB at 1440p.


  • Registered Users Posts: 13,981 ✭✭✭✭Cuddlesworth


    Nvidia's memory compression is pretty good though; I think 6GB might be comparable to 8GB on an AMD card.
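
    (For the curious, the rough idea behind that kind of compression, as a hypothetical sketch; note that on real GPUs delta colour compression is lossless and chiefly saves bandwidth on smooth regions, so treat any capacity comparison as speculative.)

```python
# Conceptual delta colour compression of a 4x4 tile: store one anchor
# value plus small deltas. Smooth gradients compress well; noise doesn't.
def compress_tile(tile):
    anchor = tile[0]
    deltas = [v - anchor for v in tile]
    # If every delta fits in a signed byte, the tile stores compactly.
    if all(-128 <= d <= 127 for d in deltas):
        return ("compressed", anchor, bytes(d & 0xFF for d in deltas))
    return ("raw", tile)

smooth = [100 + i for i in range(16)]       # gradient: compresses
noisy  = [0, 500, 3, 999, 7, 650, 2, 801,
          12, 700, 1, 950, 5, 600, 9, 777]  # noise: stays raw
print(compress_tile(smooth)[0], compress_tile(noisy)[0])
# prints: compressed raw
```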


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    tuxy wrote: »
    Any reference for 1440p?
    I don't think anyone is trying to claim that the RTX 2060 is a 4K card.

    From what I've seen, resolution has a negligible effect on VRAM usage. The majority of it is texture information, as always.

    Maybe high levels of old-school MSAA as well, but most new games don't even support MSAA anymore.
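
    (A quick back-of-envelope on why textures dominate, assuming a block-compressed format at roughly 1 byte per texel and about a third extra for the mip chain; the counts are made up but the orders of magnitude are the point.)

```python
# Rough texture footprint: block compression ~1 byte/texel, full mip
# chain adds about a third on top (illustrative numbers only).
def texture_mb(size, bytes_per_texel=1.0):
    return size * size * bytes_per_texel * (4 / 3) / 2**20

per_texture = texture_mb(4096)
print(f"one 4096x4096 texture: {per_texture:.1f} MB")
print(f"200 such textures: {200 * per_texture / 1024:.1f} GB")
# one 4096x4096 texture: 21.3 MB
# 200 such textures: 4.2 GB  (dwarfs any render-target difference)
```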


  • Closed Accounts Posts: 14,983 ✭✭✭✭tuxy


    BloodBath wrote: »
    From what I've seen, resolution has a negligible effect on VRAM usage. The majority of it is texture information, as always.

    Maybe high levels of old-school MSAA as well, but most new games don't even support MSAA anymore.

    But if I look up Rainbow Six Siege at the lower resolution of 1440p, it has significantly lower VRAM usage.

    https://youtu.be/IpPiO4ApavI?t=255

    Blackout doesn't really count, as the VRAM leak is a well-known problem in that game: if the card has 6GB it will use all of it at any resolution, and the same goes for cards with 8 or 11GB; it uses everything available because of the VRAM leaks.


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    tuxy wrote: »
    Any reference for 1440p?
    I don't think anyone is trying to claim that the RTX 2060 is a 4K card.

    I've searched YouTube and have yet to find a game that exceeds 6GB at 1440p.

    1080p and 1440p use roughly the same amount of memory, maxing out a 2080. The 1080 Ti would use even more.

    https://www.youtube.com/watch?v=IhA04-UP3b0
    tuxy wrote: »
    But if I look up Rainbow Six Siege at the lower resolution of 1440p, it has significantly lower VRAM usage.


    https://youtu.be/IpPiO4ApavI?t=255
    Blackout doesn't really count, as the VRAM leak is a well-known problem in that game: if the card has 6GB it will use all of it at any resolution, and the same goes for cards with 8 or 11GB; it uses everything available because of the VRAM leaks.

    Siege needs the high-resolution texture pack to push it up that high. It's an optional download, but it's still a 3-year-old game that uses 8GB.

    There is no memory leak that I know of in Blackout. If that were the case, it would start out low and build up over time, eventually even causing performance problems, and I have never experienced that in Blackout. It's just using all the memory available instead of swapping data in and out all the time, which would cause dips.
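
    (That "use whatever is free" behaviour can be pictured as a residency cache that only evicts under pressure; a minimal sketch with hypothetical numbers, showing why the same game reports higher VRAM use on a bigger card.)

```python
from collections import OrderedDict

# Minimal LRU residency cache: fill free VRAM opportunistically and evict
# the least-recently-used assets only when the budget is exceeded.
class VramCache:
    def __init__(self, budget_mb):
        self.budget, self.used, self.items = budget_mb, 0, OrderedDict()

    def touch(self, asset, size_mb):
        if asset in self.items:
            self.items.move_to_end(asset)       # recently used again
            return
        while self.used + size_mb > self.budget and self.items:
            _, old_size = self.items.popitem(last=False)
            self.used -= old_size               # evict only under pressure
        self.items[asset] = size_mb
        self.used += size_mb

for budget_mb in (6_144, 11_264):               # a 6GB card vs an 11GB card
    cache = VramCache(budget_mb)
    for i in range(300):
        cache.touch(f"texture_{i % 150}", 50)   # 150 hot textures, 50MB each
    print(f"{budget_mb / 1024:.0f}GB card: {cache.used} MB resident")
# 6GB card: 6100 MB resident   (sits just under whatever the card has)
# 11GB card: 7500 MB resident  (same workload, more simply stays resident)
```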


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    The majority are using around 5GB, though, it seems. This makes sense when the PS4 and Xbox One's VRAM is around 5GB.

    As usual, consoles are dictating the norm, with very few PC titles getting an improvement on this.

    The Pro and X versions have 8GB of VRAM available though. Next gen might have even more, so I'd expect the norm to be 8GB soon enough.

    6GB might be enough for the here and now in most cases, but if I'm spending €350-400 on a graphics card I'd expect at least 8GB.


  • Closed Accounts Posts: 14,983 ✭✭✭✭tuxy


    The Blackout VRAM memory leak is fairly well documented by players. I don't have the game, but I know two people who have had issues with it on both a 1060 and a 1070. Maybe it's not enough of an issue with whatever card you are using to cause problems. If you google it you can see how widespread the problem is.

    With R6S I've looked up loads of videos with the ultra HD texture pack at 1440p, and it seems to sit at 4.5GB VRAM usage but does go much higher at 4K.

    By 4K I actually mean 3,840x2,160. Perhaps that's the confusion. Yes, the 2060 is totally unsuitable for gaming at that resolution.


  • Closed Accounts Posts: 14,983 ✭✭✭✭tuxy


    BloodBath wrote: »
    6GB might be enough for the here and now in most cases, but if I'm spending €350-400 on a graphics card I'd expect at least 8GB.

    I'll agree it's not very future-proof. But neither are the tensor cores, which I expect to be obsolete on the 2000-series cards before long.

    Since the Xbox One X has 12GB of GDDR5 (I know some is used as system memory), I'd expect that to be the minimum for newer consoles. Meaning that even 8GB may not be enough, but there isn't even a hint at release dates for such things yet.


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Fair enough, I don't know anything about that memory leak. Maybe it's just people using settings higher than their cards are capable of?

    PUBG has a similar problem if you push the texture settings up too high on low-memory cards. Most new games will dynamically adjust texture quality depending on the VRAM available, regardless of settings. Some don't.
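
    (A toy version of that adjustment, with hypothetical numbers: drop mip levels across the board until the working set fits the card. Real engines stream per-texture based on what's visible, so this is only the crudest form of the idea.)

```python
# Toy mip-bias heuristic: halve texture resolution (drop one mip level)
# across the board until the working set fits the available VRAM.
def fit_to_budget(texture_sizes_mb, budget_mb):
    bias, total = 0, sum(texture_sizes_mb)
    while total > budget_mb:
        bias += 1                        # one mip down = 1/4 the texels
        total = sum(s / 4**bias for s in texture_sizes_mb)
    return bias, total

textures = [50.0] * 160                  # ~8GB of top-mip textures
for budget_mb in (8_192, 6_144, 3_072):
    bias, total = fit_to_budget(textures, budget_mb)
    print(f"{budget_mb / 1024:.0f}GB card: mip bias {bias}, ~{total:.0f} MB")
# 8GB card: mip bias 0, ~8000 MB   (full-quality textures fit)
# 6GB card: mip bias 1, ~2000 MB   (everything shown one mip lower)
# 3GB card: mip bias 1, ~2000 MB
```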

    I think with Siege they reduced the memory footprint at some point with a big overhaul of map textures. The game actually got worse-looking.

    I can play Blackout with a 970 at 2K with around 80-100fps on lower settings at around 80% render resolution. I need a new card, but there's no obvious choice at the moment.

    I'd be all over the 2060 if it had 8GB of VRAM.

    -edit- Read up a bit on the memory leak in Blackout; it's a system RAM leak, not a VRAM leak, so the VRAM usage in that game is valid.


  • Registered Users Posts: 2,994 ✭✭✭Taylor365


    Wonder where prices will be in 6 months?

    Looking to start an SFF build, so I'm interested in blower-style 2060 or 2070 cards.


  • Registered Users Posts: 18,703 ✭✭✭✭K.O.Kiki


    Quake 2 Raytracing

    close to 60 FPS (2560x1440, RTX 2080 Ti)





  • Registered Users Posts: 655 ✭✭✭L


    Hrrm. Apparently we're getting GTX Turing. One of you lads called it, but I can't remember who. ;)


  • Registered Users Posts: 13,981 ✭✭✭✭Cuddlesworth


    Is that a surprise? They said a while back they were releasing more GTX-branded cards.


  • Closed Accounts Posts: 29,930 ✭✭✭✭TerrorFirmer


    All depends on pricing. If we get a 1660 at something like 199, great.

    If it's another gouging exercise and it ends up at 249-279 while they phase out the older 1060, they can flip off.


  • Registered Users Posts: 27,856 ✭✭✭✭TitianGerm


    TerrorFirmer wrote: »
    All depends on pricing. If we get a 1660 at something like 199, great.

    If it's another gouging exercise and it ends up at 249-279 while they phase out the older 1060, they can flip off.

    Is it not meant to be faster than the 1060? Would £249 not be a reasonable price?


  • Registered Users Posts: 18,703 ✭✭✭✭K.O.Kiki


    [H]ardOCP source:
    Nvidia GeForce GTX 1660 Ti will launch on February 15 with an MSRP of $279. The GTX 1660 will launch in early March at $229, while the GTX 1650 will go on sale for $179 in late March. Additionally, Nvidia will continue to supply the 1050 Ti to retailers, and its price will drop to keep it competitive.


  • Closed Accounts Posts: 29,930 ✭✭✭✭TerrorFirmer


    TitianGerm wrote: »
    Is it not meant to be faster than the 1060? Would £249 not be a reasonable price?

    Generational leaps are supposed to bring better performance at similar price points.

    Whereas now, Nvidia is gouging terribly, introducing new cards at the same or higher RRP than the generation they're supposed to be succeeding.

    Offering better performance at the same inflated price point isn't value; it's stagnation due to monopoly.

    This GTX 1660 should be £199, not the same price as the GTX 1060 when it came out two years ago.
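
    (To put numbers on that, with purely hypothetical performance indices; the point is how much the launch price changes the value proposition.)

```python
# Hypothetical perf-per-price arithmetic: a generational leap should raise
# performance per pound/euro, not just raw performance at a raised price.
def perf_per_100(perf_index, price):
    return perf_index / price * 100

print(f"1060 at 249: {perf_per_100(100, 249):.1f} perf per 100 spent")
print(f"1660 at 249: {perf_per_100(125, 249):.1f} perf per 100 spent")
print(f"1660 at 199: {perf_per_100(125, 199):.1f} perf per 100 spent")
# 1060 at 249: 40.2
# 1660 at 249: 50.2
# 1660 at 199: 62.8  <- the gap being argued about
```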

