
Nvidia RTX Discussion


Comments

  • Registered Users Posts: 5,556 ✭✭✭Slutmonkey57b


    You can get a variant of the 3060 which has 12GB, but they may be a bit thin on the ground now.

    If you are doing 4K video editing then Nvidia have definitely screwed you over this generation. Equally, AMD have not released any mid-range cards at all, and it looks increasingly unlikely that they will.

    So a 12 or 16 gig card from the previous generation is your best bet, I think. Whatever you get will be a big upgrade over a 9-year-old card.



  • Registered Users Posts: 4,507 ✭✭✭TheChrisD


    I was in a similar boat earlier this year: I went from a 7-year-old 980Ti to a 4070Ti. Then again, I'm primarily a gamer, so gaming and streaming performance was a key factor for me; video editing might not necessarily need such a powerful SKU, but it can definitely make use of the VRAM and the encoding hardware.

    Theoretically you could get a 3060 12GB (about €350); though how long that would last until needing another upgrade is a question that we don't really know the answer to right now. Above that it's definitely 4060Ti 16GB (€550?) or 4070 (€650) territory.



  • Registered Users Posts: 18,703 ✭✭✭✭K.O.Kiki


    RTX 3060 12GB can be found for a smidge over/under 300EUR easily.

    RTX 4070 12GB are around 620EUR now and would be nearly twice as fast as the 3060.



  • Registered Users Posts: 386 ✭✭Coyler


    Don't forget the 4060 is only running at x8. I'd wager C14N's machine is PCI-E 3.0 only, and bandwidth would be quite important for editing purposes. That would stick in my craw if I paid €500.

    As Kiki says, you can get 3060 12GB under €300 all in https://amzn.eu/d/31MGhBX (delivery is around €10).

    CEX still asking €300 for the same type of card is nuts.



  • Registered Users Posts: 2,656 ✭✭✭C14N


    I didn't know that. Mobo is about 5 years old too but it was a pretty high-spec one at the time. It was this one: https://www.gigabyte.com/Motherboard/Z370-AORUS-Gaming-7-rev-10#kf

    Which does say it conforms to PCI-E 3.0, not 4.0. So basically that means that throughput would be capped at 8GB/s with the 4060. Is that a major bottleneck? And did the 3060 not share this same limitation?

    I think the 3060 12GB editions on Prime sale for under €300 may be more what I would lean towards at the moment. Tbh, the CPU is also pushing 5 years old at this point (8th gen i7), so the extra €200 or so saved on the GPU will go toward being able to get an updated CPU/mainboard sooner.



  • Registered Users Posts: 2,656 ✭✭✭C14N


    So basically, would you say there's not going to be much benefit to getting one of the 8GB 4000/3000 series cards over the 12GB 3060 to justify the price difference? It would have to be the 16GB 4060 Ti or 12GB 4070 or just save the money and stick with the older 12GB model for now?



  • Registered Users Posts: 5,825 ✭✭✭Cordell


    Is that a major bottleneck? And did the 3060 not share this same limitation?

    IIRC artificial tests done by disabling 4 lanes on a PCIe 4.0 system have shown a 5-10% degradation. Not major but not insignificant either, and depending on your use case it may be more or less. The 3060 doesn't have this limitation, since it uses the full x16 interface.
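
    For context, here's a rough back-of-envelope sketch of the nominal link bandwidth involved (assuming ~0.985 GB/s per lane for PCIe 3.0 after encoding overhead, and double that for 4.0); these are raw link rates, not measured editing throughput:

    ```python
    # Rough PCIe link-bandwidth sketch (nominal rates, not real-world numbers).
    # Per-lane throughput after 128b/130b encoding: Gen3 ~0.985 GB/s, Gen4 ~1.969 GB/s.
    PER_LANE_GBPS = {"3.0": 0.985, "4.0": 1.969}

    def link_bandwidth(gen: str, lanes: int) -> float:
        """Nominal one-direction bandwidth in GB/s for a PCIe link."""
        return PER_LANE_GBPS[gen] * lanes

    configs = [
        ("RTX 3060 (x16) on PCIe 3.0", "3.0", 16),
        ("RTX 4060 (x8)  on PCIe 3.0", "3.0", 8),
        ("RTX 4060 (x8)  on PCIe 4.0", "4.0", 8),
    ]

    for name, gen, lanes in configs:
        print(f"{name}: ~{link_bandwidth(gen, lanes):.1f} GB/s")
    # ~15.8, ~7.9 and ~15.8 GB/s respectively; the x8 card only loses
    # headroom when it's dropped onto an older PCIe 3.0 board.
    ```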



  • Registered Users Posts: 4,507 ✭✭✭TheChrisD


    For productivity-focused workloads, there is little benefit to the 40-series unless you go 4080/4090 for the likes of Blender rendering. Or you may even be better served by AMD if you don't mind your power bill having a fair spike.



  • Registered Users Posts: 386 ✭✭Coyler


    Good information here. Even with PCI-E 4.0 support this reviewer goes for the 3060 12GB.



  • Registered Users Posts: 2,656 ✭✭✭C14N


    Thanks, I think that seals it for me. I decided to go and just order the 3060 12GB version. I'll keep the extra money and put it toward the CPU in the next few months.



  • Registered Users Posts: 5,825 ✭✭✭Cordell


    https://www.nvidia.com/en-us/geforce/news/nvidia-dlss-3-5-ray-reconstruction/

    Coming to ALL RTX GPUs - of course without frame generation on 2xxx and 3xxx series.



  • Registered Users Posts: 18,703 ✭✭✭✭K.O.Kiki


    Seems Gigabyte launched a low-profile RTX 4060.

    Only seen it for sale in USA so far (for $330).



  • Registered Users Posts: 2,255 ✭✭✭Shlippery


    Eyeing up a 4070 which seems like a semi-reasonable price coming from my original Windforce 2080.

    Any recommendations on where to purchase? Is Komplett (now Paradigit) worth it?

    https://www.paradigit.ie/components/graphics-cards/#price/brand/stock/feature/29D1E564/sorting/Popularity/page/1



  • Registered Users Posts: 5,825 ✭✭✭Cordell


    Amazon.de is usually cheaper.



  • Registered Users Posts: 5,556 ✭✭✭Slutmonkey57b


    Also no reason to buy a 4070 when the 7800 is cheaper and faster.



  • Registered Users Posts: 5,825 ✭✭✭Cordell


    There are reasons: better RT performance and DLSS.



  • Registered Users Posts: 5,556 ✭✭✭Slutmonkey57b


    The scenarios where RT performance is meaningfully different are usually ones where neither card is giving you 50 fps in the first place, so you probably wouldn't be using it anyway. I'll take a decent memory buffer over "sometimes it looks OK on the very limited number of games it's trained on and you don't look at the smearing effects" DLSS any day of the week.

    Better textures and models always result in better visuals than anything else.



  • Registered Users Posts: 5,825 ✭✭✭Cordell


    Better textures and models always result in better visuals than anything else.

    This is simply not true anymore. Better lighting (using raytracing) and DLSS can improve the visuals much more than just cranking up the texture resolution and polygon count. We're at the point where we should also compare features, not just raw power. DLSS doesn't need to be trained on specific games anymore. The majority of AAA games released in the past 1-2 years do support DLSS, with the notable exception of some AMD-sponsored games. DLSS 3.5 is still in development, but the initial experimental release in Cyberpunk 2077 finally allows path-traced graphics with reasonable framerates on mid-range GPUs.

    But we had this conversation before, didn't we?



  • Registered Users Posts: 18,703 ✭✭✭✭K.O.Kiki


    7800 XT is only about 5fps faster on average at native resolution.

    Factor in DLSS though and I can make the argument for the 4070, especially since prices are about equal now (4070s are getting below €600).



  • Registered Users Posts: 5,556 ✭✭✭Slutmonkey57b


    It's anywhere between 5-8% on average, and up to 25%-50% faster in some games. I'll take that over the smeary mess that is DLSS or using RT Ultra at 20fps and calling that a "win". More memory makes it more futureproof. As far as I'm concerned technologies like DLSS aren't a feature, they're a "feature" that exists to try and cover up otherwise sub-par performance, not to provide actual benefit to gamers.

    I just can't get over the shift from "No point getting 200fps if the image is tearing and blurring, better to have Freesync/Gsync and a lower framerate" to "Look at this 200fps framerate, sure it's not genuine and the image is tearing and blurring but hey". Just shows how easily marketing can shift the picture.





  • Registered Users Posts: 5,556 ✭✭✭Slutmonkey57b



    Yes and you're still repeating company talking points like DLSS "doesn't need game specific training" despite being told that out of 4 technologies under that umbrella, only one doesn't require game-specific training; or that "AMD Sponsor = no DLSS", despite Moore's Law documenting that it was an Nvidia astroturfing misinformation campaign.


    So I wouldn't take anything you recommend to users here as honest, face value advice.



  • Registered Users Posts: 18,703 ✭✭✭✭K.O.Kiki


    • TechPowerUp: 3-5fps diff. average across 25 titles
    • Kitguru: 4-6fps diff. averaged across 12 titles
    • TechSpot (Hardware Unboxed): 5-7fps diff. averaged across 15 titles

    But both cards are averaging ~144fps at 1080p, 120fps at 1440p and over 60fps at 4K anyway.

    The 4070 can run Cyberpunk 2077 with RT Overdrive at 1080p60 / 1440p40 with just Quality Upscaling, which is not noticeable. And then it can DOUBLE the frame rate with Frame-Generation.
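
    For a sense of how much work Quality upscaling actually saves, here's a quick sketch of the internal render resolutions DLSS works from, assuming the commonly quoted per-axis scale factors (Quality ≈ 1/1.5, Balanced ≈ 1/1.72, Performance ≈ 1/2); the exact ratios can vary by game and DLSS version:

    ```python
    # Sketch of DLSS internal render resolutions per quality mode
    # (assumes the commonly quoted per-axis scale factors; game-dependent in practice).
    SCALE = {"Quality": 1 / 1.5, "Balanced": 1 / 1.724, "Performance": 1 / 2.0}

    def render_res(width: int, height: int, mode: str) -> tuple[int, int]:
        s = SCALE[mode]
        return round(width * s), round(height * s)

    for out_w, out_h in [(2560, 1440), (1920, 1080)]:
        for mode in SCALE:
            w, h = render_res(out_w, out_h, mode)
            share = 100 * (w * h) / (out_w * out_h)
            print(f"{out_h}p {mode}: renders at {w}x{h} (~{share:.0f}% of the output pixels)")
    ```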



  • Registered Users Posts: 7,409 ✭✭✭Icyseanfitz


    The 7800 XT is a disappointment imo; it's essentially the same performance as the 6800 XT. Again, it feels like AMD had an open goal (6950 XT performance at less than €500) and bottled it.

    As for the 4070, it's overpriced and a glorified xx60 card, and the difference in ray tracing is negligible, but for me DLSS 3 frame gen is definitely a selling point.



  • Registered Users Posts: 31,014 ✭✭✭✭Lumen


    The 4070 does only have 12GB RAM (compared to 16GB in the 7800 XT), which is fine for now but might require lower settings (medium at 1080p) in a couple of years, according to game developer rumours. The AMD card also has DisplayPort 2.1, which again is going to be more relevant in the next couple of years for certain display hardware.

    I'll still pick the 4070 though. Aside from the RT and DLSS capability there's better power efficiency (like 200W vs 260W) which makes it nicer to run.
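
    To put a rough (and very assumption-laden) number on that efficiency gap, assuming a 60W difference, a few hours of gaming a day and something like €0.35/kWh:

    ```python
    # Very rough running-cost sketch; the hours/day and price-per-kWh figures are
    # assumptions for illustration, not anything from the thread.
    watt_diff = 260 - 200          # approximate board power gap in watts
    hours_per_day = 3              # assumed gaming time
    price_per_kwh = 0.35           # assumed EUR per kWh

    kwh_per_year = watt_diff / 1000 * hours_per_day * 365
    print(f"~{kwh_per_year:.0f} kWh/year, ~EUR {kwh_per_year * price_per_kwh:.0f}/year")
    # Roughly 66 kWh and about EUR 23 a year; the bigger practical win is less heat and fan noise.
    ```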

    The 4070Ti with the same 12GB RAM is the real shocker at, what, 900 euros?

    I wish Nvidia would pull their finger out and support DP 2.1 but I guess that'll have to wait for the next gen.



  • Registered Users Posts: 5,825 ✭✭✭Cordell


    Both AMD and Nvidia current gen are somewhat disappointing, and that means they're pushing the technology to its limits. By limits I mean what is technically possible while keeping it economically feasible. So they need to focus more on new technologies and features rather than pure raw power, and that's what Nvidia is doing with raytracing, DLSS, Reflex and the other tech they've introduced.



  • Registered Users Posts: 31,014 ✭✭✭✭Lumen


    We're currently in a painful transitional period towards full adoption of ray tracing features, where game developers have to implement both new and old techniques.

    I'm hopeful that in the next few years that transition will be complete and, along with game engine tech like Nanite and procedural generation, and benefitting from vast stocks of cinema-quality assets, game developers will be freer to focus on gameplay, animation and NPC interaction.

    We're still deep in the uncanny valley as far as the rendering and behaviour of humans in games is concerned, and a lot of the craziness in the gaming industry comes from migrations of players from one online game to another: the devs build huge maps which require certain volumes of human players, and then when the next new game comes along the older games depopulate to the extent of becoming unplayable.

    Over the next few years I expect AI to totally transform gaming; using it to fill in pixels and frames is nice but not transformative IMO.



  • Registered Users Posts: 5,825 ✭✭✭Cordell


    We're currently in a painful transitional period towards full adoption of ray tracing features

    I don't think we are there yet. Full ray tracing is currently only possible on mid-range and higher Nvidia cards, and it's only been implemented as a technology preview in a single current AAA game. So we haven't even reached that painful transitional period yet, unfortunately.



  • Registered Users Posts: 31,014 ✭✭✭✭Lumen


    You mean path tracing? I don't think that's necessary to obviate the need for baked lighting, but I'm not a game dev.

    From what I can tell the recent RT features just make it more efficient/effective.

    tbh I absolutely hate the shininess of Cyberpunk. We don't live in a world of shiny puddles everywhere. I just think it's nice to have games that have realistic lighting from moving sources, particularly indoor scenes (I realise CP does that too).



  • Registered Users Posts: 5,825 ✭✭✭Cordell


    Yes, path tracing - that is the key for realistic rendering and that's how they do it for movies. There are a lot of shortcuts, cut corners and smart tricks to get something near that in real time rendering on consumer GPUs but the goal is to get as close as possible to full raytracing.
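
    For anyone curious what "full ray tracing" boils down to: the core of a path tracer is just Monte Carlo integration; fire random rays, average what they bring back, and the noise shrinks as you add samples. A tiny toy sketch (nothing like how a real renderer is structured) for a single diffuse surface under a uniform sky, where the exact answer is known:

    ```python
    import math, random

    # Toy "furnace test": a Lambertian (diffuse) surface with albedo RHO, lit by an
    # environment of constant radiance L_ENV from every direction above it.
    # The analytic reflected radiance is RHO * L_ENV; a path tracer reaches it by
    # averaging over randomly sampled incoming directions.
    RHO, L_ENV = 0.7, 1.0

    def sample_uniform_hemisphere():
        """Pick a random direction above the surface; return cos(theta) and its pdf."""
        cos_theta = random.random()      # cos(theta) is uniform in [0, 1) for uniform sampling
        pdf = 1.0 / (2.0 * math.pi)      # constant pdf over the hemisphere's solid angle
        return cos_theta, pdf

    def estimate_radiance(samples: int) -> float:
        total = 0.0
        for _ in range(samples):
            cos_theta, pdf = sample_uniform_hemisphere()
            brdf = RHO / math.pi                          # Lambertian BRDF
            total += brdf * L_ENV * cos_theta / pdf       # one-sample Monte Carlo estimate
        return total / samples

    for n in (16, 256, 4096, 65536):
        print(f"{n:>6} rays -> {estimate_radiance(n):.4f} (analytic: {RHO * L_ENV:.4f})")
    ```

    Real path tracers do the same thing recursively over multiple bounces, which is why the sample counts (and the need for denoisers and upscalers) explode so quickly.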


