
Nvidia RTX Discussion


Comments

  • Registered Users Posts: 5,572 ✭✭✭EoinHef


    Well I wouldn't say a new tier, there was always the Titan.

    So if it was a new tier we'd get a Titan this gen too. Maybe we will.

    Also this gen may align with 20-series pricing, but that was the first series to really take the piss imo.


  • Registered Users Posts: 5,857 ✭✭✭Cordell


    Well, maybe the 3090 is the new Titan, or maybe there will be a 3000-series Titan at a whole new price level. In any case it's not right to compare it with the 2080 on either price or performance.
    As for taking the piss, they do it because they can; a lack of competition tends to do that.


  • Registered Users Posts: 5,572 ✭✭✭EoinHef


    Is anybody saying the 3090 should be priced the same as a 2080 though? I haven't really seen anyone say that.

    The rumour I saw was the 3090 at possibly $2000, so given past gens, if that card doesn't take the Titan spot and there's still a Titan above it, that's a massive increase in price for the top-end Nvidia consumer GPUs, just like the 20 series before it. The newer rumours of $1399 do seem a better fit though; it's an increase, but a much less noticeable one.

    If there is a 3080 Ti I'd expect that to be priced similarly to a 2080 Ti alright. As in, it's priced based on where it sits in the stack, rather than "the 3080 Ti is 25% faster than a 2080 Ti so it should cost more".

    Also, creating new tiers helps no one but Nvidia; segmenting their range more just makes it easier to charge higher prices. Everyone thinks of course an XX90 should cost more than an XX80 - 90 is bigger than 80!!

    Maybe the 3090 should have been the 3080 Ti and some marketing big wig said call it a 3090 and charge $200 more. It is Nvidia at the end of the day :pac:


  • Registered Users Posts: 3,724 ✭✭✭Metric Tensor


    They are always going to piss around with naming conventions and "numbers" in order to maximise profit. I don't think people should be too concerned with comparing a 1080 with a 2080 or 3080 - that's exactly what Nvidia wants you to do, so you can be sure the naming is designed to maximise profit.

    I think it's better instead to compare price points. I paid circa €400 for my 1070 in 2016. The question I should ask is how much more performance I can get for €400* now - it doesn't matter if it's called the X-Pro-Max-Extreme-Eleventy-One or the Sh!tballer5000 - all that matters is the performance increase at the same price level. Then I can also decide whether it's worth the investment or not ... or whether I'd like to step up to the next price point or drop a level.

    *Probably should say €410 to allow for inflation, etc.
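
    To make that footnote concrete, a minimal sketch of the inflation adjustment - the cumulative figure is an assumption for illustration, not official CPI data:

    ```python
    # Adjust the 2016 GPU budget for inflation.
    # ~2.5% cumulative 2016 -> 2020 is an assumed figure for illustration.
    paid_2016 = 400.0             # euro, GTX 1070
    cumulative_inflation = 0.025  # assumption

    budget_2020 = paid_2016 * (1 + cumulative_inflation)
    print(f"Equivalent 2020 budget: ~€{budget_2020:.0f}")  # ~€410
    ```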


  • Registered Users Posts: 4,027 ✭✭✭H3llR4iser


    Metric Tensor wrote: »
    They are always going to piss around with naming conventions and "numbers" in order to maximise profit. I don't think people should be too concerned with comparing a 1080 with a 2080 or 3080 - that's exactly what Nvidia wants you to do, so you can be sure the naming is designed to maximise profit.

    I think it's better instead to compare price points. I paid circa €400 for my 1070 in 2016. The question I should ask is how much more performance I can get for €400* now - it doesn't matter if it's called the X-Pro-Max-Extreme-Eleventy-One or the Sh!tballer5000 - all that matters is the performance increase at the same price level. Then I can also decide whether it's worth the investment or not ... or whether I'd like to step up to the next price point or drop a level.

    *Probably should say €410 to allow for inflation, etc.

    The issue I see is that while the absolute performance increase is usually there (a GTX 1660 Ti is as fast as a GTX 1070), the same amount of money tends to "drop back" one level in the relative scale with each generation of cards. The €400 you mentioned, which would have afforded a GTX 1070, is now enough "only" for a 2060 Super - which is indeed a level below where the 1070 positioned itself amongst its peers. To maintain the "position", I went for a 2070 Super, which set me back around €600 - which would have gotten you a GTX 1080 in the previous generation. Prices have since come down a bit, but still.


  • Registered Users Posts: 18,703 ✭✭✭✭K.O.Kiki


    I don't trust any pricing rumours.
    According to GN (Gamers Nexus), Nvidia CEO Jensen Huang finalizes the price on the day of the reveal.


  • Registered Users Posts: 3,724 ✭✭✭Metric Tensor


    H3llR4iser wrote: »
    The issue I see is that while the absolute performance increase is usually there (a GTX 1660 Ti is as fast as a GTX 1070), the same amount of money tends to "drop back" one level in the relative scale with each generation of cards. The €400 you mentioned, which would have afforded a GTX 1070, is now enough "only" for a 2060 Super - which is indeed a level below where the 1070 positioned itself amongst its peers. To maintain the "position", I went for a 2070 Super, which set me back around €600 - which would have gotten you a GTX 1080 in the previous generation. Prices have since come down a bit, but still.

    I think that's the game they play. The purchaser feels they have to stay at the "070" level because buying an "060" feels like a step back. So you're effectively upsold by 200-odd quid.

    I'd love to see some sort of standardised cross-generation performance-per-euro test, regardless of card name and position in the manufacturer's tiered offering. However, I have no idea how you'd standardise across generations, because everything the card works on is also advancing at various rates.
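
    One rough way such a test could work, sketched below: run a fixed benchmark suite, index every card's average fps to a single baseline card, then divide by street price. All fps and price figures here are hypothetical placeholders, not benchmark results.

    ```python
    # Sketch of a standardised cross-generation performance-per-euro metric.
    # fps and prices are hypothetical placeholders for a fixed benchmark suite.
    cards = {
        # name:        (avg_fps, price_eur)
        "GTX 1070":    (100.0, 400),
        "RTX 2060 S":  (115.0, 420),
        "RTX 2070 S":  (130.0, 600),
    }

    baseline_fps = cards["GTX 1070"][0]  # index every card to one baseline

    for name, (fps, price) in cards.items():
        index = 100.0 * fps / baseline_fps  # relative performance index
        print(f"{name:10s}  index={index:5.1f}  index-points/€={index / price:.3f}")
    ```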


  • Banned (with Prison Access) Posts: 1,306 ✭✭✭bobbyy gee


    Nvidia's gross profit margin is 58% to 65% this year (2020). Its lowest recent margin was 55%, in 2019; it went back up to 65% in 2020, then dropped to 59% in July 2020 as AMD sold more GPUs. Nvidia is hoping the new cards bring in a gross margin of 69%.
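
    In case the metric is unfamiliar: gross margin is just (revenue - cost of goods sold) / revenue. A minimal sketch with made-up figures, not Nvidia's actual financials:

    ```python
    # Gross margin = (revenue - cost of goods sold) / revenue.
    # The figures below are illustrative, not Nvidia's actual financials.
    def gross_margin(revenue: float, cogs: float) -> float:
        return (revenue - cogs) / revenue

    print(f"{gross_margin(revenue=100.0, cogs=35.0):.0%}")  # 65%
    ```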


  • Banned (with Prison Access) Posts: 1,306 ✭✭✭bobbyy gee


    NVIDIA RTX 3000 Series Rumored Pricing: RTX 3090 for $1399, RTX 3080 for $799, RTX 3070 for $599 and RTX 3060 for $399. You will need a bigger case and an 850W power supply.

    The performance increase is 20% for the 3090 over the RTX 2080 Ti.
    Nvidia will wait to see what AMD does before increasing or decreasing its prices: some cards will come out in mid-September, then more in mid-October once they see what AMD does, then cheaper cards later.


  • Registered Users Posts: 5,857 ✭✭✭Cordell


    bobbyy gee wrote: »
    The performance increase is 20% for the 3090 over the RTX 2080 Ti
    If that is true, and if the rumored prices are true, this will be very disappointing.


  • Registered Users Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    Cordell wrote: »
    If that is true, and if the rumored prices are true, this will be very disappointing.

    The major gains are going to be in ray-tracing and DLSS imho.


  • Registered Users Posts: 5,857 ✭✭✭Cordell


    Sure, but a raw performance increase at that level or better is to be expected from the 3080 over the 2080.


  • Moderators, Business & Finance Moderators Posts: 2,449 Mod ✭✭✭✭Rob2D


    Ha, and people thought I was mad investing in a 1200W PSU. I knew this would happen someday.

    Getting ready to upgrade next year and my wallet is scared even thinking about it. €1000 would really want to be the limit I'd spend on a GPU. So hopefully it comes in at £1200-1300 and I'll knock a couple hundred off that in VAT with Amazon Business.

    But still, it's crazy money. And the alternative is to have an AMD card. Which is essentially akin to Russian Roulette.
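
    The VAT saving described above checks out roughly as follows, assuming the then-current 20% UK VAT rate on an Amazon UK purchase:

    ```python
    # Back out VAT from a UK retail price (20% UK rate as of 2020, assumed here).
    price_incl_vat = 1300.0
    vat_rate = 0.20

    price_ex_vat = price_incl_vat / (1 + vat_rate)
    saving = price_incl_vat - price_ex_vat
    print(f"Ex-VAT: £{price_ex_vat:.2f}, VAT saved: £{saving:.2f}")
    # Ex-VAT: £1083.33, VAT saved: £216.67 -- i.e. "a couple hundred off"
    ```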


  • Registered Users Posts: 5,857 ✭✭✭Cordell


    There is no alternative AMD card at that price point, unless you're buying 2 of them :)


  • Registered Users Posts: 18,703 ✭✭✭✭K.O.Kiki


    Cordell wrote: »
    There is no alternative AMD card at that price point, unless you're buying 2 of them :)

    Intel Xe is coming :pac:

    New 12-pin power connector detailed: https://www.techpowerup.com/271301/nvidia-12-pin-connector-pictured-next-to-8-pin-pcie-its-tiny


  • Registered Users Posts: 5,857 ✭✭✭Cordell


    Reading some comments there reveals it's not a new connector, as in it's not proprietary: https://www.digikey.com/product-detail/en/molex/0430251200/WM1788-ND/252501. And this also means we'll see crap adapters and funny pictures of melted wires :)


  • Registered Users Posts: 5,572 ✭✭✭EoinHef


    Rob2D wrote: »
    Ha, and people thought I was mad investing in a 1200W PSU. I knew this would happen someday.

    Getting ready to upgrade next year and my wallet is scared even thinking about it. €1000 would really want to be the limit I'd spend on a GPU. So hopefully it comes in at £1200-1300 and I'll knock a couple hundred off that in VAT with Amazon Business.

    But still, it's crazy money. And the alternative is to have an AMD card. Which is essentially akin to Russian Roulette.

    Yup, a 1200W PSU justified because one GPU that's not even released yet may require up to an 850W PSU :pac:


  • Registered Users Posts: 5,857 ✭✭✭Cordell


    There are quite a few issues with that power figure: the new connector is supposed to be rated for 600W, so in that sense the recommendation for an 850W PSU kind of makes sense. But on the other hand, the adapter feeds it from 2x 8-pin PCIe power connectors, which by the spec are rated for 300W combined - so by design the adapter is out of spec, and using it with a PSU that can't deliver more power than the spec allows on the 8-pin connectors will be a big problem.
    But the PSU requirements are based on the card itself, not some spurious adapter.
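
    Putting numbers on the out-of-spec point, a small sketch - 150W per 8-pin is the PCIe CEM rating; the 600W figure for the 12-pin is the rumoured rating from the linked article, not a confirmed spec:

    ```python
    # Why the 2x 8-pin -> 12-pin adapter is out of spec on paper.
    PCIE_8PIN_W = 150    # per PCIe CEM spec, each 8-pin connector
    TWELVE_PIN_W = 600   # rumoured 12-pin rating (assumption)

    adapter_input_w = 2 * PCIE_8PIN_W        # what the inputs are rated to carry
    shortfall_w = TWELVE_PIN_W - adapter_input_w

    print(f"Adapter inputs: {adapter_input_w}W rated, "
          f"12-pin side: {TWELVE_PIN_W}W -> {shortfall_w}W beyond spec")
    ```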


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Thought that 12-pin was bogus. I guess not.

    The recommended PSU is always way higher than it actually needs to be. Good quality PSUs are pretty efficient even when pushed close to their limits.

    If this thing is using Samsung's 8nm process and drawing 350-400W, then it has to be at least 40-50% faster than Turing, unless they put all of the die space/power into RT and DLSS. The 3090, if it's not a Titan, is a new tier of consumer card as well, so I'd expect it to be at least 50% faster than the 2080 Ti and 4x faster at the RT stuff.

    Looks like Nvidia's going all out to capture the enthusiast market. AMD won't be beating them on performance in that segment; AMD's top card is only around 500 mm². They will compete with the standard 3080 and below - hopefully better performance at a better price. Nvidia are starting to really take the piss with their pricing.


  • Registered Users Posts: 1,403 ✭✭✭spiritcrusher


    Will there be any reasonably imminent knock-on effect of these new cards on the 20 series' prices, do ye think? Building my first proper PC for gaming and the graphics card is the only thing I'm struggling to decide on! I was going to go for a 2060 but might just up the budget to around 500 for a 2070 Super, but I'd hang on for a month if there was something better at that price (or a price reduction) on the way...


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    spiritcrusher wrote: »
    Will there be any reasonably imminent knock-on effect of these new cards on the 20 series' prices, do ye think? Building my first proper PC for gaming and the graphics card is the only thing I'm struggling to decide on! I was going to go for a 2060 but might just up the budget to around 500 for a 2070 Super, but I'd hang on for a month if there was something better at that price (or a price reduction) on the way...

    There should be. I'd hold out for a few weeks and see what happens. Nvidia are unveiling some stuff on September 1st, with releases coming in late September/October.


  • Registered Users Posts: 10,013 ✭✭✭✭Wonda-Boy


    Rob2D wrote: »
    Ha, and people thought I was mad investing in a 1200W PSU. I knew this would happen someday.

    Getting ready to upgrade next year and my wallet is scared even thinking about it. €1000 would really want to be the limit I'd spend on a GPU. So hopefully it comes in at £1200-1300 and I'll knock a couple hundred off that in VAT with Amazon Business.

    But still, it's crazy money. And the alternative is to have an AMD card. Which is essentially akin to Russian Roulette.


    I hear ya bro, I bought a Coolermaster 1000W when I bought my current i5-2500K and people laughed at me big time. Still running perfectly, and it's going to power the new Ryzen 9 rig I just ordered.

    Thank God it was modular, and even back in the day it came with 4x 8-pin and 4x 6-pin connectors for GPUs. Which was unheard of!


  • Registered Users Posts: 4,377 ✭✭✭Homelander


    That hardly negates the criticism though? It's not as if the increased wattage has proven its value.

    The 850W thing is to account for garbage supplies. You can buy a GT 1030, which consumes about 40W of power, and on the box it will say "500W PSU recommended" even though your entire machine will use a grand total of about 100W.

    "Investing" in a 1200W power supply if you had an FX-9590 and a triple GTX 1080 setup, fair enough. The same power supply for a bog-standard 300W machine under load is money flushed down the toilet.


  • Registered Users Posts: 5,572 ✭✭✭EoinHef


    So the event is 5pm on the 1st?

    It will be nice to be able to watch it at a reasonable hour. Hope there's not too much PR BS, but the 21 thing makes me feel like Nvidia's marketing department will be in full flow.

    I'd really like to hear from AMD soon too. Given that they may decide a few things after Nvidia drop their new GPUs, I wouldn't be expecting anything between now and the 1st, but some info before Nvidia can actually release these new cards would be nice.


  • Registered Users Posts: 5,857 ✭✭✭Cordell


    Not only is there no value in having an oversized PSU, it's actually very inefficient to run one under such a low load, so it will waste more power than a properly sized one.


  • Registered Users Posts: 18,703 ✭✭✭✭K.O.Kiki


    Cordell wrote: »
    Not only is there no value in having an oversized PSU, it's actually very inefficient to run one under such a low load, so it will waste more power than a properly sized one.

    A good PSU will still be efficient at low loads.

    What causes wasted power is really the fact that we're using ATX-spec PSUs with 3.3V, 5V and 12V rails.
    Intel's new ATX12VO PSUs reduce idle power draw by half.


  • Registered Users Posts: 5,857 ✭✭✭Cordell


    Yes, it may still be 80% efficient across the range, but it typically reaches its peak efficiency of 90%+ at around 50% load. Now, it's definitely not a big deal to waste a few more percent at low loads, but still, it's more waste than a properly loaded PSU.
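
    To see what those few percent mean at the wall, a quick sketch - the efficiency figures are illustrative, in line with typical 80 Plus-style curves:

    ```python
    # Wall draw and waste for the same 150W DC load at two efficiencies.
    dc_load_w = 150.0

    for label, eff in [("oversized PSU at low load", 0.80),
                       ("right-sized PSU near 50% load", 0.92)]:
        wall_w = dc_load_w / eff
        print(f"{label}: {wall_w:.0f}W at the wall, "
              f"{wall_w - dc_load_w:.0f}W lost as heat")
    ```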


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Still have my Corsair 860i Platinum, which never gets pushed hard even under load.

    Peak efficiency is about 92% around 50% load, and it's still above 90% across the 25-85% load range. It drops to 85% at 10% load, which ain't great, but you're using so little power at that point that efficiency doesn't matter much.

    Peak load of my PC atm is only around 250-300W. I could easily go up to a 16-core Ryzen and a top-end Nvidia card and it still wouldn't be close to pushing this thing hard. This is overkill. 1200W is taking the piss - 1200W units are for SLI machines, but who does that anymore?


  • Registered Users Posts: 18,703 ✭✭✭✭K.O.Kiki


    Videocardz just posted that the RTX 3080 will have 10GB of VRAM, and the 3090 a mammoth 24GB.

    https://videocardz.com/newz/confirmed-nvidia-geforce-rtx-3090-has-24gb-memory-rtx-3080-gets-10gb


  • Registered Users Posts: 18,703 ✭✭✭✭K.O.Kiki


    Also a very good thread here on why Turing actually wasn't as overpriced as everyone claims:
    https://www.reddit.com/r/hardware/comments/ih6gvd/analysis_of_nvidia_gpu_prices_over_time_or_why_is/

