
Nvidia RTX Discussion


Comments

  • Registered Users Posts: 10,013 ✭✭✭✭Wonda-Boy


    z0oT wrote: »
    I can't help but notice the prices announced aren't *too* unreasonable, certainly not cheap but I honestly expected them to gouge much more.

    Hopefully, Nvidia know something we don't and that AMD will be more competitive than their previous generations with whatever they have coming next and we'll have actual competition in the GPU market once again even if it's not at the super high end (3090 class).

    Although, I suspect if the early words about limited availability are true, the early prices will be prohibitive.

    Very good video I posted by PC Centric. Basically he was saying that the main reason nVidia are being super aggressive is to get a really strong foothold in the GPU market. If millions of people are using new Nvidia cards with their specific tech (RT, DLSS and this new Storage IO), then that will dictate a lot to developers, who will be optimising their games for that tech, and it will be a money-making loop years in the making.

    Nvidia would love to do to AMD what Intel did to them for the last decade or so: make them a non-runner, but in the GPU market rather than the CPU market.


  • Registered Users Posts: 5,572 ✭✭✭EoinHef


    Think people are getting a bit carried away when it comes to 30 series pricing; in reality it's "good" pricing when you compare it to the 20 series.

    It's actually more in line with GPU prices pre 20 series. So in that sense it's pretty typical pricing.

    I was just surprised by it because I totally expected Nvidia to try to keep the higher tier pricing. Reverting to pre 20 series prices was a nice surprise.


  • Moderators Posts: 5,554 ✭✭✭Azza


    Wonda-Boy wrote: »
    Very good video I posted by PC Centric. Basically he was saying that the main reason nVidia are being super aggressive is to get a really strong foothold in the GPU market. If millions of people are using new Nvidia cards with their specific tech (RT, DLSS and this new Storage IO), then that will dictate a lot to developers, who will be optimising their games for that tech, and it will be a money-making loop years in the making.

    Nvidia would love to do to AMD what Intel did to them for the last decade or so: make them a non-runner, but in the GPU market rather than the CPU market.

    NVIDIA are already at 80% of the GPU market; I think they are well beyond the point of needing a really strong foothold at this stage.


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Exactly. AMD are miles behind Nvidia in the GPU market and I don't think RDNA2 is going to change that.

    They have a bad rep for a reason: crap drivers that aren't updated often enough, terrible stock coolers, and nothing competitive at the high end for over 5 years. Nvidia have already capitalised on that and are not giving AMD an inch to catch up. They are only extending their lead.

    AMD need massive investment in their GPU division, both in hardware and software, if they want to compete again and change their deserved poor image in the GPU market. Even selling at little to no profit to regain some reasonable market share and public image might be worth it.


  • Moderators, Recreation & Hobbies Moderators Posts: 4,662 Mod ✭✭✭✭Hyzepher


    AMD's position in the CPU and GPU market has given them great gains over the last 2 years. Their involvement in both upcoming consoles will put them within striking distance for the GPU generation after this one, i.e. to challenge Nvidia's 4000 range.

    Do not underestimate AMD's ability to provide a combined CPU/GPU product, akin to their Xbox and PS5 offerings. This will be hugely important down the road, when mainstream consumers will be content with Ryzen/Radeon integrated APU chips once GPU price/performance no longer matters. When GPUs can provide 4k @ 144fps, further GPU increases will no longer matter to 90% of discrete graphics consumers. Being able to pair a powerful Ryzen with a 4k/144fps Radeon on one chip would be the end of Nvidia.

    Imagine €600 for a powerful APU. It might prompt an Intel/Nvidia merger.

    It might be a few years off but it's coming.


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Hyzepher wrote: »
    AMD's position in the CPU and GPU market has given them great gains over the last 2 years. Their involvement in both upcoming consoles will put them within striking distance for the GPU generation after this one, i.e. to challenge Nvidia's 4000 range.

    Do not underestimate AMD's ability to provide a combined CPU/GPU product, akin to their Xbox and PS5 offerings. This will be hugely important down the road, when mainstream consumers will be content with Ryzen/Radeon integrated APU chips once GPU price/performance no longer matters. When GPUs can provide 4k @ 144fps, further GPU increases will no longer matter to 90% of discrete graphics consumers. Being able to pair a powerful Ryzen with a 4k/144fps Radeon on one chip would be the end of Nvidia.

    Imagine €600 for a powerful APU. It might prompt an Intel/Nvidia merger.

    It might be a few years off but it's coming.

    That's not even remotely close to reality though. The bar on graphics demands is always being raised higher. AMD aren't putting out some magical APU that can do 4k/120hz anytime soon; they can't even make a discrete GPU that can do that, unless we stay at the current graphics level for the next 5-10 years.

    That's not happening. Discrete GPUs are going nowhere. Meanwhile, Nvidia is the one doing all of the innovation.


  • Moderators, Recreation & Hobbies Moderators Posts: 4,662 Mod ✭✭✭✭Hyzepher


    BloodBath wrote: »
    That's not even remotely close to reality though. The bar on graphics demands is always being raised higher. AMD aren't putting out some magical APU that can do 4k/120hz anytime soon; they can't even make a discrete GPU that can do that, unless we stay at the current graphics level for the next 5-10 years.

    That's not happening. Discrete GPUs are going nowhere. Meanwhile, Nvidia is the one doing all of the innovation.

    Maybe that's true for the enthusiast gamer, but even now 90% of gamers use 1080p resolutions or less. They'd love 4k but will never be able to afford that screen/GPU/CPU combination.

    Nvidia are making great products and do innovate, but maybe their innovation is directed at the wrong areas. The 3070, 3080 and 3090 cards are great, but even the 2080 Ti had less than 1% market share. The whole RTX range had less than 10%.

    The success of the PS5 and Xbox Series X will be interesting, as they are essentially APUs.

    Nvidia might be able to produce a GPU that will dance over any future AMD APU, but can they survive on 10% market share? Because 90% will be happy with an APU that performs at 4k/60fps, let alone 4k/144fps.


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Hyzepher wrote: »
    Maybe that's true for the enthusiast gamer, but even now 90% of gamers use 1080p resolutions or less. They'd love 4k but will never be able to afford that screen/GPU/CPU combination.

    Nvidia are making great products and do innovate, but maybe their innovation is directed at the wrong areas. The 3070, 3080 and 3090 cards are great, but even the 2080 Ti had less than 1% market share. The whole RTX range had less than 10%.

    The success of the PS5 and Xbox Series X will be interesting, as they are essentially APUs.

    Nvidia might be able to produce a GPU that will dance over any future AMD APU, but can they survive on 10% market share? Because 90% will be happy with an APU that performs at 4k/60fps, let alone 4k/144fps.


    Yes, and a few years ago the majority were at 720p. Eventually they will be at 2k, while the mid range will be at 4k and the high end at 8k.

    That's generally how progression works.

    But they also have 80% of the discrete market on PC. Consoles will always be their own thing, with their own audience and some crossover with PC.

    APUs have shown no signs of taking off outside of consoles and laptops. AMD have not even attempted a high-end APU yet outside of the consoles.

    They are entry-level chips that can barely do 1080p, never mind 4k.

    There's also the issue of massive hardware and software gaps between them and Nvidia at the moment. Nvidia took a gamble changing their architecture to a multi-specialised-chip role, but it's paying off in spades, and most of their software innovation is coming from this.

    AMD aren't even close.


  • Moderators, Recreation & Hobbies Moderators Posts: 4,662 Mod ✭✭✭✭Hyzepher


    BloodBath wrote: »
    Yes, and a few years ago the majority were at 720p. Eventually they will be at 2k, while the mid range will be at 4k and the high end at 8k.

    That's generally how progression works.

    But they also have 80% of the discrete market on PC. Consoles will always be their own thing, with their own audience and some crossover with PC.

    APUs have shown no signs of taking off outside of consoles and laptops. AMD have not even attempted a high-end APU yet outside of the consoles.

    They are entry-level chips that can barely do 1080p, never mind 4k.

    There's also the issue of massive hardware and software gaps between them and Nvidia at the moment. Nvidia took a gamble changing their architecture to a multi-specialised-chip role, but it's paying off in spades, and most of their software innovation is coming from this.

    AMD aren't even close.

    I agree with most of this and that's why I prefaced my comments with the statement that this is unlikely until the step up in GPU performance no longer matters to the general gamer.


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Hyzepher wrote: »
    I agree with most of this and that's why I prefaced my comments with the statement that this is unlikely until the step up in GPU performance no longer matters to the general gamer.

    As long as the step up is big, which it will be, there's always a market for it.

    I would love to see AMD attempt a high-end desktop APU that maybe even bundles in some HBM graphics memory on the package, but for whatever reason they aren't doing it.

    I think down the line that may well be the way things go. The majority of power usage in PCs is spent transferring data, and the further it has to go, the less efficient it's going to be. Bundling everything onto one package is probably the future, but it doesn't seem too close.


  • Registered Users Posts: 18,703 ✭✭✭✭K.O.Kiki


    Macker1 wrote: »
    I was looking for a white-coloured GPU for a recent white-themed build. Not many options in the current 2000 series. Any thoughts on when, or if, white-coloured 3000 series GPUs will become available?

    Very niche market based on the lack of options.

    You'll have to import from China.


    I also love their LEGO GAMER card :pac:



  • Registered Users Posts: 8,615 ✭✭✭grogi


    BloodBath wrote: »
    As long as the step up is big, which it will be, there's always a market for it.

    I would love to see AMD attempt a high-end desktop APU that maybe even bundles in some HBM graphics memory on the package, but for whatever reason they aren't doing it.

    I think down the line that may well be the way things go. The majority of power usage in PCs is spent transferring data, and the further it has to go, the less efficient it's going to be. Bundling everything onto one package is probably the future, but it doesn't seem too close.

    We already have heat issues; the Zen 2 CPUs are packed so tightly that they can't afford to generate any more heat. If you start bundling everything together, you would have to throttle parts down to stay within the thermal budget.


  • Registered Users Posts: 5,859 ✭✭✭Cordell


    An all-in-one CPU/GPU/VRAM package may suit the laptop and console market, but for desktop users it would take away one of the most important features: the ability to upgrade only what needs to be upgraded.
    And it was already tried, with no real success, by Intel: https://www.engadget.com/2018-01-07-intel-amd-rx-vega-m.html


  • Registered Users Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    Disagree on the slow updates to AMD drivers for GPUs. We got three updates in August, one of them was even working properly.


  • Registered Users Posts: 8,615 ✭✭✭grogi


    Disagree on the slow updates to AMD drivers for GPUs. We got three updates in August, one of them was even working properly.

    Which ones? I don't want to be guinea pig here :D


  • Moderators Posts: 5,554 ✭✭✭Azza


    BloodBath wrote: »
    Exactly. AMD are miles behind Nvidia in the gpu market and I don't think RDNA2 is going change that.

    They have a bad rep for a reason, crap drivers, not updated often enough, terrible stock coolers, not competitive in the high end for over 5 years. Nvidia have already capitalized on that and are not giving AMD an inch to catch up. They are only extending their lead.

    AMD need massive investment into their gpu division, both in hardware and software if they want to compete again and change their deserved poor image in the gpu market. Even selling at little to no profit to try and regain some reasonable market share and public image.

    AMD drivers being poor used to be true, but it has not been the case for a year or more, and they're released at a rate of one a month, same as NVIDIA. Stock coolers are a fair complaint though. AMD have also proven to be rather crap at marketing.

    As for not being competitive at the high end, it's true at the ultra high end, but I'd still classify cards like the GTX 1070/1080 as high-end cards, at least when looking at pricing; that's pricing I would consider high-end GPU territory.

    Vega was late and power hungry compared to Pascal, but it was competitive in terms of performance. The Vega 56 beat the 1070, and the Vega 64, though slower at first than the GTX 1080, arguably had better legs due to handling DX12 and Vulkan better than the Pascal architecture ever did.

    The gap widened last generation. AMD have no counter to the RTX 2070 Super and above (the Radeon VII was just a stopgap), though you could argue the 5700 XT competes on value against the RTX 2070 Super. Feature-wise, I don't think RT and DLSS have been essential features this generation of GPUs, as the technology needs time to be implemented into games, but going forward AMD do need a counter to them in Big Navi, as RT and DLSS look set to become widely implemented features.

    I think the main issue AMD had over the last few years was money. They pumped everything into the CPU side of things, and on the GPU side were more focused on consoles to secure a steady cash flow. They were in trouble on all fronts with their CPUs and GPUs, and with limited resources decided to focus on the CPU side to steady the ship at the expense of the GPU division. It looks to have been the right move, and now that they are on more stable financial ground, they may be able to divert more resources into the GPU side of things.


  • Registered Users Posts: 6,391 ✭✭✭jonski


    I'm guessing here, and going to wait until proper reviews are out, but I'm thinking my 3-year-old EVGA SuperNOVA 650 P2, 80+ PLATINUM 650W is going to be a bit under what I would need for the 3080?


  • Registered Users Posts: 1,389 ✭✭✭KillerShamrock


    jonski wrote: »
    I'm guessing here, and going to wait until proper reviews are out, but I'm thinking my 3-year-old EVGA SuperNOVA 650 P2, 80+ PLATINUM 650W is going to be a bit under what I would need for the 3080?

    There is a lot of "ah sure it'll be enough" floating around, not just here but on other forums too. It may be the case, to be honest.
    Or 650W power supplies will be the 2080 Ti of PSUs ...😉

    What would happen to a computer that has a decent 650W PSU and the 3080 brings it over the limit (all other parts being reasonable, like a 3700X or 3800X)? Will the PSU just crap out, explode, or just force the system to shut down?


  • Registered Users Posts: 5,859 ✭✭✭Cordell


    It depends on the PSU, but good ones will have overcurrent protection, saving everything else, including themselves.

    But there may be another issue: pulling more power than the spec allows on a single PCIe cable while still being under the PSU limit. That's the really problematic one.
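    The per-cable point can be sketched with the nominal spec figures (a rough illustration, assuming the usual 75 W slot / 75 W 6-pin / 150 W 8-pin ratings; real cards and PSUs vary, and transient spikes aren't modelled here):

    ```python
    # Nominal PCIe power limits (watts) -- illustrative spec figures only.
    SLOT_W = 75        # power drawn through the PCIe slot itself
    SIX_PIN_W = 75     # per 6-pin PCIe connector
    EIGHT_PIN_W = 150  # per 8-pin PCIe connector

    def spec_budget(six_pins: int, eight_pins: int) -> int:
        """Total board power a given connector loadout nominally allows."""
        return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

    # A 320 W card with two 8-pin connectors fits inside the nominal budget...
    print(spec_budget(0, 2))  # 375
    # ...but a single 8-pin would be overdrawn even if the PSU has capacity to spare.
    print(spec_budget(0, 1))  # 225
    ```

    So a card can be well within the PSU's total rating and still overdraw one cable if the connectors are daisy-chained rather than run separately.
    
    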


  • Registered Users Posts: 21,629 ✭✭✭✭Squidgy Black


    A lot of build calculators are still saying 650w is fine with the TDP for a 3080, once you're not rocking a huge-TDP CPU like the 10900k that Nvidia used for testing. For example, my Ryzen 5 3600, 2x8gb DDR4 sticks, 2 SSDs + NVMe, HDD, 4 fans and an AIO show a max load wattage of 540w on OuterVision when you add a 3080 into the equation.

    Now that's without any overclocking on either the CPU or GPU; when you start pushing up on either, the load gets closer to around 600w.
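    The tally those calculators do is essentially just summing per-component figures. A minimal sketch, where every number is an assumed, approximate TDP/TGP rather than measured draw:

    ```python
    # Rough peak-load tally for a hypothetical 3080 build.
    # All figures are approximate TDP/TGP assumptions, not measurements.
    parts_w = {
        "RTX 3080 (TGP)": 320,
        "Ryzen 5 3600": 65,
        "Motherboard + 2x8GB DDR4": 50,
        "SSDs/NVMe/HDD": 20,
        "Fans + AIO pump": 25,
    }

    total = sum(parts_w.values())
    psu_rating = 650
    print(f"Estimated peak load: {total} W")
    print(f"Headroom on a {psu_rating} W PSU: {psu_rating - total} W")
    ```

    With these assumed figures the estimate lands around 480 W, which is why the calculators call 650 W adequate for a modest-TDP CPU; overclocking eats into the remaining headroom.
    
    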


  • Registered Users Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    grogi wrote: »
    Which ones? I don't want to be guinea pig here :D

    20.8.3 seems pretty stable. I was on 20.4.2 previously, I think.


  • Registered Users Posts: 6,794 ✭✭✭cookie1977


    jonski wrote: »
    I'm guessing here, and going to wait until proper reviews are out, but I'm thinking my 3-year-old EVGA SuperNOVA 650 P2, 80+ PLATINUM 650W is going to be a bit under what I would need for the 3080?
    Unless you're overclocking I can't see why the platinum 650W wouldn't be enough. I plan on using a Corsair 650 80+ gold with my 3080 and AMD 3900X CPU. What's your processor?


  • Registered Users Posts: 18,703 ✭✭✭✭K.O.Kiki


    A lot of build calculators are still saying 650w is fine with the TDP for a 3080, once you're not rocking a huge-TDP CPU like the 10900k that Nvidia used for testing. For example, my Ryzen 5 3600, 2x8gb DDR4 sticks, 2 SSDs + NVMe, HDD, 4 fans and an AIO show a max load wattage of 540w on OuterVision when you add a 3080 into the equation.

    Now that's without any overclocking on either the CPU or GPU; when you start pushing up on either, the load gets closer to around 600w.

    You can't trust any calculators if no one has benchmarked the cards yet.


  • Registered Users Posts: 6,794 ✭✭✭cookie1977


    K.O.Kiki wrote: »
    You can't trust any calculators if no one has benchmarked the cards yet.
    Nvidia are quoting 750W recommended with an Intel processor. They've said the 3080 TGP is 320W. Most likely they are being conservative. I think even before benchmarks we can take educated guesses and be right.


  • Registered Users Posts: 7,882 ✭✭✭frozenfrozen


    Cordell wrote: »
    But not on today's desktop grade MBs and CPUs...

    Source? It's usually disabled purposefully unless you go to their very high-end cards. That announcement says it's working.


  • Registered Users Posts: 5,859 ✭✭✭Cordell


    The issue is with the MB/CPU/Firmware not supporting SR-IOV, not with the card itself. So it may be working, but only on server motherboards.


  • Registered Users Posts: 21,629 ✭✭✭✭Squidgy Black


    K.O.Kiki wrote: »
    You can't trust any calculators if no one has benchmarked the cards yet.

    I'm not saying to blindly trust them, but Nvidia have said the max TDP is 320w. It's not hard to then add up the rest of the TDP for your system with a calculator.

    Unless Nvidia are changing their specs and the load is in fact another 100w.


  • Registered Users Posts: 7,882 ✭✭✭frozenfrozen


    Cordell wrote: »
    The issue is with the MB/CPU/Firmware not supporting SR-IOV, not with the card itself. So it may be working, but only on server motherboards.

    OK, well, BIOSes with SR-IOV have been available for a lot of consumer boards since X79, and I've seen a lot of threads about SR-IOV working with TRX40, so I don't think that will be an issue.

    It's usually disabled on the card itself; it throws error 43 or something if it detects a virtual machine. I'm led to believe that won't be the case with the 3000s, which would be very nice.


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Disagree on the slow updates to AMD drivers for GPUs. We got three updates in August, one of them was even working properly.

    I hadn't had an update in ages; turns out I only had recommended updates enabled. I thought the optional stuff was beta.

    Hopefully this is more stable. The reason for me ****ting on their drivers is having a 5700 for nearly a year now that still has driver issues.
    A lot of build calculators are still saying 650w is fine with the TDP for a 3080, once you're not rocking a huge-TDP CPU like the 10900k that Nvidia used for testing. For example, my Ryzen 5 3600, 2x8gb DDR4 sticks, 2 SSDs + NVMe, HDD, 4 fans and an AIO show a max load wattage of 540w on OuterVision when you add a 3080 into the equation.

    Now that's without any overclocking on either the CPU or GPU; when you start pushing up on either, the load gets closer to around 600w.

    That's absolute peak; you're practically never going to get near that. Most good-quality PSUs can handle spikes above their rating for short periods as well.


  • Registered Users Posts: 7,767 ✭✭✭Mr Crispy


    OcUK main man Gibbo just posted this on their forum, re Ampere availability:
    Hi there

    So a little more info for you guys.
    It is still early days, but on the 17th we will have stock of 3080s. On volumes it is still too soon to comment, but even if we get 100 or 1000 cards they will sell out quickly; the number of searches on our webshop for the 3080 is into the thousands.

    The 3090 shall also be in stock on the 24th, but again, same situation as the 3080: the demand is looking to be huge.

    They're also limiting purchases to one per customer.

