
AMD Navi Discussion


Comments

  • Registered Users, Registered Users 2 Posts: 14,309 ✭✭✭✭wotzgoingon


    After console launch no?

    I'm pretty sure Lisa Su said a while back Navi 2 will be released before the consoles.


  • Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭EoinHef


    October I've seen in a few places, but that's about it.


  • Registered Users, Registered Users 2 Posts: 7,036 ✭✭✭circadian


    Aye, information is scarce. Apparently the high-end model will have 80 compute units and possibly 16GB of RAM, but at the minute it's all chatter and no solid news of anything.

    I get the feeling the cards will perform well, especially regarding power usage, but how they compete with Nvidia is anyone's guess. We don't even know how the 30 series cards actually do either, mind you.


  • Registered Users, Registered Users 2 Posts: 8,001 ✭✭✭Mr Crispy


    I think Big Navi will compete well in rasterisation. Priced well, and if they offer larger VRAM amounts and lower power consumption, I think they'll do okay with people who don't yet care about ray tracing or DLSS, and have sense enough to wait for reviews. But the number of people in that camp isn't that big!


  • Registered Users, Registered Users 2 Posts: 1,879 ✭✭✭Simi


    Mr Crispy wrote: »
    I think Big Navi will compete well in rasterisation. Priced well, and if they offer larger VRAM amounts and lower power consumption, I think they'll do okay with people who don't yet care about ray tracing or DLSS, and have sense enough to wait for reviews. But the number of people in that camp isn't that big!

    Ray tracing is going to be standard in non-competitive titles sooner rather than later though. AMD really need to be competitive on that front this time round.

    They'll also need a counter to DLSS eventually. It may only be supported by a dozen titles now. But if Nvidia substantially increases that support and DLSS 4K is indistinguishable from actual 4K but with a 20% boost to frame rate vs the comparable AMD card in the game you want to play, then it's pretty obvious which card you're going to pick.

    I really hope they're being quiet because they actually have something good to show us, and not because they're figuring out the best way to quietly push out cards that can barely match Nvidia's last-generation flagship parts.


  • Registered Users, Registered Users 2 Posts: 14,309 ✭✭✭✭wotzgoingon


    Going by Nvidia's pricing I think Big Navi is going to be good. Sure Nvidia would have engineering samples of Big Navi. They may even have final samples depending on how close we are to the launch of Navi 2.


  • Moderators, Recreation & Hobbies Moderators Posts: 4,664 Mod ✭✭✭✭Hyzepher


    Given their console offerings I'd be surprised if AMD don't have something special this time around. Especially now that they have a couple of months to rethink their SKU positioning.


  • Registered Users, Registered Users 2 Posts: 13,997 ✭✭✭✭Cuddlesworth


    Going by Nvidia's pricing I think Big Navi is going to be good. Sure Nvidia would have engineering samples of Big Navi. They may even have final samples depending on how close we are to the launch of Navi 2.

    I'd usually say otherwise, but it's really unusual for Nvidia to fire-sale their last generation of product offerings; the last few years they've just come in at a higher price range.

    It's worth pointing out to those that don't know: AMD hasn't really made a "large" graphics chip in a long time. Most product comparisons over the last few years have been a large Nvidia chip against a much smaller AMD chip in the same price segment. It's part of the reason why AMD gets blasted over its power usage, since a node shrink and a significantly smaller chip should use less power. I'm interested in Big Navi; if they can finally make a chip that's not slamming off power limits to get minimal gains, and make it large enough to compete at the high end, it could actually be competitive.


  • Moderators Posts: 5,561 ✭✭✭Azza


    I also suspect Big Navi is going to be good.

    In terms of rasterization performance I think AMD will at least have a card that can beat the 3070 and 2080 Ti.

    I'm also certain they will be able to offer some form of ray tracing as well.

    The question is whether they will be capable of running the current form of RT used by NVIDIA's Turing and Ampere GPUs, or opt for their own non-proprietary form of RT that NVIDIA GPUs will also be able to run.

    Whether their potential own form of RT will be as good as, better than or worse than NVIDIA's remains to be seen, but they could be banking on it becoming the de facto standard for RT as it will be available on consoles as well. Would developers go to the extra expense of developing a different RT implementation or features just for NVIDIA GPUs, when they already have one that works on all formats and GPUs?

    I also suspect they have something in the works comparable to RTX IO for speeding up data decompression.

    They do need something to counter DLSS though; that seems to be NVIDIA's trump card at the moment.


  • Registered Users, Registered Users 2 Posts: 8,001 ✭✭✭Mr Crispy


    I'm guessing they'll use the ray tracing built into DX12 Ultimate, or whatever it's called.


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    Yeah I can't see them catching up with Nvidia this gen. They may well have specialist RT cores but will they have tensor cores?

    Nvidia have nearly tripled the tensor cores with the 3000 series over the 2000 series. That's how much they believe in them. They are driving lots of innovation like DLSS and all of the other cool AI-driven features: microphone voice isolation, virtual green screen, 2D picture or video to 3D model and animation, voice to facial animation, camera to skeletal-mesh animation and a bunch of other things.

    There's a bunch of software utilizing these cores for workloads now as well. It's like having a supercomputer at home. You can, or will be able to, train game AI using them, and you can create new unique 2D images and/or 3D models based on thousands of source samples and choose the one you like rather than trying to create it from scratch.


  • Moderators, Recreation & Hobbies Moderators Posts: 4,664 Mod ✭✭✭✭Hyzepher


    Azza wrote: »

    I also suspect they have something in the works comparable to RTX IO for speeding up data decompression.

    I think Nvidia mentioned that RTX IO is derived from implementations by Xbox, so with AMD dominating next gen consoles I wouldn't be surprised if their IO offering was already superior to Nvidia's and Nvidia just got their hype in first.


  • Registered Users, Registered Users 2 Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    My predictions are better performance per watt.
    Something approaching the 3080 at a similar price.
    More RAM on the cards - which will be quickly countered.
    Nothing to counter DLSS.


  • Registered Users, Registered Users 2 Posts: 2,449 ✭✭✭Rob2D


    Nothing to counter DLSS.

    Which is what they actually need.


  • Registered Users, Registered Users 2 Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    Rob2D wrote: »
    Which is what they actually need.

    I don't disagree.


  • Registered Users, Registered Users 2 Posts: 10,013 ✭✭✭✭Wonda-Boy


    Really hope they have something good and give us consumers something to think about.....competition is good and the only winner is the end user (us lot).


  • Registered Users, Registered Users 2 Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    Wonda-Boy wrote: »
    Really hope they have something good and give us consumers something to think about.....competition is good and the only winner is the end user (us lot).

    The issue is though it'll have to be really good. I mean at least 20% better at the same price point with a USP. nVidia have the 'mindshare' or whatever it's called.


  • Registered Users, Registered Users 2 Posts: 18,731 ✭✭✭✭K.O.Kiki


    RX 6000 series named in Fortnite?
    https://twitter.com/MissGinaDarling/status/1301721126846963712?s=19

    Possible pricing:
    https://twitter.com/coreteks/status/1301839482287796224

    Igorslab spilling possible details:
    [...]Big Navi with 275 watts is somewhere between 3070 and 3080 and possibly with more power consumption (300W+) somewhere around 3080 performance. Big Navi will not be able to attack the 3090. Take this with a lot of salt.
    [...]No AIBs in 2020.

    Also a reminder: the reference cooler is confirmed to not be a blower model.
    [image: reference cooler]


  • Registered Users, Registered Users 2 Posts: 21,721 ✭✭✭✭Squidgy Black


    No AIBs in 2020 would be disappointing, unless AMD go down a similar route and have a much different design than the usual blower.


  • Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭EoinHef


    No AIBs in 2020 would be disappointing, unless AMD go down a similar route and have a much different design than the usual blower.

    The pic above makes it look like a dual-fan design. That's been around a while.

    That blower cooler has done AMD damage over the years; how they still used it for the 5 series is beyond me.

    Too many reviews on launch say hot and loud because they are reviewing the reference design. Talk about self-harm.


  • Registered Users, Registered Users 2 Posts: 2,471 ✭✭✭SweetCaliber


    EoinHef wrote: »
    The pic above makes it look like a dual-fan design. That's been around a while.

    That blower cooler has done AMD damage over the years; how they still used it for the 5 series is beyond me.

    Too many reviews on launch say hot and loud because they are reviewing the reference design. Talk about self-harm.

    They should go back to their AIO cards, the R9 Fury for example! :pac::pac:

    [image: R9 Fury]


  • Registered Users, Registered Users 2 Posts: 14,309 ✭✭✭✭wotzgoingon


    Cost of those is the thing. I'm not saying they were mad expensive for the end user to buy, but they were expensive for AMD to produce.

    I actually had one of those cards. Pure silence while gaming. Not that noise bothers me, as I wear headphones while gaming, but non-AIO cards can sometimes ramp up their fans while running a demanding game.


  • Moderators Posts: 5,561 ✭✭✭Azza


    The 16GB VRAM might be a handy thing to have with the advent of the next generation consoles. At the start of this generation of consoles, PC GPUs of the time were lagging behind the consoles in the amount of VRAM they were offering. Performance was still better on PC of course, but if you didn't have at least 4GB VRAM you had to lower texture quality settings. I know I did with my GTX 680 2GB at the time with games like Titanfall and Watch Dogs. I remember thinking crikey when Middle-earth: Shadow of Mordor offered an optional HD texture pack and it required 6GB VRAM.

    Considering next gen consoles will come with 16GB VRAM, VRAM requirements in PC games may go up as well.


  • Registered Users, Registered Users 2 Posts: 21,721 ✭✭✭✭Squidgy Black


    Azza wrote: »
    The 16GB VRAM might be a handy thing to have with the advent of the next generation consoles. At the start of this generation of consoles, PC GPUs of the time were lagging behind the consoles in the amount of VRAM they were offering. Performance was still better on PC of course, but if you didn't have at least 4GB VRAM you had to lower texture quality settings. I know I did with my GTX 680 2GB at the time with games like Titanfall and Watch Dogs. I remember thinking crikey when Middle-earth: Shadow of Mordor offered an optional HD texture pack and it required 6GB VRAM.

    Considering next gen consoles will come with 16GB VRAM, VRAM requirements in PC games may go up as well.

    They're 16GB of shared memory as opposed to dedicated graphics memory though, are they not? The same as an 8GB VRAM card now.


  • Registered Users, Registered Users 2 Posts: 14,309 ✭✭✭✭wotzgoingon


    I'm not sure, but I think it works out at around 12GB of VRAM while the rest of the 16GB is for the OS. That's off the top of my head so I could be wrong, and I haven't looked up the spec of the consoles since they were announced.


  • Registered Users, Registered Users 2 Posts: 14,309 ✭✭✭✭wotzgoingon


    I don't follow leakers; well, I used to watch Adored until he went for podcast-type videos. Who is that _rogame fellow? What he is saying is believable, although I still do not think AMD can match the 3090, and this is from someone who likes AMD.


  • Registered Users, Registered Users 2 Posts: 10,013 ✭✭✭✭Wonda-Boy


    I'd love to wait and see what AMD bring to the table, if they even released some info about a possible release date. I'd also love to wait and see if the 3080 Ti would come along quite soon after the launch of the 3080, but I am badly stuck for a card now.


  • Moderators Posts: 5,561 ✭✭✭Azza


    I'm itching to know myself. Think you are best off waiting a few months before deciding either way.

    The last two generations AMD's competition has forced NVIDIA to counter with faster cards. The Vega 56 beat the GTX 1070 in terms of performance and forced NVIDIA to release the GTX 1070Ti. The 5700 and 5700XT forced NVIDIA to counter with the 2060 Super and 2070 Super.

    I'd imagine that will likely be the case again.

    I'm probably going to go AMD if it's remotely close between them. I don't want to lose FreeSync support or be forced to change my monitor to get G-Sync if I opted for NVIDIA.

    I'm not into spending thousands on a GPU, nor do I need the absolute best performance. What I would like is to be able to max out the 165Hz refresh rate of my monitor at its native 1440p resolution more often in games that use rasterization, and to be able to use ray tracing features in games that use it and still hit 60FPS at 1440p. If the new GPUs are capable of accelerating data decompression as well, that would be a good plus to have.


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    Azza wrote: »
    I'm itching to know myself. Think you are best off waiting a few months before deciding either way.

    The last two generations AMD's competition has forced NVIDIA to counter with faster cards. The Vega 56 beat the GTX 1070 in terms of performance and forced NVIDIA to release the GTX 1070Ti. The 5700 and 5700XT forced NVIDIA to counter with the 2060 Super and 2070 Super.

    I'd imagine that will likely be the case again.

    I'm probably going to go AMD if it's remotely close between them. I don't want to lose FreeSync support or be forced to change my monitor to get G-Sync if I opted for NVIDIA.

    I'm not into spending thousands on a GPU, nor do I need the absolute best performance. What I would like is to be able to max out the 165Hz refresh rate of my monitor at its native 1440p resolution more often in games that use rasterization, and to be able to use ray tracing features in games that use it and still hit 60FPS at 1440p. If the new GPUs are capable of accelerating data decompression as well, that would be a good plus to have.

    Nvidia cards also support FreeSync now.


  • Moderators Posts: 5,561 ✭✭✭Azza


    BloodBath wrote: »
    Nvidia cards also support FreeSync now.

    Thanks for the tip, I wasn't aware of that.

    Just looking into it, and my monitor, even though it's not officially listed as doing so, does support G-Sync.

    That's great; it removes a constraining factor in my purchasing decision.

    Still going to wait and see what Big Navi is like before deciding.


  • Moderators Posts: 5,561 ✭✭✭Azza


    Rumored price cut for Big Navi already.

    https://www.pcgamer.com/amd-rx-6000-series-price-drop-pre-launch/

    Would indicate that in terms of performance AMD don't have anything to directly compete with the 3080 and will again have to compete on price.


  • Registered Users, Registered Users 2 Posts: 18,731 ✭✭✭✭K.O.Kiki


    Azza wrote: »
    Rumored price cut for Big Navi already.

    https://www.pcgamer.com/amd-rx-6000-series-price-drop-pre-launch/

    Would indicate that in terms of performance AMD don't have anything to directly compete with the 3080 and will again have to compete on price.

    https://twitter.com/_rogame/status/1301955306021416970

    The above article is speculation & rumors, but I'll be VERY surprised if AMD doesn't have a 3080 competitor.


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    I doubt it. Biggest Navi is only supposed to be around 500mm² die size. The 3080 is 637mm². Granted, there are a few caveats: Nvidia have RT and tensor cores on that die, and it's Samsung's 8nm process vs TSMC's 7nm.

    They could go all out for rasterisation and maybe beat Nvidia there, but there are at least supposed to be dedicated RT cores on Big Navi as well. There are rumors that it's delayed until next year now too, so it's going to be too late to compete with the 3070 and 3080, and by the time it comes out, if it's close, Nvidia will just release the Ti or Super versions.


  • Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭EoinHef


    Delay till next year and they'll barely have any share at all in the GPU market.


  • Registered Users, Registered Users 2 Posts: 8,001 ✭✭✭Mr Crispy


    What's the source for the delay rumours?


  • Banned (with Prison Access) Posts: 1,306 ✭✭✭bobbyy gee


    AMD needs to fix its drivers. A lot of YouTube computer builders will not build with AMD because of driver problems with some cards.


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    Mr Crispy wrote: »
    What's the source for the delay rumours?

    Just rumors, I'm sure, partially based on AMD revealing nothing so far. Also based on TSMC's 7nm process being maxed out because of Ryzen, console development and some other contracts.

    Until they reach Sony's and Microsoft's console quotas they might not be able to mass-produce RDNA2.

    That's why Nvidia had to go with Samsung's 8nm, with the Super variants maybe using TSMC's 7nm.


  • Moderators, Recreation & Hobbies Moderators Posts: 4,664 Mod ✭✭✭✭Hyzepher


    The clever move from AMD would be to launch a single card this side of Thanksgiving: a 16GB GDDR6 RX 6900 card that performs equal to or slightly better than the RTX 3070 for $450.

    Nvidia's decision to launch both the 3080 and 3090 before the 3070 only goes to highlight that these two cards need all the exposure they can get if they are to maximise their sales. Based on current generation sales, the top-end RTX cards are likely to max out at somewhere around 10% of sales; currently 2080 Ti ownership is <1% of cards in use by gamers.

    The vast majority of card sales will be at the 3070 level. Nvidia are aware of this, and that's why they are going out of their way to compare the performance of the 3070 with the 2080 Ti. The launch dates of the 3080 and 3090 have been chosen to maximise impulse sales of the more expensive cards.

    A cheaper 16GB GDDR6 RX 6900 at RTX 3070 performance levels would do more to maximise AMD market share than trying to compete at the higher end. It also gives AMD time to react to possible RTX 3060, 3070 Ti or 3080 Ti cards in the future.


  • Registered Users, Registered Users 2 Posts: 18,731 ✭✭✭✭K.O.Kiki


    https://twitter.com/_h0x0d_/status/1303252607759130624?s=19

    New Xbox Series S is $299 and promises 1440p 120fps.

    My guess is that Navi 23 will be slightly more powerful, at $299-350.


  • Registered Users, Registered Users 2 Posts: 9,562 ✭✭✭weisses


    I would be happy enough with perf between the 3070/80... for less money, more power efficient and with 16 gigs of RAM... Bring it on.


  • Registered Users, Registered Users 2 Posts: 28,354 ✭✭✭✭TitianGerm


    K.O.Kiki wrote: »
    https://twitter.com/_h0x0d_/status/1303252607759130624?s=19

    New Xbox Series S is $299 and promises 1440p 120fps.

    My guess is that Navi 23 will be slightly more powerful, at $299-350.

    1440P at up to 120FPS. Important difference there.


  • Moderators, Recreation & Hobbies Moderators Posts: 4,664 Mod ✭✭✭✭Hyzepher


    AMD's advancements with the PS5 and Xbox might have aided their goal of a more competitive GPU.


  • Registered Users, Registered Users 2 Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    K.O.Kiki wrote: »
    My guess is that Navi 23 will be slightly more powerful, at $299-350.

    The consoles can be sold at a loss, so I'm not as hopeful on pricing of the GPUs.


  • Registered Users, Registered Users 2 Posts: 21,721 ✭✭✭✭Squidgy Black


    Hyzepher wrote: »
    The clever move from AMD would be to launch a single card this side of Thanksgiving: a 16GB GDDR6 RX 6900 card that performs equal to or slightly better than the RTX 3070 for $450.

    Nvidia's decision to launch both the 3080 and 3090 before the 3070 only goes to highlight that these two cards need all the exposure they can get if they are to maximise their sales. Based on current generation sales, the top-end RTX cards are likely to max out at somewhere around 10% of sales; currently 2080 Ti ownership is <1% of cards in use by gamers.

    The vast majority of card sales will be at the 3070 level. Nvidia are aware of this, and that's why they are going out of their way to compare the performance of the 3070 with the 2080 Ti. The launch dates of the 3080 and 3090 have been chosen to maximise impulse sales of the more expensive cards.

    A cheaper 16GB GDDR6 RX 6900 at RTX 3070 performance levels would do more to maximise AMD market share than trying to compete at the higher end. It also gives AMD time to react to possible RTX 3060, 3070 Ti or 3080 Ti cards in the future.

    If they can split the 3070 and the 3080 with a card that costs $550-600 they'll claw back sales. But with reports that AIB cards won't be available until next year, and next gen console demand potentially slowing down any chance of them getting mainstream inventory this side of Christmas even if they were to release, it's going to leave them on the back foot big time.

    I really can't understand why AMD haven't made any sort of solid leak, or even an announcement of an announcement date, when it's getting so close to the actual release date of the Nvidia cards. Even if there's low stock of the 3080, with reviews/benchmarks and even a low number of cards out in the wild, it's going to be a real uphill battle to convince people to go with them unless they've something spectacular in terms of performance per cost.

