
AMD Navi Discussion


Comments

  • Registered Users, Registered Users 2 Posts: 18,731 ✭✭✭✭K.O.Kiki


    The consoles can be sold at a loss, so I'm not as hopeful on pricing of the GPUs.

    Yes, but that loss is including a 512GB NVMe drive, 16GB GDDR6 & an 8-core Zen 2 CPU.
    If the rumours are correct, Navi 23 could have half the VRAM.
    So I can see AMD having an entry-level GPU around the same price (or lower) as Series S.


  • Registered Users, Registered Users 2 Posts: 3,573 ✭✭✭2ndcoming


    The consoles also get a pared-down, minimal version of the GPU die that's incorporated into their board, rather than a full consumer product on its own board with fans and RGB and a faceplate etc.

    No marketing expense either.

    Can't really compare the two.


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    If they can split the 3070 and the 3080 with a card that costs $550-600 they'll claw back sales. But with reports that AIB cards won't be available until next year, and next-gen console demand potentially killing any chance of mainstream inventory this side of Christmas even if they were to release, they're going to be left on the back foot big time.

    I really can't understand why AMD haven't made any sort of solid leak, or even an announcement of an announcement date, when it's getting so close to the actual release date of the Nvidia cards. Even if there's low stock for the 3080, with reviews/benchmarks out and even a small number of cards in the wild, it's going to be a real uphill battle to convince people to go with them unless they've something spectacular in terms of performance per cost.

    It doesn't look good. The usual story with AMD's graphics division: what they have would probably have been exciting if they'd got it to market sooner. By the time they release, it won't be. You just know their own coolers are gonna be crap as well, so you have to wait even longer for AIB cards.

    I think they overextended with contracts (new Ryzen lineup plus consoles), and TSMC's 7nm foundries are just maxed out, with no room for RDNA2 until the pre-Christmas console orders are met.


  • Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭EoinHef


    This time we may actually get decent reference coolers, based on the pic below. The rumour is they are concentrating on dual- and tri-fan designs. If they release with a good cooler it doesn't even matter if AIBs don't release this year.


  • Registered Users, Registered Users 2 Posts: 8,007 ✭✭✭Mr Crispy


    EoinHef wrote: »
    This time we may actually get decent reference coolers, based on the pic below. The rumour is they are concentrating on dual- and tri-fan designs. If they release with a good cooler it doesn't even matter if AIBs don't release this year.

    Yeah, back in March Scott Herkelman went so far as to say there'd be no blower cards at all in their next gen of gaming cards.

    Speaking of stuff AMD employees say on social media...

    https://twitter.com/AzorFrank/status/1303492877989879810?s=20


  • Registered Users, Registered Users 2 Posts: 7,038 ✭✭✭circadian


    I'm guessing they'll schedule their launch announcement; they're hardly going to announce anything more without notice.


  • Moderators Posts: 5,561 ✭✭✭Azza


    Is the Sun coming out tomorrow a metaphor for how hot Big Navi will run?


  • Registered Users, Registered Users 2 Posts: 8,007 ✭✭✭Mr Crispy


    Well, it is named after a star, so.....


  • Registered Users, Registered Users 2 Posts: 13,998 ✭✭✭✭Cuddlesworth


    Azza wrote: »
    Is the Sun coming out tomorrow a metaphor for how hot Big Navi will run?

    Did they have to implement a new power connector for the power it will pull?


  • Registered Users, Registered Users 2 Posts: 8,007 ✭✭✭Mr Crispy




  • Registered Users, Registered Users 2 Posts: 18,731 ✭✭✭✭K.O.Kiki


    Very pretty, I like it.

    3X6FItl.png


  • Registered Users, Registered Users 2 Posts: 6,756 ✭✭✭Thecageyone


    K.O.Kiki wrote: »
    Very pretty, I like it.

    3X6FItl.png

    Let's be honest, they all look much of a muchness: a couple of fans, basic 1970s-era radiator-style casing... best hidden away. Don't think I've ever found a GPU 'good looking'.


  • Registered Users, Registered Users 2 Posts: 8,007 ✭✭✭Mr Crispy


    After some of the terrible coolers they've had in the past, I just want this to be effective and (preferably) quiet.


  • Registered Users, Registered Users 2 Posts: 8,007 ✭✭✭Mr Crispy




  • Moderators, Education Moderators, Technology & Internet Moderators Posts: 35,100 Mod ✭✭✭✭AlmightyCushion


    Mr Crispy wrote: »

    These YouTubers need to learn to get to the point and keep things concise. He keeps saying the same thing multiple different ways; it shouldn't take 5 minutes to say AMD's reference coolers are rubbish but this one looks better. There is decent information in there, but he could easily cut the crap and the video would be half the length.


  • Registered Users, Registered Users 2 Posts: 14,309 ✭✭✭✭wotzgoingon


    I've a reference 5700 XT and find it good. I've a custom fan curve capped at 75%, and with my headphones on I can't hear it. If I was playing a demanding game and took the headphones off you could hear the fans at 75%, but as I said: headphones on while gaming, no noise, and it's completely silent when just browsing.
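    A custom fan curve like that is just a temperature-to-duty mapping. Here's a minimal Python sketch of how such a curve is typically interpolated; the breakpoints are invented for illustration (only the 75% cap comes from the post), not AMD's actual defaults:

```python
# Hypothetical fan curve: (temperature in C, fan duty in %) breakpoints,
# capped at 75% like the 5700 XT setup described above.
CURVE = [(40, 20), (60, 40), (75, 60), (85, 75)]

def fan_duty(temp_c: float) -> float:
    """Linearly interpolate fan duty for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]      # below the curve: idle duty
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]     # above the curve: the 75% cap
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(70))  # 53.33...: partway between the 60C and 75C breakpoints
```

    Driver tools like Radeon Software expose exactly this kind of breakpoint table in their fan tuning UI, which is why a capped curve stays quiet under load.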


  • Registered Users, Registered Users 2 Posts: 8,007 ✭✭✭Mr Crispy


    Interesting that he suspects AMD are using HBM (although he could be reading far too much into that model).


  • Registered Users, Registered Users 2 Posts: 18,731 ✭✭✭✭K.O.Kiki


    These YouTubers need to learn to get to the point and keep things concise. He keeps saying the same thing multiple different ways; it shouldn't take 5 minutes to say AMD's reference coolers are rubbish but this one looks better. There is decent information in there, but he could easily cut the crap and the video would be half the length.

    That's just Buildzoid; he generally does sleep-deprived rambles & what you're hearing is take 4.
    Watch at 1.25x


  • Registered Users, Registered Users 2 Posts: 8,007 ✭✭✭Mr Crispy


    Yeah, and to be fair he had the thing up pretty quick, so probably take 1 in this case.

    I see JayzTwoCents is giving AMD a hard time for marketing in such a kiddies' game (paraphrasing here). He really irritates me.


  • Registered Users, Registered Users 2 Posts: 6,756 ✭✭✭Thecageyone


    I find most of the tech reviewers really irritating, from the laid-back types who sound really uninterested, to the obnoxious shouty types who tend to have a bunch of dopes running about taking orders from them [that Linus guy for one]. But we have to get the info someplace, and here and there between them all we do learn something [eventually], as they will always get the gear first. They are obviously going to milk the absolute crap out of these new cards and convince us all that we 'need' to upgrade ASAP. Their channels wouldn't be doing much if there wasn't new tech to push/review/moan about.


  • Registered Users, Registered Users 2 Posts: 13,998 ✭✭✭✭Cuddlesworth


    Mr Crispy wrote: »
    Interesting that he suspects AMD are using HBM (although he could be reading far too much into that model).

    The rest of the industry has serious doubts about that. It's just too expensive to produce.


  • Posts: 0 [Deleted User]


    I find most of the tech reviewers really irritating, from the laid-back types who sound really uninterested, to the obnoxious shouty types who tend to have a bunch of dopes running about taking orders from them [that Linus guy for one]. But we have to get the info someplace, and here and there between them all we do learn something [eventually], as they will always get the gear first. They are obviously going to milk the absolute crap out of these new cards and convince us all that we 'need' to upgrade ASAP. Their channels wouldn't be doing much if there wasn't new tech to push/review/moan about.




    Yeah, Linus can be really grating. If I had to, I'd listen to someone like JayzTwoCents; yeah, he can rant about random things, but he comes across as a general grumbling tech head, as opposed to screaming into the screen and having peons stagger around.


  • Registered Users, Registered Users 2 Posts: 18,731 ✭✭✭✭K.O.Kiki


    They are obviously going to milk the absolute crap out of these new cards and convince us all that we 'need' to upgrade ASAP. Their channels wouldn't be doing much if there wasn't new tech to push/review/moan about.

    GamersNexus' Steve recently admitted that he still runs an FX-8350 at home and affirmed his stance that you should buy for longevity where possible.


  • Registered Users, Registered Users 2 Posts: 10,013 ✭✭✭✭Wonda-Boy


    Have to say I too find it ridiculous advertising a GPU in Fortnite... I mean come on, ffs! Especially as the very first insight into a GPU that is so important to AMD. Maybe a few weeks after launch for some PR, but it's madness.


  • Registered Users, Registered Users 2 Posts: 8,007 ✭✭✭Mr Crispy


    Wonda-Boy wrote: »
    Have to say I too find it ridiculous advertising a GPU in Fortnite... I mean come on, ffs! Especially as the very first insight into a GPU that is so important to AMD. Maybe a few weeks after launch for some PR, but it's madness.

    It really isn't. People getting upset over this need to calm down.


  • Registered Users, Registered Users 2 Posts: 740 ✭✭✭z0oT


    I find most of the tech reviewers really irritating, from the laid-back types who sound really uninterested, to the obnoxious shouty types who tend to have a bunch of dopes running about taking orders from them [that Linus guy for one]. But we have to get the info someplace, and here and there between them all we do learn something [eventually], as they will always get the gear first. They are obviously going to milk the absolute crap out of these new cards and convince us all that we 'need' to upgrade ASAP. Their channels wouldn't be doing much if there wasn't new tech to push/review/moan about.
    What irks me about some of the tech YouTubers/websites is when they try to go into detail about the technical electronic principles behind things. It's often evident they don't really know what they're talking about in those cases.

    As for Navi, this video is worth a watch. The prediction is that AMD will have 3070 & 3080 competitors, and they'll probably do something like what the Radeon VII was in order to compete with the 3090. Time will of course tell.
    https://www.youtube.com/watch?v=BR70xbcwB6U


  • Posts: 0 [Deleted User]


    Mr Crispy wrote: »
    It really isn't. People getting upset over this need to calm down.




    It's a highly popular game, massive throughput, many people with high end rigs play, AND has a large base of people who have basic enough rigs.


    Those low enders are prime targets for enticing towards a self-build mid/high end rig, with no affiliation to Nvidia.


  • Registered Users, Registered Users 2 Posts: 18,731 ✭✭✭✭K.O.Kiki


    Wonda-Boy wrote: »
    Have to say I too find it ridiculous advertising a GPU in Fortnite... I mean come on, ffs! Especially as the very first insight into a GPU that is so important to AMD. Maybe a few weeks after launch for some PR, but it's madness.


    There's no insight in the Fortnite thing.
    It's just a big render.

    First insight will be on 28 October, as they already announced earlier in the month.


  • Registered Users, Registered Users 2 Posts: 10,013 ✭✭✭✭Wonda-Boy


    Mr Crispy wrote: »
    It really isn't. People getting upset over this need to calm down.

    Jaysus buddy, think of my feelings will ya....:P

    Getting upset indeed... more of an observation than anything else :rolleyes:

    According to Statista, a survey done in late 2020 found that the majority of people playing Fortnite are 18-24 year olds (62%).

    22% are aged 25-34.

    12% are aged 35-44.

    2% are aged 45-54.

    To me, that is not the base AMD would need to be targeting, as the majority of high-end PC users looking at extreme high-end GPU purchases would not be 18-24 years of age. More like 30+, present company included.
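    Those shares are easy to sanity-check in a couple of lines (a throwaway sketch; the bracket labels are read as the usual ten-year bands):

```python
# Fortnite player age shares (%) as quoted above, from the Statista survey.
shares = {"18-24": 62, "25-34": 22, "35-44": 12, "45-54": 2}

total = sum(shares.values())        # 98: the listed brackets leave ~2% unaccounted for
over_24 = total - shares["18-24"]   # 36: over a third of surveyed players are 25+

print(total, over_24)  # 98 36
```

    So even on these figures, tens of millions of players fall outside the 18-24 bracket, which cuts both ways in this argument.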


  • Posts: 0 [Deleted User]


    Wonda-Boy wrote: »
    Jaysus buddy, think of my feelings will ya....:P

    Getting upset indeed... more of an observation than anything else :rolleyes:

    According to Statista, a survey done in late 2020 found that the majority of people playing Fortnite are 18-24 year olds (62%).

    22% are aged 25-34.

    12% are aged 35-44.

    2% are aged 45-54.

    To me, that is not the base AMD would need to be targeting, as the majority of high-end PC users looking at extreme high-end GPU purchases would not be 18-24 years of age. More like 30+, present company included.




    At 37 I was in an office as a sysadmin trying to justify a 1080 purchase (considering kids, mortgage, car, and a wife wondering about said purchase)
    vs.
    a bunch of 24-year-old (paid) interns splashing out on 2080s a few months later.


    Also, think of those 16 year olds saying to 40ish parents "Dad/Mom, you like games too. Can we get a high end GPU together, for Xmas?"


    Edit: Bastard interns actually asked me if I knew what a pokéball was.


  • Registered Users, Registered Users 2 Posts: 21,722 ✭✭✭✭Squidgy Black


    There's a HUGE influx of younger people who are convincing their parents to buy them new PCs over in the States too, with remote learning being pretty popular over there. And then you've college students too, who will all be remote, so there's a lot of people investing in systems and saying why not go full whack.

    It's a gimmick to be honest; not a huge deal, but if it gets them publicity then why not. I'm sure Epic were more than happy to work with them on it. Sure, Nvidia used Fortnite as one of their examples at their launch of games that would have RTX support rolled out.


  • Registered Users, Registered Users 2 Posts: 8,007 ✭✭✭Mr Crispy


    Wonda-Boy wrote: »
    Jaysus buddy, think of my feelings will ya....:P

    Getting upset indeed... more of an observation than anything else :rolleyes:

    According to Statista, a survey done in late 2020 found that the majority of people playing Fortnite are 18-24 year olds (62%).

    22% are aged 25-34.

    12% are aged 35-44.

    2% are aged 45-54.

    To me, that is not the base AMD would need to be targeting, as the majority of high-end PC users looking at extreme high-end GPU purchases would not be 18-24 years of age. More like 30+, present company included.

    I wasn't aiming that specifically at you, more at the general outrage about it online (although you did sound upset :p ).

    Anyway, ignoring those figures (which just show that there are still tens of millions of Fortnite players over the age of 24), you have to remember that this is only one element of AMD's marketing. If their whole plan is to publicise this launch through Fortnite, then yeah, I'll grab a pitchfork alongside the rest of the internet. But if it's just one measly element of getting this thing talked about, I really don't see the harm.

    Maybe they should have just revealed it by pulling it out of an oven. *shrugs*


  • Registered Users Posts: 663 ✭✭✭MidlanderMan


    The screw hole pattern on the back of the Navi render in Fortnite suggests that whatever that card is, it doesn't have GDDR; there are screw holes where the GDDR chips would be.
    If that's true, it means that *maybe* the highest-end cards will use HBM2, which might make sense with the 80 CUs and 24GB rumours.


  • Registered Users, Registered Users 2 Posts: 10,684 ✭✭✭✭Samuel T. Cogley




    Skip to about 11 minutes in.


  • Registered Users, Registered Users 2 Posts: 8,007 ✭✭✭Mr Crispy


    Images from the video;

    526470.jpg

    526471.jpg

    526472.jpg


  • Registered Users, Registered Users 2 Posts: 7,038 ✭✭✭circadian


    I like JayzTwoCents' liquid cooling and HVAC cooling videos. Even the benchmark stuff is decent, but when it comes to news/rumours etc. he comes across as a total fanboy and completely biased.

    Mind you, most of the tech YouTubers do, so at least he has the other material that's decent.


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    The screw hole pattern on the back of the Navi render in Fortnite suggests that whatever that card is, it doesn't have GDDR; there are screw holes where the GDDR chips would be.
    If that's true, it means that *maybe* the highest-end cards will use HBM2, which might make sense with the 80 CUs and 24GB rumours.

    I don't think there's any truth to that. The memory can be laid out multiple ways.

    I'm pretty sure I used to have a GPU with that X-shaped pressure plate on the back as well, and it wasn't HBM.

    It's just too expensive for consumer cards and is only used for server/AI GPUs.

    I think AMD learned that lesson already with the Radeon VII.


  • Registered Users, Registered Users 2 Posts: 4,440 ✭✭✭Homelander


    AMD drivers are the reason I won't really entertain Big Navi.

    Back in the day, cards like the R9 290, HD 7970, whatever else, were hugely viable alternatives to Nvidia. Even the RX series were fine.

    But their drivers have just been so horrible for the Vega/XT cards in comparison to Nvidia.


  • Registered Users Posts: 1,016 ✭✭✭Ultrflat


    circadian wrote: »
    Even the benchmark stuff is decent, but when it comes to news/rumours etc. he comes across as a total fanboy and completely biased.

    I think his fanboy days are over; he's become very critical of companies and their practices. He gave MSI grief over a BIOS issue, and he's called out Nvidia multiple times in the past year over poor practices. He also has no issue making Intel look bad, as well as AMD.

    He used to give the impression he was a fanboy; I just think he was going with whatever card/CPU was working best, as opposed to sticking to one side.


  • Registered Users, Registered Users 2 Posts: 21,722 ✭✭✭✭Squidgy Black


    I'm going to assume that if the 6900XT runs close to the 3080 performance-wise, there's going to be zero chance of getting one this side of Christmas, considering the demand for the 3080 so far.


  • Moderators, Recreation & Hobbies Moderators Posts: 4,664 Mod ✭✭✭✭Hyzepher


    I'm going to assume that if the 6900XT runs close to the 3080 performance-wise, there's going to be zero chance of getting one this side of Christmas, considering the demand for the 3080 so far.

    There will always be more demand for Nvidia imo. But given how little both companies provide out of the gate I'm also assuming it will be hard to get one.


  • Moderators Posts: 5,561 ✭✭✭Azza


    So from what I'm hearing around the web, AMD have at least a competitor to the 3080.
    Their 6700 will be their competitor to the 3070, but will not be available this year.
    Rasterization performance will be around the same as, or within spitting distance of, Ampere.
    Ray tracing performance is a bit of an unknown, but the speculation is that it's not as good as Ampere, though better than Turing.
    They will have more VRAM than the 3070/3080 cards, with 16GB.
    Much better power consumption than Ampere.
    The reference cooler will no longer be a blower design.
    They will again undercut Nvidia on price.


  • Registered Users, Registered Users 2 Posts: 1,881 ✭✭✭Simi


    It won't matter if it has more vram, better power consumption and a lower price. Unless it solidly beats the equivalent Nvidia card and has zero driver issues from day one, people simply aren't going to pick an AMD card over an Nvidia one.

    It's not good enough to get within 10% of or trade blows with the 3080. The AMD equivalent has to be unquestionably better in every way to stand even a chance of stealing some market share.


  • Registered Users, Registered Users 2 Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    Simi wrote: »
    It won't matter if it has more vram, better power consumption and a lower price. Unless it solidly beats the equivalent Nvidia card and has zero driver issues from day one, people simply aren't going to pick an AMD card over an Nvidia one.

    It's not good enough to get within 10% of or trade blows with the 3080. The AMD equivalent has to be unquestionably better in every way to stand even a chance of stealing some market share.

    As much as I agree with the above, it simply won't be, sadly.

    It'll be as good, if not slightly better, at 1440p.
    Worse at 4K.
    RT performance will cause a bigger hit than on Nvidia cards.
    No answer to DLSS.
    Cheaper-made reference and AIB designs.
    Driver issues, especially at launch.

    This won't be Radeon's Zen moment unless they massively undercut on cost, which I don't think they can afford to do, and which would also be easily countered.


  • Moderators Posts: 5,561 ✭✭✭Azza


    Simi wrote: »
    It won't matter if it has more vram, better power consumption and a lower price. Unless it solidly beats the equivalent Nvidia card and has zero driver issues from day one, people simply aren't going to pick an AMD card over an Nvidia one.

    It's not good enough to get within 10% of or trade blows with the 3080. The AMD equivalent has to be unquestionably better in every way to stand even a chance of stealing some market share.

    The same sources I was looking at believe three cards are coming: the 6700, 6800 and 6900.

    If AMD want to win on rasterization they can do so, but only by upping power consumption well outside the chip's sweet spot. I suspect that will be the 6900 card.

    Looking at Hardware Unboxed's review of the 3080, they show that in terms of ray tracing performance the improvements are pretty marginal relative to Turing, so if AMD do land between Turing and Ampere you are not going to notice a difference.

    VRAM may prove a factor down the road.

    As for DLSS, the same sources say they are working on a counter-solution, but there's absolutely no ETA on it.


  • Registered Users, Registered Users 2 Posts: 4,573 ✭✭✭Infini


    Simi wrote: »
    It won't matter if it has more vram, better power consumption and a lower price. Unless it solidly beats the equivalent Nvidia card and has zero driver issues from day one, people simply aren't going to pick an AMD card over an Nvidia one.

    It's not good enough to get within 10% of or trade blows with the 3080. The AMD equivalent has to be unquestionably better in every way to stand even a chance of stealing some market share.

    In all honesty though, I've an AMD rig as well and there's not a bother out of it for me. As it stands AMD aren't that bad; they're improving, and if anything it's because of them that Nvidia aren't gouging as much on prices this time around.

    I'd wait and see what happens, but AMD might be in better shape this time around.


  • Moderators Posts: 5,561 ✭✭✭Azza


    I've had a Vega 56 and a 5700 XT, and driver-wise I never had an issue with the Vega 56.
    On the 5700 XT I got the black screen issue once, and had an issue with micro-stutter introduced by a GPU driver update in Sniper Elite 4. I ended up doing 4-5 driver rollbacks and updates before it was fixed, as each rollback/update would temporarily fix the issue before it came back. Weirdly, it resolved itself when I went back to the driver that first introduced the issue. Not sure what happened; it was annoying at the time, but overall I wouldn't say AMD driver support has been bad. Nor has it been perfect, but I think their reputation for poor drivers is a little exaggerated, or a little out of date.


  • Registered Users, Registered Users 2 Posts: 7,038 ✭✭✭circadian


    I'm definitely curious to see what RDNA2 can do, but I'm waiting until the next generation before upgrading anyway. It seems AMD have taken the same approach as Zen, creating a more modular architecture and focusing on different characteristics each generation. I suspect RDNA3 and 4 will be very mature GPU offerings that could well tackle Nvidia at the high end.


