
AMD Navi Discussion


Comments

  • Registered Users Posts: 7,408 ✭✭✭Icyseanfitz


    Considering the 7900 XTX is going for €1,300 and the XT for just over €1,000, I can see the 7800 going for €800-900. GPU pricing in Europe is so bonkers right now



  • Registered Users Posts: 14,309 ✭✭✭✭wotzgoingon


    I was banned for the past 4 weeks, but anyway, I wasn't able to post yesterday so I'll mention it today. I was in AMD's queue on launch day, and when I finally got in the XTX was gone and there were only XT models, which I didn't want, so I left it.



  • Registered Users Posts: 5,415 ✭✭✭.G.


    My main use case is VR gaming and it seems from the one review I can find that bothered to benchmark VR games the XTX struggles against the 4080. I think I'll wait now until Nvidia hopefully see sense on the pricing of that one.



  • Registered Users Posts: 5,547 ✭✭✭Slutmonkey57b


    I saw an interesting comment from one reviewer that retailers are getting a tiny margin ~5% on these cards.


    AMD should be getting better yields and saving production costs on these compared to a monolithic die so you would hope for some price correction over the next couple of years. However there are two factors making me think that won't necessarily be the case.

    1. Stock of the xtx appears to be tiny, whereas stock of the xt is good. (ignoring sales numbers, just the % breakdown of cards released to retail).
    2. The pricing gap between two cards performing so differently is tiny, given that the XT is already a "recovered die" scenario with one MCD disabled.

    So I think that the quasi-chiplet strategy is probably helping prices, but at least in this first generation it's not the game changer people are hoping it is. AMD are not getting the yields they would need to make a game changing improvement on the top end cards yet.


    If the lower-end parts are substantially smaller than the 300mm² full-fat die, then maybe the midrange will start being price-competitive with consumers' wallets again.


    Ultimately I am still of the opinion that gaming and hardware in general have become insane over the last couple of years. By concentrating so hard on 4k 120+hz raytracing halo benchmarks on games with insane graphically intense open worlds, the gpu hardware designs are massively complex. However most gamers aren't playing on those setups and the numbers being put up for older games at lower resolutions which most online players are on are already well above ridiculous.


    For most gamers, we should have been seeing the needle shift to greater bang for the buck. But instead, we're paying insane prices for hardware designed to run in situations that most of us just aren't playing at all.



  • Registered Users Posts: 7,408 ✭✭✭Icyseanfitz


    All I know is buying a 1080 Ti for €850 has absolutely ruined my expectations of what I should get for my money. Every generation since has increased the cost of every tier by a huge amount; I'll be lucky if a 4060-4070 can be had for the price I paid for the second-highest tier you could get 5 years ago 🤷‍♂️ it's lunacy



  • Advertisement
  • Registered Users Posts: 31,001 ✭✭✭✭Lumen


    Factoring in inflation, €850 is now €983.

    Second highest AMD, 7900 XT, is €1099 MSRP, and they are available for that price, so that's 12% more than you paid in real terms, and consensus is that the XT is a bit overpriced.

    It's mainly Nvidia that have lost the plot, but in fairness their cards are better (performance, features and efficiency).

    edit: this reads as making excuses for the bastards, which was unintentional.
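
    The adjustment above can be sketched in a few lines (a rough sketch only; the ~15.6% cumulative inflation figure is my own assumption, back-calculated to match the €983 quoted):

```python
# Illustrative sketch of the inflation comparison above.
# The cumulative inflation rate is an assumed figure, not an
# official HICP number.
def adjust_for_inflation(price, cumulative_rate):
    """Return a historical price expressed in today's money."""
    return price * (1 + cumulative_rate)

price_2017 = 850                         # 1080 Ti purchase price in 2017
adjusted = adjust_for_inflation(price_2017, 0.156)
print(round(adjusted))                   # ~983 in today's euros

premium = 1099 / adjusted - 1            # 7900 XT MSRP vs adjusted price
print(f"{premium:.0%}")                  # ~12% more in real terms
```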



  • Registered Users Posts: 14,309 ✭✭✭✭wotzgoingon


    Well the new XTX is on par with a 3090 in RT which is a pretty good achievement by AMD considering Nvidia go on about how good their dedicated RT cores are. I cannot be bothered to search again for the benchmarks but someone did already over on OCN yesterday.



  • Registered Users Posts: 7,408 ✭✭✭Icyseanfitz


    Thing is, we all know there's a 7950 and possibly two variants of it (XT/XTX) coming as well, so the 7900 XT will probably be 3 to 4 tiers from the top. Nvidia's current equivalent to the 1080 Ti is probably the 4090 or 4080 Ti (with probably an unannounced 4090 Ti waiting in the wings), and that's somewhere around €2k.

    So imo while AMD are better they are still taking the piss, at least in Europe.







  • Registered Users Posts: 738 ✭✭✭minitrue


    Where it "went wrong" is they both discovered that people would pay more and as a result they are now making the cards using process nodes in and around the cutting edge rather than a cheaper older one which was the norm until now. The 600 mm^2 4N 4090 die at $1600 for the complete card actually looks cheap to me when you start to compare die sizes to the AM5 cpus and even cheaper compared to Raptor Lake :-o

    Just the 300 mm^2 die of the 7900 cards is already larger than all the dies of say a 7950X and with the relationship of yields to die sizes being as it is I don't think you can complain about their pricing if you aren't frothing at the mouth at the prices of the cpus?

    Of course the 7900XT and 4080 both look horribly priced when compared to their flagships but until they know for sure they can't sell them they've no incentive to turn down profit.



  • Registered Users Posts: 18,689 ✭✭✭✭K.O.Kiki


    Inflation is meaningless with wage stagnation though.



  • Registered Users Posts: 7,408 ✭✭✭Icyseanfitz


    Well tbh I wish they would go back to using a less advanced, more affordable process node if that's the case. I had zero issues with how much performance we used to get every generation, particularly the fantastic value that used to be at the xx60-level cards for around €300.



  • Registered Users Posts: 321 ✭✭Mucashinto


    Yeah, it really seems like they're focussing on the high end now and bizarrely that's where the actual 'value' is (if you can afford the high buy-in). Like the 4090/7900xtx end up seeming like excellent value cards for what you get, 1 step down and it's already diminishing returns. Saw somebody make the point that it used to be the opposite, spending premium money to only get modest increases.

    Switch of focus from mainstream consumers to data centres/commercial buyers maybe?



  • Registered Users Posts: 738 ✭✭✭minitrue


    Before I begin let me be clear, the 4080 and 7900XT are both truly horrible at their prices when put beside the 4090 and 7900 XTX. The naming and pricing is all about marketing games. nvidia didn't quite get away with the 4080 12GB but am I a cynic to say it helped them get the 4080 over the line?

    A 6600 or 3050 are both in and around €300 now, and so is a 2060, which is pretty much exactly what you are asking for.

    Looking at the naming schemes to decide what level a card is at is basically playing into their marketing hands and how they have made people feel the need to spend more. If they'd kept following the very linear path (in raster performance vs time) from the 560, 660, 760, 960, the 3050 would be nicely ahead of the curve, and in fact the 1650 was only just a hair behind!

    Take away all the names and look at die sizes and performance (and price if you want) and the 3050 is clearly part of the line going 560->660->760->960->1060->1660->3050. The 1060 (I'm talking about the 6GB to keep it a little simpler) was a bit of an outlier in terms of the fact it made a bigger leap forward in performance (when they went from 28nm to 16nm) despite going to the smallest die but it also saw the price back at the highest end of those cards instead of the lowest.

    Now there are two new lines moving at a faster pace than before though I presume they will both revert unless they are willing to outbid Apple for the absolute bleeding edge node. One line is the 1050->1650->3050 and the other the 960->1060->2060->3060.

    Despite all the above, the $249 for a 760 in 2013 is about $305 in 2022 so in real terms the 3060 at MSRP is less than 10% more expensive despite being maybe 50% ahead of the projected path from 560->960, or put the other way the $249 MSRP for the 3050 lands between the $199 for the 560 in 2011 and 960 in 2015.

    Don't get me wrong, the 4080 is close enough to the same size as the 560 (the largest *60 until the 2060) and I'd love it if I could buy it for the inflation-adjusted $250, but given that the 560 was on a refinement of a 7-year-old node, the equivalent would at very best be 360mm^2 of TSMC's 12nm node (a 16nm refinement) used on the 16/20 series, so in between a 1660 and 2060 (or between a 1080 and 1080ti on the unrefined TSMC 16nm might be closer really). Funnily enough a 3050 happens to fall right in there in terms of performance, with the extra bit of RAM and lower power consumption.

    TLDR; how would we be any better off if nvidia had just announced an upcoming €300 4060 identical to what was actually released as a 2060 and the 4090 was somewhere between a 2080 and 2080ti in terms of performance, whether or not that was at a "bargain" 580 equivalent price of maybe $625 or 690 equivalent of maybe $1250? Hint, a €500 3060ti is in that 2080 to 2080ti zone as is the €400-€450 ish 6700 or 6700xt.



  • Registered Users Posts: 690 ✭✭✭Confused11811


    I haven't been here in a while lads, has it been established what days the AMD drops will be going forward?



  • Registered Users Posts: 31,001 ✭✭✭✭Lumen


    This is all far too complicated.

    At the end of the day we need a 1080p card, a 1440p card and a 4k card. It needs to be that simple so that game developers can have a reasonably constrained set of capabilities to work with over, say, three generations of card in the marketplace at once, with players turning down the quality settings as their cards age.

    Those core cards could be priced at (say) 300, 600, and 1000 euros, inflating over time, and ought to be released together and in large volumes so that people aren't "forced" to buy a card out of their budget due to timing or availability.

    It's fine for there to be halo cards above that for niche applications like VR and triple screen SIM racing, or for people with more money than sense, but the normalisation of €1500+ cards in the public debate is not healthy.

    Relatedly, I don't even understand why partner cards still exist. I like shopping but roll my eyes when presented with page after page of arbitrarily unique products, it's a tyranny of choice. Just engineer the products to be cool and quiet and make loads of them.



  • Registered Users Posts: 5,547 ✭✭✭Slutmonkey57b


    The thing is, at 1080, you don't need to upgrade anymore. Any card from the last 3 generations will play well at that resolution.

    At 1440, there might be a meaningful fall off in lower end cards, but not much.


    It's only at the stupid end of things, or people pretending the human eye can tell the difference between 150fps and 300fps, that you "need" an upgrade.



  • Registered Users Posts: 1,007 ✭✭✭Longing


    In 2017 I bought a GTX 1080 for Warhammer to run at 1440p, averaging 80FPS. Today, running Warhammer 3 on the same card, I'm only getting 37FPS at more or less the same settings. I made a new build earlier in the year waiting for the new cards to arrive. There is no way I will be paying the price of a 4080. Ridiculous. Today I'm just looking for a card that will run WH3 at 60FPS at 1440p. I'm seriously thinking of getting one of the 3000 series, or a 6800 XT from AMD if I can get it for around €600 to €650 max. Cards are nearly the same price as a son or daughter getting their first car to learn in. Crazy world.



  • Registered Users Posts: 1,859 ✭✭✭Simi


    So has anyone actually managed to order a 7900 XTX or did anyone go for a 7900 XT?

    I have a 6900 XT so don't actually need an upgrade, just want one. But if I knew for certain a 7950 XT was coming in the summer I might hold off until then.



  • Registered Users Posts: 5,415 ✭✭✭.G.


    Don't know if any XTX went on sale today, but the XT is still freely available. I wonder who'll blink first on a price drop: AMD on that, or Nvidia on the 4080.



  • Registered Users Posts: 1,859 ✭✭✭Simi


    I reckon Nvidia will go first; the 4080 is so badly overpriced.



  • Registered Users Posts: 675 ✭✭✭Gary kk


    Think Amd might have dropped price on the 7900xt.



  • Registered Users Posts: 14,309 ✭✭✭✭wotzgoingon


    Not direct from them they didn't as I just checked.



  • Registered Users Posts: 675 ✭✭✭Gary kk


    Have to* sorry 😞



  • Registered Users Posts: 4,512 ✭✭✭savemejebus


    Worth dropping €420 on a 6700xt now or better to wait till after CES?



  • Registered Users Posts: 18,689 ✭✭✭✭K.O.Kiki


    TBH yeah, I don't think any of the manufacturers will bother with lower-budget offerings.



  • Registered Users Posts: 31,001 ✭✭✭✭Lumen


    This hotspot temp/throttling issue is fascinating. I'm not sure I entirely buy the diagnosis (poor vapour chamber design or manufacturing faults) but it reminds me of fuss over memory temps on the early 3080s (which IIRC could be mitigated by thermal pad mods, which won't work here).

    It seems like AMD is a generation behind on cooling, at least at the high end.




  • Registered Users Posts: 1,859 ✭✭✭Simi


    Very insightful video. I'm interested in his and others follow-up. Definitely going to hold off purchasing for the time being (not like I can anyway) until things are cleared up.



  • Registered Users Posts: 14,309 ✭✭✭✭wotzgoingon


    Water cooling the card would fix it, or just buy an AIB card with heatpipes as the active cooler.


