
*** The AMD RDNA 4 Rumour Mill ***

I do hope AMD don't mess about and just push them out at $500 for initial reviews, and get them out before Nvidia. Aside from the good deal, I'm interested to see how Nvidia would react: they could easily spoil AMD's party by slapping them in the face with a $500 to $550 RTX 5070, or stay on course for gradually ramping up GPU prices and push it out at $650. I would be interested to see which direction Nvidia take.
 
I'm curious about something and I think you might be able to answer it. How does a Ryzen 7900 compare in chip size with AMD GPUs?

That's a complicated question. The dual-CCD Zen 4 Ryzens use 5nm for the two logic dies and 6nm for the IO die. The two logic dies are about 160mm² combined, roughly 80mm² each, with the IO die being about 120mm².

RDNA 3 is also 5nm for the logic die and 6nm for the memory dies. Navi 31 (7900 series) has a 300mm² logic die and six 37mm² memory dies, about 220mm² combined for the memory dies.

Navi 32 (7800 / 7700 series) has a 200mm² logic die and four memory dies.

So....
Ryzen 7950X: 160mm² of 5nm dies + 120mm² 6nm die.
7800 XT: 200mm² 5nm die + 150mm² of 6nm dies.
7900 XTX: 300mm² 5nm die + 220mm² of 6nm dies.
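For anyone who wants to check the totals, here's a quick sketch tallying the per-node silicon from the approximate die sizes above (figures rounded, as in the post):

```python
# Rough comparison of silicon area per node, using the approximate
# die sizes quoted above (all areas in mm^2, rounded).
chips = {
    "Ryzen 7950X": {"5nm": 2 * 80, "6nm": 120},     # two CCDs + IO die
    "7800 XT":     {"5nm": 200,    "6nm": 4 * 37},  # logic die + 4 memory dies
    "7900 XTX":    {"5nm": 300,    "6nm": 6 * 37},  # logic die + 6 memory dies
}

for name, dies in chips.items():
    total = sum(dies.values())
    print(f"{name}: {dies['5nm']}mm^2 on 5nm + {dies['6nm']}mm^2 on 6nm = {total}mm^2")
```

The 7900 XTX ends up at roughly 520mm² of total silicon, nearly twice the 7950X's ~280mm².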
 
5nm / 6nm (5 and 6 nanometre) refers to the lithography node the die (another word for the chip, basically) is printed on, i.e. how the actual chip is made. The smaller the number, the smaller and newer / better the process; all of these are made at TSMC.

The first image is Navi 31, the 7900 series RDNA 3 GPU.

The second image is of a Ryzen 7900 series CPU with the heat spreader removed.

[Image: Navi 31 die shot]

[Image: Ryzen 7900 series CPU with the heat spreader removed]
 
Thank you Humbug.
So, what I'm thinking is: should AMD stop making big chips and focus on something the size of a Ryzen 9700?

Or, if slightly older nodes are price competitive, should AMD try to squeeze the best possible out of nodes one generation behind and openly offer players a trade-off between price and efficiency?
Basically declaring that the best node available is for CPUs and Instinct, while GPUs stay on the previous CPU node or a cheaper alternative (Samsung?).

This way volume woes should be significantly lessened and price could be competitive.
There are also some R&D advantages: you set the microarchitecture with iGPUs/APUs and refine it on the same node when you actually bring it to dGPUs, basically pulling a tick/tock strategy with synergy from fab node refinements.

Intel tried this and failed but AMD is much better positioned to take advantage of a full stack strategy.

The strategy is to go back to making workstation GPUs and gaming GPUs from the same silicon, so if they don't sell as gaming GPUs they stand a better chance of moving as workstation GPUs. I don't know if this is true for the upcoming RDNA 4, probably not; possibly RDNA 5.

AMD wouldn't go Samsung; they are too far behind, and AMD have a good relationship with TSMC that they would want to maintain. The best and most expensive node right now is TSMC N3, a 3nm node; it's what Apple are currently using, and Intel for their 200 series CPUs and APUs.
I think AMD are very likely to use TSMC N4 (4nm) for RDNA 4. They use that for the Ryzen 9000 series; it's a very solid node, and as shown with Ryzen 9000, AMD are able to design very good efficiency with it.
They may also go back to a monolithic design. As you can see from my pictures and explanation, the current Radeon 7000 series (RDNA 3) is a multi-chip design. It's a very successful design and the first chiplet-based gaming GPU, but I think AMD want to go back to basics with RDNA 4. I think that's a shame: AMD are the best when it comes to inventing innovative architectural designs, and I would have liked to have seen the next evolution of their designs, but I think they don't want to spend the R&D on it anymore; they can't justify it. AMD are such a talented company in this way, with many world firsts to their name. It's a shame they don't have Nvidia money to play with.

The new GPUs will be smaller; they are not making high-end GPUs anymore. The highest-end RDNA 4 part will be the RX 8800 XT: equivalent in performance to an RX 7900 XTX with much better ray tracing, and about the size of an RX 7800 XT, around 250 to 300mm². That's a bit larger than a Ryzen 9700, by about 50 to 100mm², but I don't see how you can get it any smaller with that level of performance, and 300mm² or less is quite small for a GPU.

You're right that AMD will continue to design GPU technology, because their APUs are very successful.

The problem though is this: an RX 7800 XT chip, just the chip, costs about the same to make as a Ryzen 7950X. They currently sell that CPU for £500, a bargain... it's £100 cheaper than a Core Ultra 285K, and better.
So take another look at the images I posted. What you see for the 7950X is pretty much how it sells; the only thing missing is the lid, the heat spreader, maybe $5. AMD sell it into a supply chain who take their share, who sell it to OCUK who take their share, so AMD probably sell this £500 CPU for £350 to £400. It costs less than £100 to make it retail ready, so £200 to £250 profit. While that seems like a lot, remember that they spent several hundred million dollars designing it; I don't know that for sure, but I would imagine so.
The same R&D cost applies to the GPU. AMD sell the chip to their partners, like Sapphire, who design and manufacture their own PCBs and coolers for it; CPUs don't need to be shipped with coolers and they don't have PCBs, as such. So to sell what is now a £420 GPU, they need to leave enough money for Sapphire to make a profit after designing and making the PCB and cooler.
Look at this thing, this is a 7800 XT Nitro PCB; I can't find the Pulse PCB but they are made to the same standard. Believe it or not, there are thousands of individual components on this board, and some of them are quite expensive, costing multiple dollars individually.
Sapphire have to design all of this, make it, and then sell it into a supply chain. AMD are not selling these chips to Sapphire for £350 to £400; the thing costs £420 retail, so they are selling at a little above cost. That's fine if you're selling 30 million of them, but they aren't, which is why AMD's profit from these in the last quarter was $12 million, or in other words nothing. If they sold 1.2 million units, that's $10 profit on each one sold. Now ask yourself: how many hundreds of millions did AMD spend developing this thing? Ryzen is propping up Radeon in a very big way.


[Image: Sapphire 7800 XT Nitro PCB]
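As a rough sketch, the supply-chain guesswork in the post above pencils out like this (every figure is the post's estimate, not real AMD data):

```python
# Back-of-envelope sketch of the supply-chain maths in the post above.
# All numbers are the post's rough guesses, not real AMD figures.
retail = 500           # GBP, shelf price of the 7950X
amd_net = (350, 400)   # GBP, guessed price AMD get after the channel's cut
build_cost = 100       # GBP, guessed cost to make it retail ready

channel_cut = (retail - amd_net[1], retail - amd_net[0])
margin = (amd_net[0] - build_cost, amd_net[1] - build_cost)
print(f"Channel cut: GBP {channel_cut[0]}-{channel_cut[1]}")
print(f"AMD gross margin per chip: roughly GBP {margin[0]}-{margin[1]}")
```

The arithmetic lands a touch above the post's £200 to £250 figure, which presumably allows some headroom for packaging and logistics.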
 
But AMD was actually in talks with Samsung, and TBH I doubt Instinct has the volume to actually impact Radeon's costs.
AMD is not in an easy position, TBH, so either they manage to find a way to provide at least 80% of the value for 50% of the price, or they won't be able to expand their market share.

Look at what's happening in the smartphone market.
MediaTek is basically playing the AMD role against Qualcomm, but they're leaving the budget segment unguarded, and what's happening is that Unisoc is starting to eat their lunch there.
Intel could potentially do the same if they can make Battlemage viable, so AMD has to try something disruptive or they will find themselves Matroxed into Instinct and semi-custom.

I didn't know AMD was in talks with Samsung; it's an interesting one to watch.

The way I see it, the thing about workstation GPUs is that the segment is growing. More and more people want them, and not just in huge batches; smaller outfits and even individuals are looking for them, and for the latter it is about cost. Instead of selling them for $2,000 a pop, what if they sold them for half that or even less? AMD could carve out a new market for itself. It's probably not very politically correct to make this segment so affordable, and Nvidia will hate them for it, but so what? AMD would be opening up that market to more people.

AMD did this first with HEDT, where a Ryzen 1800X was half the price of a 6900K and just as good if not better. They did it again in the server space: Intel was charging up to $50,000 per chip, and AMD brought that down to $18,000 with better chips. It's why Intel are in trouble; running fabs is hugely expensive, and you can only do that on your own by selling chips for $50,000 a pop to pay for it all.

They could disrupt the GPU workstation segment in the same way.

Honestly, I don't follow the smartphone market; other than one Samsung collaboration I didn't think AMD had any presence there at all?

Intel have all but given up on the dGPU market; the rumour is Battlemage consists of one small SKU for laptops. I don't blame them: they thought they could just walk in and take AMD's market share, but quickly found out just how brutal this segment is.
 
But then nvidia would then drop prices of their tier of similar performing cards in response.

When faced with similar performing products, the consumer will probably go for the nvidia option.

It’s a bit of a no win situation for AMD.

It is. The truth is the vast majority only want AMD to be competitive so they can get their Nvidia cards for less money; they literally blame AMD for Nvidia being too expensive. Even a lot of the mainstream tech tubers do this.

These smooth brains are so smooth they don't understand that they are the reason GPU pricing is how it is. If one is stupid, does one possess the necessary intelligence to realise one is stupid? Critical thinking is a big part of intelligence.

If no one buys AMD cards they are irrelevant to Nvidia.
 
HL2 outrage lasted years. It was literally one game and most were still optimised for Nvidia back then.

Also, The Witcher 3 had a bigger outrage from Nvidia users! Lots of Kepler and Fermi users had very poor performance (they were the majority of Nvidia users at the time). IIRC, a GTX 960 was better in some cases than a GTX 780 Ti!

AMD users could adjust tessellation manually via drivers, but Nvidia users couldn't. An example of the sorts of threads back then:

CDPR got so much bad press, they had to put a manual tessellation slider in a later update.

Didn't Geralt's hair have 64 triangles per pixel? Which is insane; it's why AMD's next driver after it launched automatically culled it to 4 triangles. That's all you need for a single pixel, you cannot see anything above that.
 

Oh dear. If this is true my 3080 12gb might have to carry on a while longer (or I remortgage).
Meh.... the 6800 XT has 20% more cores and yet it's slower than the 7800 XT, and the 7800 XT is on par with the 6900 XT, which has 33% more cores. How many cores a GPU has is irrelevant; RDNA 3 is about 30% faster per core than RDNA 2.

We all knew AMD wasn't going to be doing high-end GPUs for at least this generation, if not more. Equal to a 4080 or 5070 would put it 50% ahead of your 3080, not bad. It all depends on price: the 5070 is going to be £600 or £650 or £700, probably £650. If it's £550 it should do well; if it's £500 it will do really well.
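The per-core claim above can be sanity-checked against the public shader counts (the counts are from spec sheets, my addition, not from the thread):

```python
# Shader (core) counts from public spec sheets, not from the post.
cores = {"6800 XT": 4608, "6900 XT": 5120, "7800 XT": 3840}

# The 6800 XT has 20% more cores than the 7800 XT...
extra_6800 = cores["6800 XT"] / cores["7800 XT"] - 1   # 0.20
# ...and the 6900 XT has a third more:
extra_6900 = cores["6900 XT"] / cores["7800 XT"] - 1   # ~0.33

# If the 7800 XT matches the 6900 XT overall, each RDNA 3 core is
# doing roughly a third more work than an RDNA 2 core.
print(f"Implied per-core gain: ~{extra_6900:.0%}")
```

That works out nearer 33% than 30%, but it supports the post's point: core counts alone don't compare across architectures.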

At 7900 XT performance and better RT for £500, I think they'd sell like hot cakes in the current market. All depends if NV can bring out 4070 Ti performance for a similar price...

The fun thing is, with a few tweaks the 7800 XT is near enough on par with the 4070 Ti in raster.
 
In RT possibly, no way in raster from a small 220W chip.
You might be surprised. The 7800 XT with its 4 chiplets is near the same size as the RX 6700 XT (335mm² vs 346mm²) and 47% faster, and that's only from 6nm to 5nm; here we are going from 5nm to 3nm.

If the rumour had been that the 7800 XT was 3% larger than the 6700 XT and near 50% faster, you would have said the same thing, understandably so, but it is what it is, so it's very possible. :)

Also 225 watts vs 250 watts.
 
Is Humbug not saying that on paper the cores are 30% faster? I don't think he's saying that equates to 30% more fps in games. Perhaps a misunderstanding here.

What I said was this.

Meh.... the 6800 XT has 20% more cores and yet it's slower than the 7800 XT, and the 7800 XT is on par with the 6900 XT, which has 33% more cores. How many cores a GPU has is irrelevant; RDNA 3 is about 30% faster per core than RDNA 2.

I'm probably a bit off with that, but not far. It's not a good way to measure it, and had this been the point I was making, I might have gone the extra mile to find a better way. My point was simply what I said, in the context that was edited out and highlighted here in bold. People quoting me out of context, to start an argument I never made, is really tiresome, which is why I ignored it. :)
 
Boycott the game! :p

Seriously though, AMD really need to get the finger out. 90% Nvidia cards on the market ffs. AMD should be aiming for at least 30%.

And please don't tell me it is because people like me keep buying Nvidia cards. It is AMD's job to make cards and price them correctly to make me buy them.

I gave up on supporting the so-called underdog years ago, when they started falling behind yet still wanted to charge silly money like Nvidia.

I agree DLSS is better than FSR, for sure it is; it's just not enough for me to pay £600 for a 12GB GPU that is a bit less crap at RT than a 16GB GPU costing £480.
 
Still baffles me, this pricing thing people are so hung up about. I just don't get why certain people have a hard time paying the price for an AMD GPU, when Nvidia can charge top whack for gimped cards (£800/£900) or £2k for a halo card and people are like, yeah, take my money, I don't care. Surely the cost of building and developing a GPU is the same for Nvidia and AMD; it's not like it costs Nvidia £500 for RAM whereas AMD get the RAM for £50. So I fail to understand why people think AMD have to sell their cards cheaper.

People can happily spend £300 on a motherboard, £400 on a CPU, £1k on an OLED gaming monitor, but then whinge when they have to pay £600 for a GPU. I get that obviously the price has to be justified by performance. But AMD cards are not rubbish like some people want you to believe; they perform well, have solid software, FSR 3.1 and AFMF 2 do a good job, and hopefully FSR 4 will build upon that. The only weakness I see with AMD is RT. But that's just me; I personally don't think AMD are charging top dollar. I know a lot will disagree, which is fine. :)

Thing is, mostly they are cheaper...

7900 XTX: £780
4080 Super: £1000

7900 XT: £600
4070 Ti: £770

7900 GRE: £570 < overpriced, I agree
4070 Super: £570

7800 XT: £450
4070: £500

7700 XT: £360
4060 Ti 16GB: £420
4060 Ti: £320

The last two are difficult to place, given that the 4070 is only 10% faster than the 7700 XT while it is 20% faster than the 4060 Ti. Given that, and the 12GB on both the 4070 and 7700 XT, I would put the 7700 XT much more as a competitor to the 4070, in which case the 4070 is £140 more expensive.

They are cheaper, and yet Nvidia outsell AMD 9:1. The 7700 XT is near 30% cheaper than the 4070, the 7900 XTX 25% cheaper than the 4080 Super, the 7900 XT also 25% cheaper than the 4070 Ti, but it's not enough, nowhere near enough. See the difficulty AMD have in competing? People keep saying that to gain market share they need to be cheaper. How much cheaper? 30% isn't enough, so 50%? 60%? 70%? If it's costing you money for every GPU you sell, just stop, as it's suicide.
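Those percentage gaps can be checked against the prices listed above (a quick sketch; prices exactly as quoted in the post):

```python
# Price gaps computed from the GBP prices listed above, nothing else.
pairs = {
    "7900 XTX vs 4080 Super": (780, 1000),
    "7900 XT vs 4070 Ti":     (600, 770),
    "7900 GRE vs 4070 Super": (570, 570),
    "7800 XT vs 4070":        (450, 500),
    "7700 XT vs 4070":        (360, 500),
}

for name, (amd, nv) in pairs.items():
    print(f"{name}: AMD {1 - amd / nv:.0%} cheaper")
```

The exact figures land a little under the quoted 25% at the top of the stack (about 22%), but the shape of the argument holds.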
 
Well Turing came along and it shifted to "but ray tracing".

However, it's now been 6 years with not many great games to speak of, and people are questioning whether it's really feasible, especially at the lower end.

It isn't, and it never will be, as Nvidia use it as a marketing tool for £2,000 GPUs. The RT just keeps getting cranked up, putting it out of reach of mortals, because the more you have to pay to get RT that makes a visual difference at reasonable FPS, the more money Nvidia make.

It's why RT 4070 vs 7800 XT had no bearing on my decision. At levels where the 4070 is faster, it is also unusable, just a bit less crap than on the 7800 XT, and at levels that are usable they are actually very similar in performance.
 
Intel just launched a card that's $50 cheaper than Nvidia's and everyone is saying it's the best thing since sliced bread, and the card is sold out everywhere... but the Intel card is also 20% faster than the Nvidia one. That's where AMD got it wrong: Intel is getting praised because it's faster and cheaper, whereas AMD loses because it's only cheaper.


So yes, if the 8800 XT is $50 cheaper than the Nvidia equivalent and they both have identical performance, then sure, AMD will get hammered again; the 8800 XT needs to be faster than the Nvidia equivalent.

If only this were actually true; the B580 is little more than marginally faster than the 4060, and the same price.

The 7800 XT was a similar amount faster than the 4070 and $100 cheaper, and it got at best a lukewarm reception; half of them still hated on it. The difference in how they treat AMD vs Intel is night and day, it's blatantly hypocritical, and it is the reason there is no competition.

Right now AMD's GPUs are an average of 25% cheaper than Nvidia's, and they still aren't selling. Is that still a problem with AMD, or is there something else going on?

Not sure why the video reviewers are going wild for this - it's 5% better than a 4060 (and in the UK it's still the same price).

[Image: benchmark chart, B580 roughly 5% ahead of the 4060]

EDIT:
7% average in Techspot's review
[Image: Techspot average performance chart]


 
The point I was making was not whether upscaling tech is needed / wanted for the future, but that it is damaging to AMD that Nvidia's DLSS is 'widely known' to be vastly superior to AMD's own FSR. This might be having a lingering impact on how people assume quality varies between the brands.

If you’re buying a mid range card, I would assume these things might be important if you want to play next gen games - I imagine most people use upscaling for things like Cyberpunk with a midrange card - but I cannot say myself.

But, maybe that knowledge is known amongst tech enthusiasts only. I sort of assume that most PC gamers are tech enthusiasts, to some degree!

It doesn't justify a significant price difference between the two, though. The danger was always overhyping a feature to the point where anything without that branding is seen as entirely uncompetitive, and with that, prices running out of control. DLSS "adding 30% value" (an actual Hardware Unboxed quote) is exactly how Nvidia see this branding; tech journo marketing made that happen.

What boggles my mind is that no one calls them out on it... Do we like overpriced GPUs?
 
I suppose that some consumers, like me, are in the mindset of either: “pay for the best and enjoy it” or “compromising on price by paying more is generally better than compromising on performance / quality.” I’m inevitably going to get a 5090 - my way of ‘budgeting’ for - or justifying - the purchase is to try not to buy every gen. OK, I did buy several 4090s but the coil whine was total **** so they all went back… my approach to not buying counts! :o :p

I’m not entirely sure how this scales to the mid-range cards, but maybe the same applies, to some people. In which case, traditional ideals of ‘value’ get shifted around a bit. However, I don’t deny that some mid-range Nvidia cards seem poor value… I don’t think I’d buy them!

I get the 4090, and I don't begrudge people buying it; it's the best you can get. You have the money for it, why not? I get it.

Recent GPU history.

GTX 970
GTX 1070 < great GPU
RTX 2070S

Skipped the 3070

The 4070, like the 3070, was overpriced garbage. The RT is next to useless, just as it is on the RX 7800 XT; the one thing it has going for it is DLSS, but it's really not a good GPU, and it's priced for its DLSS / Nvidia branding. The 7800 XT is much better as a GPU, and it was a whole lot cheaper. So I bought it, despite most tech journos telling me I should spend more money on the 4070.

That's where we are now, and years of this nonsense is why there isn't a market for anything other than Nvidia.

Six months of ownership with this AMD GPU now; I love it, it's a fantastic thing, and solid.
 
I don't think it is in dispute that FSR is inferior to DLSS and that RT on Nvidia is better. What I wonder is how many people buy an inferior (for everything else) Nvidia card for more money when they most likely don't use these features, i.e. in each tier AMD are faster for less money with more VRAM, 4090 aside. AMD as a standard card are better and offer more. It's only if you want upscaling and a chance to use RT that it makes sense to pay the Nvidia tax.

Because people are being told DLSS is everything, anything else is worthless.
 