
*** The AMD RDNA 4 Rumour Mill ***

We all know there will be RT improvements. That's what they are specifically working on and focusing on. What percentage that works out to be, we'll have to wait and see. I'm all for a great mid-range GPU, so I'm pretty excited to see what comes out.

In Gaming Graphics, revenue declined year-over-year as we prepare for a transition to our next-gen Radeon GPUs based on our RDNA 4 architecture. In addition to a strong increase in gaming performance, RDNA 4 delivers significantly higher ray tracing performance and adds new AI capabilities.

We are on track to launch the first RDNA 4 GPUs in early 2025.

Lisa Su – AMD Q3 2024 Earnings Call
 
Those relative RT performance charts are very misleading. In any game with even moderate levels of RT, AMD cards get hit harder than a ginger-haired stepchild.

They do OK, and to be fair even the mid-range Nvidia cards suffer. But still, AMD need to see at least a 20% relative improvement in RT performance for the 8800 XT. So if it matches a 7900 XTX in raster, it needs to be at least 20%, and closer to 30%, faster in RT.

Agreed, and looking through the results at 4K, the 12 GB cards don't work at all with RT in some games, which explains the large discrepancy between 1440p and 4K for the 7800 XT / 7900 GRE versus the 70-class cards.

But I still think that at 1440p the 7800 XT / 7900 GRE are very usable RT cards. I do push back against the idea among some (not necessarily anyone here) that these cards are not proper RT cards; in fact I would go even further than that. The slides below are the two extremes: the 4070 at 37 FPS vs the 7900 GRE at 30 FPS puts the 4070 23% ahead, which is impressive, but no one in their right mind is going to argue that either of these cards is usable in Cyberpunk at these settings. While I can't find the video now, one of these YouTubers took the very brave step of seeing how much difference there actually is between these GPUs when you tune the game to run at 60 FPS on one of them and then transfer those settings to the other. The answer is that at the same settings, in that 60 FPS scenario, there is very little if any difference between them: they both run at 60 FPS.

So while yes, the 4070 is technically better, it's not usably better.

[Chart: rt-cyberpunk-2077-2560-1440.png]
[Chart: rt-far-cry-6-2560-1440.png]
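
To put rough numbers on that point, here is a minimal sketch of the relative-vs-absolute argument. The 37 and 30 FPS figures are the ones quoted from the charts above; the 60 FPS target and the scaling maths are purely for illustration.

```python
# Quick sketch of the relative-vs-absolute FPS argument.
# FPS figures are the ones quoted above (Cyberpunk 2077, 1440p, RT charts).
rtx_4070_fps = 37
rx_7900_gre_fps = 30

lead_pct = (rtx_4070_fps / rx_7900_gre_fps - 1) * 100
print(f"RTX 4070 relative lead: {lead_pct:.0f}%")  # ~23% ahead on the chart

# Neither card clears a 60 FPS target at these settings, so in practice you
# lower settings on both until they do, and the headline gap largely vanishes.
target_fps = 60
for name, fps in (("RTX 4070", rtx_4070_fps), ("RX 7900 GRE", rx_7900_gre_fps)):
    print(f"{name}: {fps} FPS, needs ~{target_fps / fps:.2f}x more performance "
          f"(or lower settings) to hit {target_fps} FPS")
```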
 
Those relative RT performance charts are very misleading. In any game with even moderate levels of RT, AMD cards get hit harder than a ginger-haired stepchild.

They do OK, and to be fair even the mid-range Nvidia cards suffer. But still, AMD need to see at least a 20% relative improvement in RT performance for the 8800 XT. So if it matches a 7900 XTX in raster, it needs to be at least 20%, and closer to 30%, faster in RT.

If the so far unsubstantiated rumours of RDNA 4 having a 2x leap in RT performance turn out to be true, it would be interesting to see where an 8800 XT with double the RT lands. Somewhere between a 4070 Ti and a 4080 for £600?

no one in their right mind is going to argue that either of these cards is usable in Cyberpunk at these settings

This has always been my outlook on it: the lower down the stack you go, the less impressive, or even viable, RT is, so that cohort just won't be able to benefit. So far it's been the higher end of each generation that samples it to see how far it's come.
 
This has always been my outlook on it: the lower down the stack you go, the less impressive, or even viable, RT is, so that cohort just won't be able to benefit. So far it's been the higher end of each generation that samples it to see how far it's come.

You're right: if it's 4080 vs 7900 XTX with RT, the 4080 is just better, no argument.

Lower down the stack, Nvidia vs AMD is much more grey. If you play Cyberpunk you're not doing it at 37 FPS, you're going to tune the game to run at at least 60, and if you do that then actually the 7900 GRE is just as good. Why? Because the Nvidia cards are only better when you literally choke them to death with over-the-top RT settings; if you don't do that, RDNA 3 doesn't suffer nearly as much.

Recently people talk a lot about low-resolution CPU testing for games, complaining that it's not realistic. Is blasting your GPU with RT to the point of making it unplayable realistic? It seems to me that's the only way Nvidia win that argument, which is perhaps why every tech tuber does it like that. Why is no one pushing back against it? The RT charts would look very different if they used settings people would actually run the game at.
 
If the so far unsubstantiated rumours of RDNA 4 having a 2x leap in RT performance turn out to be true, it would be interesting to see where an 8800 XT with double the RT lands. Somewhere between a 4070 Ti and a 4080 for £600?

Honestly, in the fictional world where the 8800 XT is £600 for that performance, I'd be sold. I'd expect the rasterisation performance to be closer to the 7900 XTX / 4080, but if they can get RT performance close to current 4080 levels I can see it being a no-brainer for the mid tier, as I currently expect the 5070 cards to be pushing £800-900 next generation.
 
If it's up against the 5070, which we should expect to be close to or matching 4080 performance...

It will be interesting to see what AMD considers mid-tier with RDNA 4.

Last time they dropped out at the high end they competed against the 2070 / 2070 Super with the 5700 XT.
 
I do hope AMD don't mess about and just push them out at $500 for initial reviews, and get them out before Nvidia. Aside from the good deal, I'm interested to see how Nvidia would react: they could easily spoil AMD's party by slapping them in the face with a $500 to $550 RTX 5070, or stay on course for gradually ramping the price of GPUs up and push them out at $650. I would be interested to see which direction Nvidia take.
 
I do hope AMD don't mess about and just push them out at $500 for initial reviews, and get them out before Nvidia. Aside from the good deal, I'm interested to see how Nvidia would react: they could easily spoil AMD's party by slapping them in the face with a $500 to $550 RTX 5070, or stay on course for gradually ramping the price of GPUs up and push them out at $650. I would be interested to see which direction Nvidia take.
I'm curious about something and I think you might be able to answer it. How does a Ryzen 7900 compare in chip size with AMD GPUs?
 
I'm curious about something and I think you might be able to answer it. How does a Ryzen 7900 compare in chip size with AMD GPUs?

That's a complicated question. The dual-CCD Ryzen Zen 4 parts use 5nm for the two logic dies and 6nm for the IO die. The two logic dies are about 160 mm² combined, roughly 80 mm² each, with the IO die being about 120 mm².

RDNA 3 is also 5nm for the logic die and 6nm for the memory dies. Navi 31 (7900 series) has a 300 mm² logic die and six 37 mm² memory dies, about 220 mm² combined for the memory dies.

Navi 32 (7800 / 7700 series) has a 200 mm² logic die and four memory dies.

So....
Ryzen 7950X: 160 mm² of 5nm dies + 120 mm² 6nm die.
7800 XT: 200 mm² 5nm die + 150 mm² of 6nm dies.
7900 XTX: 300 mm² 5nm die + 220 mm² of 6nm dies.
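
For anyone who wants to total those up, here is a tiny sketch using the approximate die areas from the post above; the exact figures vary a little between sources, so treat them as ballpark numbers only.

```python
# Approximate die areas in mm^2, as listed above (ballpark figures only).
chips = {
    "Ryzen 7950X":           {"5nm": [80, 80], "6nm": [120]},     # 2x CCD + IO die
    "RX 7800 XT (Navi 32)":  {"5nm": [200],    "6nm": [37] * 4},  # GCD + 4 MCDs
    "RX 7900 XTX (Navi 31)": {"5nm": [300],    "6nm": [37] * 6},  # GCD + 6 MCDs
}

for name, dies in chips.items():
    n5, n6 = sum(dies["5nm"]), sum(dies["6nm"])
    print(f"{name}: {n5} mm^2 of 5nm + {n6} mm^2 of 6nm = {n5 + n6} mm^2 total")
```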
 
Surely the reason AMD don't want out of the discrete GPU market is that it would hamper their APUs which might in turn put their console stuff in danger?

It's a tricky situation really. You can't blame Nvidia for the way they're doing things, dominating the market and causing AMD issues; however, it's hard to know what AMD can do to get themselves out of it. Making their cards cheaper will only help so much when general consumers know the Nvidia brand better. And, as with the example a few posts back of someone's friend coming from an AMD card and looking to upgrade to another AMD card, a lot of people coming from an Nvidia card will likely stick with what they know if they didn't have any issues. And who can blame them really? I'm sure a lot of us do the same with other products, where we'd stick with a brand we know rather than some Chinese-sounding brand we're unfamiliar with (which is what AMD might be for a lot of people).

We've a similar issue with CPUs at the minute, where AMD keep knocking it out of the park and Intel keep getting swings and misses. It makes it really awkward for Intel, but you can't blame AMD, and I doubt Intel are doing this on purpose. Intel are probably doing better than they should be due to their brand recognition at the minute, but that will only last so long.

I agree with what's been said regarding RT and playable FPS etc. I will say that the 4060 Ti could just as easily be compared to the 7700 XT as the 7800 XT though, and it does better than the 7700 XT in the graphs above, so solely on that metric it might not be priced too badly (not a fan of the 4060 Ti 16GB pricing in general though).
As you drop down, though, the 4060 does better than the similarly priced 7600, and the 4060 Ti 8GB is better again, but I don't see the 7600 XT on the graphs to compare against. None of these cards are probably usable for much RT-based stuff realistically, though.
 
5nm / 6nm (5 and 6 nanometre) refers to the lithography node the die (another word for, basically, the chip) is printed on, i.e. how the actual chip is made. The smaller the number, the smaller and better / more recent the process; they are all made at TSMC.

The first image is Navi 31, the 7900 series RDNA 3 GPU.

The second image is a Ryzen 7900 series CPU with the heat spreader removed.

[Image: 0GjSuei.jpeg]

[Image: hbOMIdF.jpeg]
 
If it's up against the 5070, which we should expect to be close to or matching 4080 performance...

It will be interesting to see what AMD considers mid-tier with RDNA 4.

Last time they dropped out at the high end they competed against the 2070 / 2070 Super with the 5700 XT.

It’s more about what price AMD chooses.
 
It’s more about what price AMD chooses.

Of course it's always about the pricing, but as we know AMD haven't been aggressive enough in the past, or released in a way that entices people over. Now they supposedly want more market share in the mid tier, which should hopefully mean aggressive pricing. One can hope.

I really want an excuse to move on from the 3080 while keeping the budget between £500-£600.
 
Agreed, so many people on RDNA 2 and the 3000 series could use an upgrade. I have a 4080, and with upscaling it is very viable as a 4K GPU. If AMD or Nvidia release an equivalent GPU in the £500 - £600 price range, it will be a great upgrade.
 
I do hope AMD don't mess about and just push them out at $500 for initial reviews, and get them out before Nvidia. Aside from the good deal, I'm interested to see how Nvidia would react: they could easily spoil AMD's party by slapping them in the face with a $500 to $550 RTX 5070, or stay on course for gradually ramping the price of GPUs up and push them out at $650. I would be interested to see which direction Nvidia take.

I think the other battle they could win is actually giving a reasonable amount of VRAM. With the rumoured 3GB chips and GDDR7 they could actually up their game on this.
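
As a rough illustration of why 3GB modules matter: each GDDR package sits on a 32-bit channel, so capacity is just the number of channels times the module size. The bus widths below are assumptions for the sake of example, not confirmed RDNA 4 specs.

```python
# Back-of-the-envelope VRAM maths: one GDDR6/GDDR7 package per 32-bit channel,
# so capacity = (bus width / 32) * module size. Bus widths here are assumed
# purely for illustration, not confirmed RDNA 4 specifications.
def vram_gb(bus_width_bits: int, module_gb: int) -> int:
    return (bus_width_bits // 32) * module_gb

for bus in (192, 256):
    for module in (2, 3):  # today's 2 GB modules vs the rumoured 3 GB GDDR7 modules
        print(f"{bus}-bit bus, {module} GB modules -> {vram_gb(bus, module)} GB")
```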
 
5nm / 6nm (5 and 6 nanometre) refers to the lithography node the die (another word for, basically, the chip) is printed on, i.e. how the actual chip is made. The smaller the number, the smaller and better / more recent the process; they are all made at TSMC.

The first image is Navi 31, the 7900 series RDNA 3 GPU.

The second image is a Ryzen 7900 series CPU with the heat spreader removed.

[Image: 0GjSuei.jpeg]

[Image: hbOMIdF.jpeg]
Thank you Humbug.
So, what I'm thinking is: should AMD stop making big chips and focus on something the size of a Ryzen 9700?

Or, if slightly older nodes are price competitive, should AMD try to squeeze the best possible from nodes one generation behind and openly offer players a trade-off between price and efficiency?
Basically declaring the best node available is for CPU and Instinct while GPU stays on the previous CPU node/cheaper alternative (Samsung?).

This way volume woes should be significantly lessened and price could be competitive.
There are also some R&D advantages: you set the microarchitecture with the iGPU/APU and refine it on the same node when you switch it over to dGPUs, basically pulling a tick/tock strategy in synergy with fab node refinements.

Intel tried this and failed but AMD is much better positioned to take advantage of a full stack strategy.
 
Thank you Humbug.
So, what I'm thinking is: should AMD stop making big chips and focus on something the size of a Ryzen 9700?

Or, if slightly older nodes are price competitive, should AMD try to squeeze the best possible from nodes one generation behind and openly offer players a trade-off between price and efficiency?
Basically declaring the best node available is for CPU and Instinct while GPU stays on the previous CPU node/cheaper alternative (Samsung?).

This way volume woes should be significantly lessened and price could be competitive.
There are also some R&D advantages: you set the microarchitecture with the iGPU/APU and refine it on the same node when you switch it over to dGPUs, basically pulling a tick/tock strategy in synergy with fab node refinements.

Intel tried this and failed but AMD is much better positioned to take advantage of a full stack strategy.

The strategy is to go back to making workstation GPUs and gaming GPUs from the same silicon, so if they don't sell them as gaming GPUs they stand a better chance of moving them as workstation GPUs. I don't know if this is true for the upcoming RDNA 4, probably not, but possibly RDNA 5.

AMD wouldn't go Samsung; they are too far behind, and AMD have a good relationship with TSMC that they would want to maintain. The best and most expensive node right now is TSMC N3, a 3nm node; it's what Apple are currently using, and Intel for their 200 series CPUs and APUs.
I think AMD are very likely to use TSMC N4, 4nm, for RDNA 4. They use that for their Ryzen 9000 series, it's a very solid node, and as shown with Ryzen 9000 AMD are able to design for very good efficiency with it.
They may also go back to it being monolithic. As you can see from my pictures and explanation, the current Radeon 7000 series (RDNA 3) is a multi-chip design, a very successful design and the first MCM GPU, but I think AMD want to go back to basics with RDNA 4. I think that's a shame: AMD are at their best when it comes to inventing innovative architectural designs and I would have liked to have seen the next evolution of those designs, but I think they don't want to spend the R&D on it anymore, they can't justify it. AMD are such a talented company in this way, with many world firsts to their name; it's a shame they don't have Nvidia money to play with.

The new GPUs will be smaller; they are not making high-end GPUs anymore. The highest-end RDNA 4 will be the RX 8800 XT, equivalent in performance to an RX 7900 XTX with much better ray tracing and about the size of an RX 7800 XT, around 250 - 300 mm². That's a bit larger than a Ryzen 9700, by about 50 to 100 mm², but I don't see how you can get it any smaller with that level of performance, and 300 mm² or less is quite small for a GPU.

You're right that AMD will continue to design GPU technology, because their APUs are very successful.

The problem though is this: an RX 7800 XT chip, just the chip, costs about the same to make as a Ryzen 7950X. They currently sell that CPU for £500, a bargain... it's £100 cheaper than a Core Ultra 285K and better.
So take another look at the images I posted. What you see for the 7950X is pretty much how it sells; the only thing missing is the lid, the heat spreader, $5. They sell that into a supply chain who take their share, who sell it to OCUK who take their share, so AMD probably sell this £500 CPU for £350 to £400. It costs less than £100 to make it retail ready, so that's £200 to £250 profit. While that seems like a lot, remember that they spent, I would imagine, several hundred million dollars designing it.
The same R&D cost applies to the GPU. AMD sell that to their partners, like Sapphire, who design and manufacture their own PCBs and coolers for it. CPUs don't need to be shipped with coolers and they don't have PCBs, as such. So to sell what is now a £420 GPU, they need to leave enough money for Sapphire to make a profit after designing and making the PCB and cooler.
Look at this thing: this is a 7800 XT Nitro PCB. I can't find the Pulse PCB, but they are made to the same standard. Believe it or not, there are thousands of individual components on this thing and some of them are quite expensive, costing multiple dollars individually.
Sapphire have to design all of this, make it, and then sell it into a supply chain. AMD are not selling these chips to Sapphire for £350 to £400; the thing costs £420 retail, so they are selling them at a little above cost. That's fine if you're selling 30 million of them, but they aren't, which is why AMD's profit from these in the last quarter was $12 million, or in other words nothing. Even if they sold 1.2 million units, that's about $10 profit on each one sold. Now ask yourself how many hundreds of millions AMD spent developing this thing. Ryzen is propping up Radeon in a very big way.


[Image: OKvXcud.jpeg]
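
To make the per-unit maths explicit, here is a very rough sketch. The $12 million figure is the quarterly profit mentioned above; the unit count is a loose assumption purely to illustrate the scale, not a reported number.

```python
# Very rough per-unit profit illustration using the figures discussed above.
# $12M is the quarterly profit figure mentioned in the post; the unit count
# is an assumption for illustration only, not a reported number.
segment_profit_usd = 12_000_000
assumed_units_sold = 1_200_000

per_unit = segment_profit_usd / assumed_units_sold
print(f"~${per_unit:.0f} profit per GPU sold")  # ~$10, i.e. effectively nothing
```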
 