
*** The AMD RDNA 4 Rumour Mill ***

The issue now, with the PS5 Pro going for more RT effects, is that we might end up with some games having their RT optimised for the PS5 Pro and things going the other way. It would be funny if that happens, and I expect people will be moaning about AMD sabotaging Nvidia, like what happened with Starfield.

People said the same with the base PS5 and games being optimised for rasterisation.

I do think RT looks nice; however, I don't actively play any games where it makes a massive difference to the visuals.

RT will look good if the game is made with it from the ground up and actually has gameplay scenes which take advantage of it.

I don't know if it has it, but RE Village would be one of the best games for RT imo, and it would benefit way more than Cyberpunk, since a lot of the Cyberpunk RT examples we get force-fed here are puddles and rain, which granted will look nice but just don't add enough. Atmosphere, which RE Village is full of, would be great.

If I'm honest, horror games are probably where RT works best, not action games or open world games.


RT looks great in open world games when you're using full RT global illumination. The other RT effects, I agree, are not so much worth it, but RT GI makes a big difference imo.
 
People said the same with the base PS5: it would be funny if rasterisation games got optimised for the PS5 and Nvidia users started moaning about it.



RT looks great in open world games when you're using full RT global illumination

Well, they did moan about Starfield for months, with thousands of disgruntled comments on Reddit and forums. The same with one of the Resident Evil games and huge games like COD, which still works better overall on AMD cards. COD has a huge playerbase on consoles. With Starfield people were moaning: why optimise for AMD at launch when Nvidia has more market share, etc. I remember back 20 years when HL2 was optimised for ATI cards, despite most games being optimised for Nvidia cards even back then, and the utter moaning on forums.

You also had people on here moaning that AMD-sponsored games didn't have enough shiny reflections (because their RT implementation has issues with concurrent RT and rasterised operations). I know AMD fans moan about Nvidia optimisations, but the HL2 outrage lasted for years. Valve was apparently accused of trying to sabotage Nvidia on purpose.

So I expect that if even one AAA game (say GTA 6) does reasonably well on RDNA 4 because of the PS5 Pro, we will have another Starfield or HL2-level outrage.
 
then they should ideally be giving us 7900XTX performance for 7700XT price.

The poor expectations I have make me think we'll get 7900XT performance at a 7800XT price, if we're even lucky enough to get that. But even that would at least be appreciated, compared to what we've had recently. Heck, at this point I'd even take a £300 card performing like a 7800XT if that's the best they give us.

But then Nvidia would drop the prices of their similarly performing cards in response.

When faced with similarly performing products, the consumer will probably go for the Nvidia option.

It’s a bit of a no-win situation for AMD.
 
But then Nvidia would drop the prices of their similarly performing cards in response.

When faced with similarly performing products, the consumer will probably go for the Nvidia option.

It’s a bit of a no-win situation for AMD.

Except Nvidia hasn't though. You can get an RX6750XT/RX6800 for almost RTX4060 money. The RX7700XT and RX7800XT are RTX4060TI 8GB money.

Nvidia, like Apple, doesn't really care anymore, because people will throw money at them.
 
Well, they did moan about Starfield for months, with thousands of disgruntled comments on Reddit and forums. The same with one of the Resident Evil games and huge games like COD, which still works better overall on AMD cards. COD has a huge playerbase on consoles. With Starfield people were moaning: why optimise for AMD at launch when Nvidia has more market share, etc. I remember back 20 years when HL2 was optimised for ATI cards, despite most games being optimised for Nvidia cards even back then, and the utter moaning on forums.

You also had people on here moaning that AMD-sponsored games didn't have enough shiny reflections (because their RT implementation has issues with concurrent RT and rasterised operations). I know AMD fans moan about Nvidia optimisations, but the HL2 outrage lasted for years. Valve was apparently accused of trying to sabotage Nvidia on purpose.

So I expect that if even one AAA game (say GTA 6) does reasonably well on RDNA 4 because of the PS5 Pro, we will have another Starfield or HL2-level outrage.


I don't know much about how GPUs or games worked around HL2's time, but I remember The Witcher was "optimized" for Nvidia, causing outrage among AMD users. What made this situation possible was that Nvidia and AMD GPUs handled tessellation slightly differently, allowing one GPU to beat the other, and the developer then abused that to "optimise" for Nvidia. With tessellation there was no visual difference between the low and ultra settings.

I'm unsure how likely that is to happen with RT, no matter how much games are "optimized" for the PS5 Pro, because Nvidia and AMD GPUs process RT similarly; the main difference is that Nvidia GPUs have more dedicated silicon for RT processing, but the processing itself is much the same. All developers can do is create higher or lower quality RT, and the differences are visible. If you create higher quality RT, the RT effects take up more of the frame time, which leads to Nvidia leading in fps because it has more dedicated RT silicon; if you create lower quality RT, rasterisation takes up more of the frame time, which can give AMD comparatively higher fps, as you saw in other games.
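To put rough numbers on that frame-time argument, here's a toy sketch (every throughput and workload figure below is invented purely for illustration, not a measured benchmark): one hypothetical card is a bit stronger at raster, the other at RT, and which one ends up with the higher fps flips depending on how much of the frame the RT work occupies.

```python
# Toy frame-time model: total frame time = raster work + RT work.
# All speed/workload numbers here are made up for illustration only.

def fps(raster_ms: float, rt_ms: float, raster_speed: float, rt_speed: float) -> float:
    """FPS for a given per-frame workload on a card with relative raster/RT throughput."""
    frame_ms = raster_ms / raster_speed + rt_ms / rt_speed
    return 1000.0 / frame_ms

# Hypothetical card A: slightly stronger raster, weaker RT. Card B: the reverse.
for rt_share in (0.2, 0.5, 0.8):              # fraction of a 20 ms frame spent on RT
    raster_ms, rt_ms = 20 * (1 - rt_share), 20 * rt_share
    a = fps(raster_ms, rt_ms, raster_speed=1.10, rt_speed=0.80)
    b = fps(raster_ms, rt_ms, raster_speed=1.00, rt_speed=1.20)
    print(f"RT share {rt_share:.0%}: card A {a:.0f} fps, card B {b:.0f} fps")
```

With a light RT load the two toy cards land within a frame or two of each other, but as the RT share of the frame grows the card with more RT throughput pulls further ahead, which is the point being made above.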
 
RT will look good if the game is made with it from the ground up and actually has gameplay scenes which take advantage of it.

The issue with this is that we have had over six years where the hardware has been capable of doing so, and remaking/remastering titles is not showcasing it. However, there has to be some reason why game developers are not doing this (and also why some games have no RT implementation at all). Is it because most of the target hardware is not able to handle it? Is it because the devs are not skilled enough to do it in a timely manner? Or is it because it was pushed out by Nvidia before it was ready, and the past three generations were marketing spin until Blackwell or beyond delivers it properly?
 
RT looks great in open world games when you're using full RT global illumination. The other RT effects, I agree, are not so much worth it, but RT GI makes a big difference imo.

I think you and I might have different visions of open world games. For me, open world means the likes of GTA 5 and Elder Scrolls, and especially in Elder Scrolls it's the open vastness of space that you can see into the far-flung distance. RT looks great if there are reflections or shadows to be cast in a scene, so enclosed areas shine in this respect, so I'm unsure which open world games you have in mind.
 
I don't know much about how GPUs or games worked around HL2's time, but I remember The Witcher was "optimized" for Nvidia, causing outrage among AMD users. What made this situation possible was that Nvidia and AMD GPUs handled tessellation slightly differently, allowing one GPU to beat the other, and the developer then abused that to "optimise" for Nvidia. With tessellation there was no visual difference between the low and ultra settings.

I'm unsure how likely that is to happen with RT, no matter how much games are "optimized" for the PS5 Pro, because Nvidia and AMD GPUs process RT similarly; the main difference is that Nvidia GPUs have more dedicated silicon for RT processing, but the processing itself is much the same. All developers can do is create higher or lower quality RT, and the differences are visible.
The HL2 outrage lasted years. It was literally one game, and most games were still optimised for Nvidia back then.

Also, W3 had a bigger outrage from Nvidia users! Lots of Kepler and Fermi users had very poor performance (they were the majority of Nvidia users at the time). IIRC, a GTX 960 was better in some cases than a GTX 780 Ti!

AMD users could adjust tessellation manually via drivers, but Nvidia users couldn't. An example of the sorts of threads back then:

CDPR got so much bad press they had to put in a manual tessellation slider in a later update.
 
But then Nvidia would drop the prices of their similarly performing cards in response.

When faced with similarly performing products, the consumer will probably go for the Nvidia option.

It’s a bit of a no-win situation for AMD.

It is. The truth is the vast majority only want AMD to be competitive so they can get their Nvidia cards for less money; they literally blame AMD for Nvidia being too expensive, and even a lot of the mainstream tech tubers do this.

These smooth brains are so smooth they don't understand they are the reason GPU pricing is how it is. If one is stupid, does one possess the necessary intelligence to realise one is stupid? Critical thinking is a big part of intelligence.

If no one buys AMD cards, they are irrelevant to Nvidia.
 
The HL2 outrage lasted years. It was literally one game, and most games were still optimised for Nvidia back then.

Also, W3 had a bigger outrage from Nvidia users! Lots of Kepler and Fermi users had very poor performance (they were the majority of Nvidia users at the time). IIRC, a GTX 960 was better in some cases than a GTX 780 Ti!

AMD users could adjust tessellation manually via drivers, but Nvidia users couldn't. An example of the sorts of threads back then:

CDPR got so much bad press they had to put in a manual tessellation slider in a later update.

Didn't Geralt's hair have 64 triangles per pixel? Which is insane; it's why AMD's next driver after it launched automatically culled it to 4 triangles. That's all you need for a single pixel, you cannot see anything above that.
 
Didn't Geralt's hair have 64 triangles per pixel? Which is insane; it's why AMD's next driver after it launched automatically culled it to 4 triangles. That's all you need for a single pixel, you cannot see anything above that.
I remember sitting and trying to play Witcher 3 on my GTX 780 and being absolutely furious that I didn't have tessellation controls like the AMD GPUs. Performance was hit HARD.
 
The latter is what AMD would do, if they really did care about regaining market share: just price in a way that brings today's top-end performance to mainstream price points. Performance-wise, we don't know what AMD's next gen will top out at. But if it's similar to a previous time they didn't have a high end (RX 400), then they should ideally be giving us 7900XTX performance for 7700XT price.

This is what I'm hoping for.

It is. The truth is the vast majority only want AMD to be competitive so they can get their Nvidia cards for less money; they literally blame AMD for Nvidia being too expensive, and even a lot of the mainstream tech tubers do this.

These smooth brains are so smooth they don't understand they are the reason GPU pricing is how it is. If one is stupid, does one possess the necessary intelligence to realise one is stupid? Critical thinking is a big part of intelligence.

If no one buys AMD cards, they are irrelevant to Nvidia.

Which sort of ties into this. If I can upgrade a reasonable amount for sensible price-to-performance, then I'll go AMD to stick it to the Nvidia cartel.

I've no problem with the 5090 being silly money, because I'm not in the market for it, similar to how I was never in the market for a Titan X and saw the 1080 Ti as the halo consumer-grade card. It feels like the naming scheme has shifted, which is fine, but the pricing has as well.

As said above, if AMD can nail this then I'm in. If not then I'll wait it out.
 
I just buy whatever genuinely seems best for the money. I am not a fan of relying on upscaling tech for performance because, as nice as it can be (especially DLSS), it can still be buggy, and frame generation in particular doesn't actually aid the gameplay experience even if the frame rate is higher.

Ray tracing is nice but hardly a killer feature yet, so I went with a 7900XTX, which I got for a smidge over £800 in August 2023; for that time it was a steal, as the 4080 was still mostly over £1,000.
 
But then Nvidia would drop the prices of their similarly performing cards in response.

When faced with similarly performing products, the consumer will probably go for the Nvidia option.

It’s a bit of a no-win situation for AMD.
Yeah, I am torn over what to do in a few years. Currently rocking a Sapphire Nitro 7900 GRE; however, I may consider a 5070 Ti Super in a year or so.

I am very happy with AMD's current offering but would like to jump on better RT at some point. I have been happy to support them for the last five years as they have offered me a very good product, but I suspect the gap will grow next gen with no high tier. When are we looking at RDNA 5 (or UDNA as it will be)?

Honestly, one of the biggest reasons I might not switch is Sapphire; I really struggle not to want a Nitro. NVIDIA really don't seem to have very nice-looking cards :-S
 
So from HU's perspective, the 8GB cards should be no more than £200:


It has to be $200. Yeah, that's right. I think next generation 8 gigabyte graphics cards, $200 should be the maximum asking price.

Just need to wait and see what the vendors ask for now in January!
 
But then nvidia would then drop prices of their tier of similar performing cards in response.

Sure, that would be the logical thing to do, but as others have already said, Nvidia hasn't been reducing prices to match AMD recently. Actually, they've barely reduced prices at all in general.
I've been following GPU prices for the past few months, and admittedly AMD has reduced their prices by quite a lot; I didn't think we'd be finding 7900XTXs for ~£750, but here we are.
Meanwhile 4080s are still £1k+, even with the 4080S available at the same price or a smidge cheaper. And look at the rumoured 5000 series prices: the same if not worse.

If Nvidia did drop prices, that wouldn't be a bad thing for the folks buying the GPUs. But atm Nvidia has grown complacent and keeps pricing high, which is AMD's chance, should they wish to take it.

When faced with similarly performing products, the consumer will probably go for the Nvidia option.

Performance depends on the application and settings, but in general the average GPU buyer is still buying Nvidia with far worse performance anyway.
Might not make logical sense, but it's that mindshare thing.

Personally, I just look at what performs best for the settings/games I play at the budget I set and buy that.
Funnily enough, the last time I bought a new GPU I waited for the AMD release, saw that Nvidia's option was a smidge better in specific titles, and went with that.
But that was a decade ago, back when there were a ton of great options for mainstream budgets.
 
I agree. I probably should have thought about it for a moment before posting.

Nvidia won’t drop prices unless their market share in that tier space is under threat. Given their current dominance and folks buying Nvidia regardless, you’re all right. No reason for Nvidia to do anything.

There is a real opportunity for AMD to come in at the £500 mark and offer 7900XTX-level performance, with 30% better ray tracing performance and lower power consumption. They can take my money if they do.

Such is the GPU space at the moment, they’ll probably price it the same as the Nvidia equivalent-performing cards and then high-five each other in AMD’s GPU division head office.
 
I agree. I probably should have thought about it for a moment before posting.

Nvidia won’t drop prices unless their market share in that tier space is under threat. Given their current dominance and folks buying Nvidia regardless, you’re all right. No reason for Nvidia to do anything.

There is a real opportunity for AMD to come in at the £500 mark and offer 7900XTX-level performance, with 30% better ray tracing performance and lower power consumption. They can take my money if they do.

Such is the GPU space at the moment, they’ll probably price it the same as the Nvidia equivalent-performing cards and then high-five each other in AMD’s GPU division head office.

I agree with most of what you wrote. The only slight thing is the 500 quid. Now, my viewpoint on that is due to what I already have in the machine and what I paid for it, so don't take it as a statement that fits all users. However, with a 6950XT in the system that I paid around 600 quid for 1½ years ago, I'm not sure I see the value in a 35% performance bump for 500 quid, and that is only if the 8800XT can offer XTX performance, which I doubt it can in non-RT workloads. I think, based on rumours (salt time), that a more realistic raster performance target would be just slightly ahead of or equal to the XT, which would make it roughly 20% faster than a 6950XT, and then I really do not see the value, again only because I already have a 6950XT. So my conclusion would be: yes, it could end up being a good GPU, even at 7900XT performance levels, if it's less than 500 and you're stuck on mid-tier RDNA 2 or a 3000 series or older. But for anyone above that, so 3080/6800XT or above, it's 80% a pass in my mind at this time. Maybe some details will change my mind later on. BUT if it manages a 400 quid price tag, then I'm all over it. Keep dreaming, I know :P
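Just to make that value argument explicit, here's a quick cost-per-uplift sketch using the percentages guessed in the post above (35% for XTX-level raster, 20% for XT-level); these are rumour-mill assumptions, not measured figures.

```python
# Rough "quid per percent of uplift" check over the 6950XT already in the machine.
# Uplift percentages are the guesses from the post above, not benchmarks.
def quid_per_percent(price_quid: float, uplift_percent: float) -> float:
    """Cost of each 1% of extra performance for a given upgrade price."""
    return price_quid / uplift_percent

print(f"{quid_per_percent(500, 35):.1f}")  # ~14.3 quid per 1% if it really hits XTX raster
print(f"{quid_per_percent(500, 20):.1f}")  # 25.0 quid per 1% at 7900XT-level raster
print(f"{quid_per_percent(400, 20):.1f}")  # 20.0 quid per 1% at the hoped-for 400 quid price
```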
 