
*** The AMD RDNA 4 Rumour Mill ***

Dude, you say this so often that it's time to stop deluding yourself - you're a serial changer just like most of us here ;)
You’re quite right. I don’t tend to keep a card longer than a year ;)

I’m secretly trying to convince the wife it's OK to spend £2k on a 5090 as it's my only hobby - I don't booze or smoke and don't spend much on clothes. Not sure I can manage to get her round that price though!
 
Somehow in your YT Cyberpunk video I can see a massive difference in path tracing vs even RT - amazing. Maybe I should play through the game again and try to take in the sights? ;)

But I only get 20fps in path tracing :(

Just use fsr and FMF.......

;) :p

What will be the real test to see if you truly notice the difference is to play with RT or PT for a week, then go back to raster and see if you miss it. Like I always say, for me it's like going back to 60Hz after experiencing 144/175Hz - something just doesn't feel/look right, and you start questioning why doesn't that have a shadow, why is this room illuminated despite no light source being around, why has this reflection disappeared, why is this shadow so sharp looking when it should be soft and blurred, etc. etc.
 
You can strike me off the naysayers because:

[reaction gif]
 
I am on a 4090 and I still use upscaling to drive my 240Hz OLED monitor because I cannot see the difference in quality in motion, and the motion clarity of 150-200 FPS on an OLED is insane. I actually subscribed to PureDark's Patreon for his DLSS mods for unsupported games.

I would rather take a slightly less sharp image from DLSS and layer RT on top than use native TAA. Most games' TAA is just horrible - just look at Forbidden West's TAA flickering on vegetation. Even DLSS at 58% render scale makes it look better.

The higher FPS is just a bonus. To me, losing the options and customisation of DLSS (which enables the use of DLDSR and RT) for the prospect of running textures a notch higher is not attractive enough.

DLSS is better than FSR, way better. I don't know how many times I have to say that before people get it into their heads that I'm not disputing it - I know it's true, I had an RTX card for more than 3 years...

My point is and always has been...

DLSS is a reason not to buy AMD; it is not a reason to jack the price up to silly levels.


I have joined the red team, purely for WoW and some Star Citizen now and then :)

Another one... :)

AMD won against Intel partly because of Intel's own missteps, which gave AMD time to get its act together. Nvidia is not Intel and won't give them that opening.

Rumours say the 5090 is 70% faster than the 4090, and with AMD languishing at 7900 XTX levels of performance, Nvidia will leave them in the dust before long.

True, Intel got complacent. Having said that, for about as long as these two have been competing, AMD have been making better CPUs than Intel for at least as long as it's been the other way round - AMD started making better CPUs in the late 1980s and kept it up right through the 1990s, and Intel actually use a lot of AMD's IP in their CPUs, and have done for decades.
There is no 64-bit x86 without AMD.

Yep, I'm curious whether they go the chiplet route with their GPUs too, but instead of having the dies spaced out like Ryzen, having them connected at the silicon level like Apple does with their top-end silicon.

This was meant to be RDNA 4, they shelved it, for now.

AMD are IMO the best semiconductor architects: RDNA 3 is architecturally more advanced than Ada, just as Zen is more advanced than what Intel have. What AMD are not is software engineers, never have been; Nvidia are by far the best software engineers.
You know what, if Nvidia and AMD got together to work on a GPU it would be ______ mind-blowing....

 
They could have been decent this gen if not for the greed.
 
I think the cards themselves are great. I don't think the price is the problem: the Sapphire Nitro 7900 XTX, when overclocked with a 15% increase to the power limit, narrows the gap to a 4090 to around 15%, and without an FPS counter you will usually never be able to tell the difference vs a stock RTX 4090.

AMD cards also have less CPU driver overhead than Nvidia cards. I upgraded from an RTX 2080 Ti to a 3080 Ti to a 4090, which saw me upgrading my 9900K to a 12700K and now a 7800X3D, just because each GPU upgrade started to show bottlenecks on each of those CPUs. If the 5090 is 70% faster than the 4090, I don't think the 7800X3D can keep up, so I will need to upgrade that as well. AMD cards fare well in such scenarios.

The software stack is really the only major issue. They need to get up to speed on hardware upscaling and RT performance at a minimum; even Intel is better than AMD in this area. For me, as someone who upgrades GPUs every generation, adding 4GB of VRAM and calling it a day achieves nothing.

It's so frustrating looking at a good GPU being held back by lack of software.
I am actually considering a 7900 XTX for my living room OLED, just to see how AMD's frametime performance compares to Nvidia's and for tinkering with the card in general, but only because it would be my secondary card. I would never buy it as a primary card due to the software deficiencies.
 
The upscaling is the main thing that stops me from switching to amd, well that and DLDSR and now RTX HDR. As per usual, we can thank devs for nvidia creating solutions to problems that shouldn't exist in the first place.
 
That was another eye-opener for me: the 24/7 OC I run on my 7800 XT makes it almost identical to a 4070 Ti - 98 or 99% as fast in raster as an £800 GPU. It is fast, and at that it still runs very cool and very quiet.

Yes the 4070 Ti overclocks too, by 4 or 5%.
 
Overclocking is not guaranteed for each and every card (and not for the average Joe), but even if it were and you could do it... AMD fu**ed up again by offering less - besides the hot-spot bug and the huge power consumption in light loads.

But even if you scale the same in RT/PT (15% gains), there's still a huge gap (almost 50%) left between the 7900 XTX and the 4090. I'd also say that dedicated hardware for RT and upscaling is a plus; AMD's solution seems to work kinda ok(ish) with low levels of RT.

With that in mind, the hardware still isn't that great for the price paid, and while continuing this strategy may win them a few more percentage points of market share, it will not take them far.
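As a rough sanity check on that gap figure (my own back-of-the-envelope arithmetic, taking the roughly 70% stock RT lead for the 4090 cited later in the thread and the 15% overclock as given): 1.70 / 1.15 ≈ 1.48, so even if the overclock scaled fully into RT/PT the 4090 would still be about 48% ahead - hence the "almost 50%".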
 
RDNA 5 needs to have 4090 level of RT hardware to compete.
 
All I see is arguing. Has it always been like this? Someone saying it's game over if AMD don't cater to people who regularly splurge £650-£2,000 on a goddamn graphics card. Nvidia are killing the market IMO. Normal people don't have that sort of money, FFS. I think AMD are taking a step back and are going to cater to the working class, which is smart if you ask me.
 
If AMD can get 4080 performance for around £600 with their next gen, I'd wager they'll sell a lot, as from looking around more and more people are getting tired of Nvidia's silly prices.
Yes, but normal people don't have £600 to spend on a graphics card. That seems to be the price of entry for maybe low/mid-range now. There's a reason people prefer consoles. *Edit
*People could spend £600 on a graphics card, but what's the point? Just so you can turn a couple of settings up? When you have a long-term girlfriend and 3 kids to look after it changes the game. So AMD selling much cheaper cards makes total sense to me. Not everyone has massive £2,500 Nvidia balls dangling.
 
The upscaling is the main thing that stops me from switching to amd, well that and DLDSR and now RTX HDR. As per usual, we can thank devs for nvidia creating solutions to problems that shouldn't exist in the first place.
Also because on an OLED, the difference between 80 FPS and 120 FPS is huge, unlike on an LCD. At 150 or more FPS, the motion looks almost like real life. If you want to take advantage of this, you need DLSS, period.
High-end users typically don't care about power consumption. The 4090 was originally supposed to consume around 600W in the engineering samples but was toned down to 450W when Nvidia realised AMD could not compete. I would have been happy with a potential 500W XTX that competed with the 4090.

It will not scale like that in RT. The Nvidia card is 70% or more faster, and no amount of overclocking will change that.

Also, even if it did have good RT, you are stuck with FSR, and IMO that just looks very bad in motion. I turned it on in Forbidden West and, while the image seemed sharper than DLSS, in motion things just completely break down due to shimmering in foliage. It's very distracting. Thanks to AMD's insistence on locking down FSR, you cannot even raise the render scale from 67%. It really needs to be at 80% or so.

The fact that you can customise the DLSS render scale on Nvidia cards in itself makes their cards a better buy. I use DLAA in RDR2, a game which hasn't been updated in over 2 years, just because DLSS is so customisable.
 
This is incorrect; you can adjust the render scale using FSR. I'm currently using FSR 2.1 in Cyberpunk 2077 with the render scale set at 80%.
[screenshot of the Cyberpunk 2077 FSR render scale setting]
 
Not supported on all games. You can only do it if the developer allows it. With DLSS, you can adjust it in all games. The vast majority of FSR games don't allow it. Horizon Forbidden West is a Nixxes title and even that doesn't allow it.
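For context on what those render-scale figures mean in pixels (my own arithmetic, assuming a 3840×2160 output): 58% is roughly 2227×1253, 67% is roughly 2573×1447 and 80% is 3072×1728, so bumping the scale from the ~67% quality preset to 80% gives the upscaler roughly 1.6 million extra pixels to work from.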
 

For RDR2, the best thing to do is use DLDSR plus DLSS Quality - it's by far the best, largely because the higher-quality assets/LODs etc. don't load in unless it's rendering at 4K or above.
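To put rough numbers on that combo (assuming a 2560×1440 monitor and the 2.25× DLDSR factor): DLDSR presents the game with a 3840×2160 output, and DLSS Quality then renders internally at about 2560×1440, so you get the 4K-level asset/LOD streaming at close to native-1440p render cost.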
 

AMD are working on a much improved version of FSR.

1080P ultra performance, so worst case...

2.2: [comparison gif]

3.1: [comparison gif]


 

We heard the same with FSR 2, then 2.1 and then 2.2, and true, in their sponsored showcases for the new versions there were slight improvements, but outside of that all the 2.x versions are just as bad as each other, so don't get your hopes up. I suspect it will be better, but the core problem with FSR is that it is down to the devs to get the best from it, so it's not quite as plug and play as DLSS is.
 
Does make me wonder - everyone is certain AMD won't have a high-end card for a while, but those claims came from the same leakers who got trolled on Twitter with fake slides. It will be interesting to see what the reality is and whether RDNA 5 gets pulled forward.
 