*** The AMD RDNA 4 Rumour Mill ***

Gaming GPUs are competing with CPUs and AI hardware for wafer allocation, which means they're squeezed against higher-margin products.
Prices will stay high until AI demand deflates or there's overcapacity on high-density nodes, which sadly looks unlikely...

True for higher-end GPUs, but there is the option for the bottom half of the product stack to stay one process node behind the others.
 
Sorry for another off-topic post, but is there a way to delete your own posts? Mine are full of wisdom and helpful information, of course; my cat wants to know.
 
If they can manage 7900XT-or-better performance for £400-500, I'll likely get it for my next upgrade. I feel like I'm done chasing performance at the high end now; my 3070Ti has done a more than half-decent job at 1440p gaming for three years. Even if I upgrade to a 4K monitor, 7900XT levels of raster performance and 2x the RT performance will probably be enough to get me at least 4K 60fps even in RT-heavy titles with FSR.
Not enough of a jump for you tbh. I’d be aiming at XTX or 4080 levels to forget about upgrading a GPU for 4 years.
 
I have a 4080 and play at 4K; no hope in hell that's lasting 4 years before I need to upgrade. 16GB is already becoming an issue in a couple of games. I love my RT and DLSS, but I'm hoping RDNA5 will be my saviour.
 
I have a 4080 and play at 4K; no hope in hell that's lasting 4 years before I need to upgrade. 16GB is already becoming an issue in a couple of games. I love my RT and DLSS, but I'm hoping RDNA5 will be my saviour.
Nah, you'll be fine with Medium settings for a good while; hell, Medium looks damn good today tbf! Anything higher seems largely superfluous.
 
I have a 4080 and play at 4K; no hope in hell that's lasting 4 years before I need to upgrade. 16GB is already becoming an issue in a couple of games. I love my RT and DLSS, but I'm hoping RDNA5 will be my saviour.
That was my experience on the 30 series, first with the 3070 at 1440p, then the 3080 at 4K on the other system - but I don't use RTX.

However, my lad (a twitch-shooter player) uses DLSS 2 + 3.5 (game dependent) for the performance boost it brings, and tweaks settings on the 4070 at 1440p to keep fps as high as possible while staying under 12GB - he hates the stutter when VRAM spills over into system memory. He doesn't rate DLSS 3 at all, although he's going to look into tweaking PT at 1080p/1440p with mods to see if he can use it in Cyberpunk for his 3rd, 4th or 7th playthrough...

Think he's @mrk's lovechild. :eek: :cry:

The 4080/S should have launched with 20GB minimum imo, as Nvidia driver overhead only increases the more VRAM is shared with system RAM, as shown throughout the 40 series.

RTX is mitigating the problem to an extent, and more will follow, but as @JediFragger said, it'll see you through on the right settings - enjoy it. Out of interest, which games are giving you the issue? :)
 
He doesn't rate DLSS 3 at all
That's because the 4070 doesn't have the power to sustain a minimum of 60fps pre-FG (DLSS 3), so the input latency is considerably higher and the experience is poorer. A 4080 at minimum is needed to mitigate this, which gives a 60fps baseline at 1440p when path tracing - so a post-FG framerate of 90+ and input latency that's acceptable, if not 100% perfect. Only a 4090 or above manages fully acceptable input latency with FG enabled in a path-traced game at 1440p - unless, of course, DLSS upscaling is set to Performance. Before DLL version 3.7, Performance mode at 1440p would introduce some shimmering on distant specular highlights; that may no longer be the case, but I haven't personally checked yet.

So if that 4070 can use DLSS Performance with tweaked settings and get at least 60fps in Cyberpunk with PT/RR (DLSS 3.5) enabled, then the post-FG experience should be mostly excellent - assuming DLL version 3.7 is being used for FG, upscaling and now RR too.
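
To put rough numbers on why the pre-FG baseline matters, here's a back-of-the-envelope sketch. The ~1.9x frame multiplier, the one-frame queueing penalty and the example baselines are my illustrative assumptions, not measured values:

```python
# Back-of-the-envelope frame generation arithmetic (assumptions, not data):
# - FG presents roughly 2x frames, minus a small overhead (~1.9x assumed)
# - input latency tracks the *pre-FG* render rate, plus roughly one extra
#   frame of queueing for the interpolated frame

def fg_estimate(base_fps, fg_multiplier=1.9, queue_frames=1.0):
    base_frametime_ms = 1000.0 / base_fps
    post_fg_fps = base_fps * fg_multiplier
    # Latency is governed by the render rate, not the presented rate.
    approx_latency_ms = base_frametime_ms * (1.0 + queue_frames)
    return post_fg_fps, approx_latency_ms

for fps in (40, 60, 90):  # rough pre-FG baselines for a 4070 / 4080 / 4090
    out_fps, latency = fg_estimate(fps)
    print(f"pre-FG {fps:>2} fps -> ~{out_fps:.0f} fps presented, ~{latency:.0f} ms latency")
```

The point the arithmetic makes: doubling presented frames does nothing for the latency column, which is why a 40fps pre-FG baseline still feels sluggish even at a "smooth" post-FG framerate.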
 
That's because the 4070 doesn't have the power to sustain a minimum of 60fps pre-FG (DLSS 3), so the input latency is considerably higher and the experience is poorer. ...

Do you see the 4090 lasting about 5 years at 1440p ultra settings? (I presume 4K at >144fps will last 3 years?)

RDNA5 has a LOT of catching up to do!
 
Do you see the 4090 lasting about 5 years at 1440p ultra settings? (I presume 4K at >144fps will last 3 years?)

RDNA5 has a LOT of catching up to do!
If I stay at 3440x1440 then yes. Even if a game in 3-4 years' time comes out with twice the graphical demand of Cyberpunk 2077, I can simply go from DLSS Quality to DLSS Performance, as that takes me over my frame cap of 139fps (G-Sync), whereas currently it sits at 120fps. I've already accounted for these variables in future use of the 4090 with newer path-traced games. The chances of a future game being twice as demanding as Cyberpunk are extremely low anyway; we've already reached the peak of path tracing, and the rest is just the technology maturing and developers getting comfortable with toolsets, so I actually see things improving over the years rather than getting more demanding.

Even if I go 5120x2160 in a future monitor upgrade, the answer is still yes. I've only used 4K ultrawide so far via DLDSR, which has a performance overhead to factor in since the driver is doing additional processing, and even that ran reasonably well at 100fps path traced; native 4K without the DSR overhead would be above that, so it should still be fine factoring in the maturity benefits of future path tracing. Non-path-traced games will remain a walk in the park, as they currently are even without frame gen on a 4090.
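
For anyone wondering what the Quality-to-Performance switch actually buys at these resolutions, the internal-resolution arithmetic looks roughly like this (the per-axis scale factors are the commonly cited DLSS defaults; individual games can override them):

```python
# Internal render resolution per DLSS mode. Per-axis scale factors are the
# commonly cited defaults (Quality ~66.7%, Balanced ~58%, Performance 50%).
SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

for width, height in ((3440, 1440), (5120, 2160)):
    print(f"Output {width}x{height} ({width * height / 1e6:.1f} MP):")
    for mode, s in SCALE.items():
        iw, ih = round(width * s), round(height * s)
        # Pixel cost scales with the square of the per-axis factor.
        print(f"  {mode:<11} -> {iw}x{ih} rendered ({s * s:.0%} of output pixels)")
```

Performance mode shades only a quarter of the output pixels, which is why it roughly halves the cost again versus Quality's ~44%.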

If I wanted to leverage 240Hz - assuming I go for a 240Hz-or-greater display and was dead set on "needing" to reach that framerate cap - then a 5090 might be needed, but I don't see that ever happening because I'm not one of those people who must hit that sort of refresh rate and fps. Past 120fps it's diminishing returns anyway, and I prefer my 175Hz OLED running at 144Hz for the native 10-bit mode, so it will remain like that.
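
The diminishing-returns point is easy to see in frametimes; each step up the refresh ladder saves fewer milliseconds than the last (pure arithmetic, no assumptions beyond the refresh steps chosen):

```python
# Milliseconds of frametime saved per refresh-rate step: the absolute gain
# shrinks as fps rises, which is the diminishing-returns argument in numbers.
steps = [60, 120, 144, 175, 240]
for lo, hi in zip(steps, steps[1:]):
    saved_ms = 1000 / lo - 1000 / hi
    print(f"{lo:>3} -> {hi:>3} fps: frametime drops by {saved_ms:.2f} ms")
```

Going 60 to 120fps saves over 8ms per frame; 120 to 240fps saves barely 4ms spread across two whole tiers.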

Couple that with the advances in DLSS each update - DLSS Performance now looks almost as good temporally as Quality, for example - along with ReSTIR GI for ray reconstruction, which makes things more efficient and higher quality at the same time. All of this is being baked into Unreal Engine 5 too, as both Nvidia and CDPR are helping that along thanks to CDPR's 15-year commitment to using UE5 exclusively. We're going to see some big gains in UE5 in the coming years because of this.

I think people often forget just how far ahead of everything else Nvidia pushed the 4090 in raw power. It's been untouchable since launch two years ago and will remain top tier for years to come. No other card in history has had that large a performance delta over its lower-model siblings.
 
If I stay at 3440x1440 then yes. ... No other card in history has had that large a performance delta over its lower-model siblings.
It’s the 1080 Ti successor then - the best GPU, one that will transcend 2-3 generations.

It’s interesting that path tracing is already maxed out and that the 4090 handles it tremendously well.

Do you think the ceiling has been hit with traditional rasterisation?

You are enticing me to upgrade!
 
Do you see the 4090 lasting about 5 years at 1440p ultra settings? (I presume 4K at >144fps will last 3 years?)

RDNA5 has a LOT of catching up to do!

He would like to think so, as he plans to keep his card for the full 5 years of its warranty. I highly doubt it will last if you want to play cutting-edge games like Cyberpunk. Just imagine if the next Cyberpunk game comes out in 3 years' time, or something like that, and takes RT/PT to the next level. No way mrk won't buy a new GPU, based on all his posts since he got a 4090 :p

That said, 99% of games would still run fine, I think. Just the latest envelope-pushing titles might be an issue, like Cyberpunk is now when you want to run it with PT at 100fps or so.
 