
*** The AMD RDNA 4 Rumour Mill ***

If I stay at 3440x1440 then yes. Even if a game in 3-4 years' time comes out with twice the graphical demand of Cyberpunk 2077, I can simply go from DLSS Quality to DLSS Performance, as that takes me over my frame cap of 139fps (G-Sync), whereas currently it sits at 120fps. I've already accounted for these variables of future use of the 4090 in newer games that use path tracing. The chances of a future game being twice as demanding as Cyberpunk are extremely low though. We've already reached the peak of path tracing anyway, and the rest is just the technology maturing and developers getting comfortable with the toolsets, so I actually see things improving over the years, not getting more demanding.
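For anyone wanting to sanity-check that mode switch, the internal render resolution each DLSS mode works from follows from the published per-axis scale factors (Quality ≈ 0.667, Balanced ≈ 0.58, Performance = 0.5). A quick sketch, with illustrative helper names:

```python
# Per-axis render scale ratios for the main DLSS modes (published values).
DLSS_SCALES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_resolution(width, height, mode):
    """Internal resolution DLSS renders at before upscaling to the output."""
    s = DLSS_SCALES[mode]
    return round(width * s), round(height * s)

for mode in DLSS_SCALES:
    w, h = internal_resolution(3440, 1440, mode)
    share = w * h / (3440 * 1440)
    print(f"{mode}: {w}x{h} ({share:.0%} of output pixels)")
```

Performance mode renders only a quarter of the output pixels versus Quality's roughly 44%, which is broadly where that kind of fps jump comes from (actual scaling depends on how GPU-bound the game is).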

Even if I go 5120x2160 in a future monitor upgrade, the answer is still yes. I've only used 4K ultrawide so far via DLDSR, which has a performance overhead to factor in since the driver is doing additional processing, and even that ran reasonably well at 100fps path traced. A native 4K panel without the DSR overhead would be above that, so it should still be fine once you factor in the maturity benefits of future path tracing. Non-path-traced games will remain a walk in the park, as they currently are even without frame gen on a 4090.
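To put those resolutions in perspective, here's the raw pixel-count comparison (simple arithmetic, nothing vendor-specific):

```python
# Compare the raw pixel loads of the resolutions discussed above.
def megapixels(w, h):
    return w * h / 1e6

targets = {
    "3440x1440 (ultrawide QHD)": (3440, 1440),
    "3840x2160 (4K)": (3840, 2160),
    "5120x2160 (ultrawide 4K)": (5120, 2160),
}
base = megapixels(3440, 1440)
for name, (w, h) in targets.items():
    mp = megapixels(w, h)
    print(f"{name}: {mp:.1f} MP ({mp / base:.2f}x the 3440x1440 load)")
```

Ultrawide 4K is roughly 2.2x the pixels of 3440x1440, so that 100fps DLDSR result is a reasonable proxy for what a native 5120x2160 panel would ask of the card, minus the driver overhead.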

If I wanted to leverage 240Hz, assuming I go for a 240Hz or greater display and was dead set on "needing" to reach that framerate cap, then a 5090 may be needed. But I don't see that ever happening, because I'm not one of those people who must hit that sort of refresh rate and fps ceiling; past 120fps it's diminishing returns anyway, and I prefer my 175Hz OLED running at 144Hz for its native 10-bit mode, so it will remain like that.

Couple that with the advances in DLSS with each update (DLSS Performance now looks almost as good temporally as Quality, for example), along with ReSTIR GI for Ray Reconstruction, which makes things more efficient and higher quality at the same time. All of this is being baked into Unreal Engine 5 too, as both Nvidia and CDPR are helping that along due to CDPR's 15-year commitment to using UE5 exclusively. We're going to see some big gains in UE5 in the coming years thanks to this.
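For the curious, the core trick in ReSTIR is weighted reservoir resampling: stream many candidate light samples per pixel but keep only one, chosen with probability proportional to its weight. A toy Python sketch of that idea (illustrative only, not actual renderer code, and the class/field names are made up):

```python
import random

class Reservoir:
    """Keeps one surviving candidate from a stream, weighted fairly."""
    def __init__(self):
        self.sample = None   # the currently kept candidate
        self.w_sum = 0.0     # running sum of all candidate weights
        self.count = 0       # number of candidates seen

    def update(self, candidate, weight, rng=random):
        self.w_sum += weight
        self.count += 1
        # Replace the kept sample with probability weight / w_sum;
        # over the whole stream this selects proportionally to weight.
        if self.w_sum > 0 and rng.random() < weight / self.w_sum:
            self.sample = candidate

r = Reservoir()
for i, w in enumerate([0.1, 5.0, 0.2, 1.0]):  # candidate light weights
    r.update(f"light_{i}", w)
print(r.sample, r.count)
```

The real algorithm then shares these reservoirs between neighbouring pixels and across frames (the "spatiotemporal" part), which is why it gets so much quality out of so few rays.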

I think people often forget just how far ahead of everything else Nvidia made the 4090 in terms of raw power. It's been untouchable since launch two years ago and will remain top tier for years to come. No other card in history has had that large a performance delta from its lower-model siblings.

Another one to bookmark. Thanks mate :D
 
Do you think the ceiling has been hit with traditional rasterisation?
It was reached years ago; it's just that ray/path tracing hardware didn't exist to show us what that looked like properly in terms of clean visuals and high performance. A 4080 (minimum) or above shows us that when using the full suite of DLSS 3.x tech, and games like Alan Wake 2 and Cyberpunk highlight it.

I say path tracing peak as in showing off what it can do right now. That peak will change, not for the worse but in terms of maturity as devs get comfortable with using it properly, just like how UE4 games looked like cack for years and then, towards the end of its core use in new games, looked amazing. The same is being seen now with UE5: the early UE5 games didn't look anything like what Epic showed off, but new upcoming games are actually getting there.

It's mind blowing to me that a game's lighting and shadows can look like this:

501XzGe.jpg


DpsbDDc.jpg


tGDBGIv.jpg


And these are using a selection of texture mods that increase VRAM by an additional 2GB as well without a single fps impact...

And then you have areas like Jig-Jig Street, where the entire area just before the entrance puts a big demand on the GPU, yet a 4090 ploughs through it at over 100fps even during heavy gunfights:

 

That looks so realistic and amazing Mrk! Why did my game look duff? Probably because I could only do RT Ultra with FSR, which frankly sucks versus the evidence you've posted.

I'll wait for a cheap used 4090 or string it out till the RDNA5/60 series.
 
Looks like a PS4 game tbh :cry:

:p

Thing with AW2 and CP2077 PT is that it's still just the tip of the iceberg in terms of being held back somewhat. I do agree though; I don't think we'll see much that will really be more demanding than those titles, except for full RTX Remix/PT Nvidia-sponsored titles and some UE5 titles where the HW RT mode is enabled. So far most of the UE5 titles have only been using software RT.
 

I am converted!

praise-the-lord-320-x-178-gif-5lc33j7i5vgxt800.gif
 
@mrk you're getting on for as bad as me with Cyberpunk as i am with Star Citizen :p :D
These days I jump in to just drive around and cause mayhem just like I did with GTA3/4/5 back in the day after the game was complete. Sometimes you just need to be a menace in a virtual world to relax after a busy day.

That video there (at least on my phone) suggests the game in motion looks almost nothing like those still shots.

I'm not sure I follow; as in it doesn't look as good, or? A phone screen probably isn't the best way to judge this :p The game looks exactly the same in motion as it does in screenshots:




Which is purely Nvidia's fault for gimping the 4080.
It's only that large a gap by design. It's not rocket science why they did it either.
Due to them not wanting to repeat the RTX 30 series scenes from before, no doubt; why buy a 3090 at twice the cost when a 3080 is a stone's throw away in performance, lol. A halo card isn't a halo card if the lower model performs nearly as well!

Annoying for gamers sure, but I can see the business case for it.
 

Maybe the place you uploaded to compresses it, I don't know. That video just doesn't look as good as the stills on my phone. I'm not on a PC right now.
 
This also seems to be the first time in a decade that Nvidia did not launch a 4080 Ti or 4090 Ti; they usually end a generation by releasing those cards. And this is with AMD still competing with the 4080. Nvidia has no restrictions on pricing the 5090 and 5080 at whatever they want, in whichever performance tier they want.
 
What's even more crazy is that the 4090 is not even the full die! There is yet another ~15% of performance left in the tank, and they could raise the power target to 550W quite easily given the overengineered coolers on the 4090. Essentially Nvidia could release a 4090 Ti that is 20-25% ahead of the 4090 if they wanted to.
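The core-count side of that headroom claim roughly checks out against the published specs (16384 shader cores on the 4090 versus 18432 on the full AD102); anything beyond that would have to come from clocks and power. Quick check:

```python
# Published shader core counts: RTX 4090 vs the full AD102 die.
cuda_4090 = 16384
cuda_ad102_full = 18432

extra = cuda_ad102_full / cuda_4090 - 1
print(f"Full die has {extra:.1%} more shader cores than the 4090")
```

Core count alone gives 12.5%, so the 15% figure assumes a modest clock or power bump on top; a 20-25% uplift for a hypothetical Ti is speculation, not an announced product.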
 

FSR has got nothing to do with how the game looks. It's the combination of 4K and RT/PT, and he's using a mod (Nova LUT) that makes the lighting look different from the vanilla game. At 1440p I've tested it with all the upscalers and there is no difference at all when taking screenshots. In motion the game does not look as good unless you specifically play in the daytime.
 

I am not playing at 4K, I am playing at 3440x1440; at 4K there is zero visual difference and both look great, but 1440 has higher framerates.

Also, it's Ray Reconstruction that makes the biggest difference; it literally changes the white balance and granular shadowing to be more realistic versus not enabling Ray Reconstruction, ever since the ReSTIR GI update to RR in the game (currently only this game supports it, but expect that to change soon once UE5 games start using HW RT).


In motion, FSR looks noticeably worse than XeSS and DLSS in Cyberpunk (and other games). Temporal stability and specular aliasing are the areas of concern with FSR, and FSR 3.1 is supposed to address them. This isn't a matter of opinion; it is observable fact in FSR's current versions.
 