Unreal Engine 5 - unbelievable.

Yup, if it's a viable alternative to path tracing in actual games and runs like it was shown (given the demo was on a PS5), then it's game over for PC GPUs. You won't need an RTX 50 series card to get that visual quality, with real-time direct and indirect shadow casting from any light source.

Which ultimately means any card that can run UE5 well will benefit, as opposed to only top-end RTX cards that can path trace acceptably.
 
Currently only two games go over 20GB of VRAM, and both are down to known bugs: Hogwarts and Outlaws :p

No game will go near 24GB VRAM any time soon!
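If you want to check where a game actually sits against that 20GB mark, nvidia-smi can report per-GPU memory use in CSV form. A minimal sketch that parses that output and flags anything over a budget — the sample reading below is a made-up example value, not a real measurement:

```python
# Parse nvidia-smi CSV memory output and flag GPUs near a VRAM budget.
# The real command would be:
#   nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits
# Here we parse a hard-coded sample line so the sketch runs anywhere.

def parse_vram_csv(csv_text, budget_mib=20 * 1024):
    """Return (used_mib, total_mib, over_budget) for each GPU line."""
    results = []
    for line in csv_text.strip().splitlines():
        used_s, total_s = (field.strip() for field in line.split(","))
        used, total = int(used_s), int(total_s)
        results.append((used, total, used > budget_mib))
    return results

sample = "19876, 24564"  # hypothetical 4090 reading: ~19.4GB used of 24GB
print(parse_vram_csv(sample))  # → [(19876, 24564, False)] — under the 20GB budget
```

Swap the sample string for a live `subprocess` call to nvidia-smi if you want to monitor a real session.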

Remember, I've been playing at 5160x2160 via DLDSR for years now on the AW3423DW with zero issues, and DLDSR carries a heavier overhead on top, so native 5120x2160 will be even easier to drive.
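For reference, the pixel arithmetic behind that: DLDSR 2.25x on a 3440x1440 panel renders at 5160x2160, which is marginally more pixels than a native 5120x2160 screen, and the native screen also skips the downscaling pass. A quick check (the 2.25x factor is DLDSR's advertised scale, i.e. 1.5x per axis):

```python
# Compare pixel counts: DLDSR 2.25x on a 3440x1440 ultrawide vs native 5120x2160.
def pixels(w, h):
    return w * h

dldsr_w, dldsr_h = 3440 * 3 // 2, 1440 * 3 // 2  # 2.25x area = 1.5x per axis -> 5160x2160
native_w, native_h = 5120, 2160

print(pixels(dldsr_w, dldsr_h))    # → 11145600 pixels rendered under DLDSR
print(pixels(native_w, native_h))  # → 11059200 pixels at native 5120x2160
print(pixels(dldsr_w, dldsr_h) - pixels(native_w, native_h))  # → 86400 extra under DLDSR
```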
 
Currently only two games go over 20GB of VRAM, and both are down to known bugs: Hogwarts and Outlaws :p

No game will go near 24GB VRAM any time soon!

So we've got a precedent; there will be several more buggy launches over the next few years.

People were saying the same thing about VRAM a couple of years ago, and then suddenly even mid-to-high-tier Nvidia GPUs were hitting the limit more often.

There's also the problem that if you want to stay with Nvidia and keep that magic 100fps, the next card down from the 5090 will have significantly less VRAM and may even be slower than the 4090, making the 5090 your only option.
 
Only two outliers isn't really a fair brush to paint with though. Hogwarts will never get fixed because they've abandoned it, and Outlaws is down to Ubisoft being on a mission to ruin their reputation of late. Other devs, AAA and indie alike, have proven more than capable of releasing games that don't leak memory like it's oil and coolant in a 2000s BMW being daily-driven in 2024 :p
 
Only two outliers isn't really a fair brush to paint with though. Hogwarts will never get fixed because they've abandoned it, and Outlaws is down to Ubisoft being on a mission to ruin their reputation of late. Other devs, AAA and indie alike, have proven more than capable of releasing games that don't leak memory like it's oil and coolant in a 2000s BMW being daily-driven in 2024 :p

So if you need to upgrade because your 4090 can't maintain 100fps in some newer titles at the settings you want to play at, what are your options other than a 5090? I doubt a 5080 will have 24GB of VRAM, and again, it will probably only offer about the same performance as what you have.

The above, i.e. MegaLights, is obviously only going to be in UE5 games, and it's not as if you only have to plan for games using that engine.
 
That won't be the case though; Unreal Engine 5 is currently the single most demanding engine on PC, and all UE5 games run at 100fps+ when leveraging all the RTX tech. There is no game on the horizon that's meant to be more demanding, and on the contrary, the new UE 5.5 features that were shown off actually improve performance rather than making it worse lol.

By the time more VRAM is "maybe" needed, both the 4090 and 5090 will be old news anyway.
 
That won't be the case though; Unreal Engine 5 is currently the single most demanding engine on PC, and all UE5 games run at 100fps+ when leveraging all the RTX tech. There is no game on the horizon that's meant to be more demanding, and on the contrary, the new UE 5.5 features that were shown off actually improve performance rather than making it worse lol.

I don't think you can say with 100% certainty that it won't be. While UE5 may be the most demanding engine, that doesn't mean there won't be games on other engines in the next two years that are more demanding than what we currently have on UE5, or at least nowhere near as optimised, so you need more brute force.

I'd prefer that to be the case, but I've seen similar arguments over the years and there's always something that comes along to throw a spanner in the works.
 
What other engines though? There's nothing. Snowdrop's latest revision runs at 100fps+ and is already advanced; a hardware RT path is being added to it, and when that happens it will still run at 100fps+, because using the RT cores is quite efficient versus the current, more demanding software path.

Any other in-house engine in use has now shifted to UE5, since in-house engines cost devs too much to maintain, and the only genuinely new engine to release is Unity 6, whose demo was running on a 4090 at 4K, so that tells you all that needs to be known really.

The only time a game doesn't run at 100fps on a 4090 is when it's a bugged game. Are you saying people should buy the latest and greatest every time just to brute-force through a handful of bugged games? Because that's just silly :cry:

Look at Starfield right now: it launched running at sub-80fps on a 4090 at 3440x1440, yet I played the DLC this week at nearly 200fps at 4K with just DLSS being used lol.
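As a rough sanity check on how big that swing is (using the figures quoted above, assuming standard 3440x1440 and 3840x2160 output resolutions, and ignoring that DLSS renders internally below 4K, so this overstates the raw gain):

```python
# Rough throughput comparison for the Starfield numbers quoted above.
launch_fps, launch_px = 80, 3440 * 1440   # sub-80fps at 3440x1440 at launch
now_fps, now_px = 200, 3840 * 2160        # ~200fps at 4K after patches + DLSS

fps_gain = now_fps / launch_fps           # 2.5x frame rate
pixel_gain = now_px / launch_px           # ~1.67x output pixels per frame
throughput_gain = fps_gain * pixel_gain   # ~4.2x output pixels per second
print(round(fps_gain, 2), round(pixel_gain, 2), round(throughput_gain, 2))  # → 2.5 1.67 4.19
```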
 
Are you saying people should buy the latest and greatest every time just to brute-force through a handful of bugged games? Because that's just silly :cry:

No, what I'm saying is that there will always be something that comes along that changes people's minds and forces their hand. It's obviously better for the consumer if a current card is still playing everything at 100+fps at best/optimum settings at 4K or 5K UW in two years' time, but I'd be surprised if that's the case even in games that aren't buggy.

We've seen the same or similar arguments in every generation, although the 4090 is an outlier because it's so far ahead of everything else. The quality of DLSS performance with the presets now available has also helped enormously.
 
That hasn't been the case with the 4090 since launch though, and it won't be the case going forwards for quite some time yet. Like I said, no game on any roadmapped release schedule is using a new engine that might require that much more power. The only exception I can think of is Cyberpunk 2, but CDPR are working exclusively with Nvidia's branch of UE5, so it will be tightly optimised for RTX cards.

I'm strongly confident that the 4090 has at least another two years left in it, ready for the actual next generation of GPUs come ~2027.
 
That hasn't been the case with the 4090 since launch though, and it won't be the case going forwards for quite some time yet. Like I said, no game on any roadmapped release schedule is using a new engine that might require that much more power. The only exception I can think of is Cyberpunk 2, but CDPR are working exclusively with Nvidia's branch of UE5, so it will be tightly optimised for RTX cards.

I'm strongly confident that the 4090 has at least another two years left in it, ready for the actual next generation of GPUs come ~2027.

Looking at a roadmap doesn't really tell you anything. It might indicate that, but it's absolutely not enough to say there won't be cases where games start to drop below 100fps at the same or similar settings you're using on your 4090.

I'm confident that in most cases it will, but I can see there being a few titles where a 5090 is required if you're not happy with 60-90fps. I'm also confident you'll have moved on from your 4090 by 2027 ;)
 
If they're badly optimised at launch then sure, I agree, but that's no reason to buy a 5090... We've been burned enough times now that no money should be spent buying games that run this way at launch and then need a whole year of patches to reach the expected performance.
 
Yup, if it's a viable alternative to path tracing in actual games and runs like it was shown (given the demo was on a PS5), then it's game over for PC GPUs. You won't need an RTX 50 series card to get that visual quality, with real-time direct and indirect shadow casting from any light source.

Which ultimately means any card that can run UE5 well will benefit, as opposed to only top-end RTX cards that can path trace acceptably.

But it also means the extra processing power of PC dGPUs can be used for other effects too.
 