UE5 and current GPUs

It's starting to look like the promised UE5 graphics revolution is going to come at a steeper hardware cost than many of us expected or hoped for.

At the beginning of the year the 4090 was considered wasted on anything less than a high refresh 4K display.


Going by this, the 4090 can't even be considered a 4K60 GPU without a helping hand from DLSS.

Anyone else sweating about these new UE5 titles? I was hoping a 7900 XT was investment enough to enjoy the UE5 era at UWQHD / 4K60.
 
Epic themselves said UE5 was built to be used with upscaling tech, and it's a new engine. So if any GPU today can play a UE5 game at high frame rates without upscaling, that would be amazing.
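
For a rough sense of what upscaling buys, here's a minimal sketch of how internal render resolution relates to output resolution (the scale factors are the commonly quoted per-axis values for DLSS/FSR-style modes, not anything from Epic):

```cpp
#include <cstdio>

// Toy illustration: internal render resolution under an upscaler.
// Scale factors are the commonly quoted per-axis values for
// DLSS/FSR-style modes (Quality ~0.667, Balanced ~0.58, Performance 0.5).
int main() {
    const int outW = 3840, outH = 2160; // 4K output
    struct Mode { const char* name; double scale; };
    const Mode modes[] = {
        {"Native",      1.0},
        {"Quality",     0.667},
        {"Balanced",    0.58},
        {"Performance", 0.5},
    };
    for (const Mode& m : modes) {
        int w = static_cast<int>(outW * m.scale);
        int h = static_cast<int>(outH * m.scale);
        double pixelShare = (double)(w * h) / ((double)outW * outH) * 100.0;
        printf("%-12s %4dx%-4d (%.0f%% of native pixels)\n",
               m.name, w, h, pixelShare);
    }
}
```

At 4K, Quality mode shades well under half the native pixel count, which is why an engine designed around upscaling can afford much heavier lighting.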
 
4K60 was always going to be too much for the 7900 XT / 4070 Ti. The only card likely to last longer is the 4090, and that's about it.



From 1:09:40 onwards. It addresses UE5 as well.
 
Aside from Fort Solis, which unfortunately is as bad in the second half as it is good in the first (and isn't for me because of the QTEs, but that's another story), I don't think I've yet seen a UE5 game that looks that great, let alone one that justifies the frame rate.

I think part of the problem with UE5 is that it has technology which is fundamental to the rendering overhead, but which developers don't have the experience, skill or resources to get the full potential out of.
 
For how demanding that game is, to me it looks like absolute crap: a 4090 struggling to hit 60fps, and even allowing for YouTube compression the textures look muddy and the visuals aren't great. It looks more like an early PS4 game, and worse still, some PS4 games actually look better than that.
 
High-res textures have minimal impact on frame rate; they mostly cost VRAM and memory bandwidth rather than shader time.
 
Regardless of what you think of Aveum's graphics, its performance comes down to how UE5 works.

Firstly, the game uses Lumen, a new UE5 feature: a real-time global illumination and reflection system that traces rays in software, with potential for effectively unlimited scale (for example, if you want light to bounce off objects several kilometres from the player character, you can).

Secondly, the game uses Nanite, a virtualised geometry system with automatic, dynamic levels of detail. The idea behind Nanite is that it lets you have far more complex geometry in your game, or simply more of it, including high-detail assets a long way from the player character.
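
Nanite's actual cluster hierarchy is much more sophisticated, but the basic idea of picking detail by projected screen-space error can be sketched like this (a toy model, not engine code; the error values are made up):

```cpp
#include <cmath>
#include <cstdio>
#include <initializer_list>

// Toy LOD selection: choose the coarsest mesh whose projected
// geometric error stays below ~1 pixel. Nanite does this per
// cluster of triangles rather than per whole mesh.
int chooseLod(const double errorMetres[], int lodCount,
              double distance, double screenHeightPx, double fovY) {
    // Pixels covered by one metre of world-space error at this distance.
    double pxPerMetre = screenHeightPx / (2.0 * distance * std::tan(fovY / 2.0));
    for (int lod = lodCount - 1; lod > 0; --lod)   // try coarsest first
        if (errorMetres[lod] * pxPerMetre <= 1.0)  // error stays sub-pixel?
            return lod;
    return 0;                                      // fall back to full detail
}

int main() {
    const double err[] = {0.0, 0.01, 0.05, 0.25};  // error per LOD, metres (hypothetical)
    for (double d : {2.0, 20.0, 200.0})
        printf("distance %6.1fm -> LOD %d\n", d,
               chooseLod(err, 4, d, 2160.0, 1.0));
}
```

The farther away an object is, the coarser the representation it gets away with; Nanite just pushes that selection down to tiny clusters and streams them on demand.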

So even if you don't like the game or its textures, its performance is low because it's using a feature designed to put more geometry on screen than any previous engine, and it's doing real-time ray tracing for lighting and reflections in software. This game uses Lumen's software ray tracing path (Lumen also offers a hardware mode, but it isn't used here), so AMD and Nvidia GPUs can't accelerate the ray tracing with their specialised RT cores or AI features. The tracing is done by the engine in ordinary shader code on the normal GPU cores, which is why its performance looks poor compared to RTX-based games that use hardware acceleration.
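
To make "software ray tracing" concrete: Lumen's software path traces simplified scene proxies (signed distance fields) in ordinary compute shaders. A toy CPU version of the sphere-tracing loop at the heart of that approach looks something like this (illustrative only, not Lumen's code):

```cpp
#include <cmath>
#include <cstdio>

// Toy sphere tracing against a signed distance field (SDF).
// Lumen's software path runs loops like this in ordinary compute
// shaders on the general-purpose cores -- no RT hardware involved.
struct Vec3 { double x, y, z; };
static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 mul(Vec3 a, double s) { return {a.x * s, a.y * s, a.z * s}; }

// Scene SDF: distance to a unit sphere centred at the origin.
static double sceneSdf(Vec3 p) {
    return std::sqrt(p.x * p.x + p.y * p.y + p.z * p.z) - 1.0;
}

// March along the ray, stepping by the SDF value (always a safe step).
static bool traceRay(Vec3 origin, Vec3 dir, double maxDist, double& hitT) {
    double t = 0.0;
    for (int i = 0; i < 128 && t < maxDist; ++i) {
        double d = sceneSdf(add(origin, mul(dir, t)));
        if (d < 1e-4) { hitT = t; return true; } // close enough: hit
        t += d;                                  // step forward by distance
    }
    return false;                                // missed the scene
}

int main() {
    double t;
    if (traceRay({0, 0, -3}, {0, 0, 1}, 100.0, t))
        printf("hit at t = %.4f\n", t);          // expect ~2.0
}
```

Dedicated RT cores accelerate exactly this kind of traversal/intersection work in fixed-function hardware; when the engine does it in shader code instead, it competes with everything else the GPU is rendering.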
 
They gotta do something to keep us skiing uphill and having to buy new GPUs. ;)

On a serious note, it all sounds quite interesting to me, especially as someone skipping the 40 series... I assume the 50 series will do it via more specialised cores and as such will be much quicker.

It's often the way when new features come in: it can take one generation, or in the case of stuff like ray tracing even two generations after initial support, before a feature becomes really usable without taking a huge hit.
 
Compare the original UE5 demo, with movie-quality assets and per-pixel shadows, to this. Assuming Epic wasn't cheating and it genuinely ran in real time, that demo performed much better on an older version of the engine...
 
Judging by performance in Remnant 2 with FG + DLSS Quality, I think there's plenty of headroom before things get unplayable, and that's with a 4080 at 4K. It's a shame DLSS and FG are now relied upon, but games do seem to be pushing pure rasterisation to its limit, and without these new techniques performance would be unreliable. Then again, you could also call this laziness on the devs' side; really we ought to be getting twice the performance from our high-end cards.
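
For a sense of why that combination gives so much headroom, here's some back-of-envelope arithmetic (illustrative numbers only: 2560x1440 is roughly DLSS Quality's internal resolution at 4K, and FG presents about two frames per rendered one):

```cpp
#include <cstdio>

// Back-of-envelope arithmetic: effective shading work with
// DLSS Quality + frame generation at 4K output. Real scaling is
// never this clean, so treat these as rough illustrative numbers.
int main() {
    const double nativePx = 3840.0 * 2160.0; // pixels in a native 4K frame
    const double renderPx = 2560.0 * 1440.0; // ~Quality mode internal res at 4K
    const double fgFrames = 2.0;             // ~presented frames per rendered frame

    double upscaleShare = renderPx / nativePx;     // ~0.44
    double perPresented = upscaleShare / fgFrames; // ~0.22

    printf("Upscaling alone: ~%.0f%% of native pixels shaded per frame\n",
           upscaleShare * 100.0);
    printf("With frame generation: ~%.0f%% of native shading per presented frame\n",
           perPresented * 100.0);
}
```

Roughly a fifth of the native shading work per frame you actually see, which is where the apparent headroom comes from.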
 
[chart: VRAM usage]

@CAT-THE-FIFTH, 12GB not a problem :))

[chart: minimum FPS at 3840x2160]
 

The AMD cards are doing better than the Nvidia ones. Maybe Nvidia needs to get some new drivers out. Even at QHD, where FPS is higher:

[chart: performance at 2560x1440]


The RX7900XT can break 60FPS at QHD; the RTX4070TI can't and is 10FPS slower. The RTX4070 is only 10FPS slower than the RTX4070TI. The RX6800 sits just below the RTX4070 (the RX6800XT is faster), and the RX6800XT is nearly 20% faster than the RTX3080 10GB. In fact, Ampere isn't doing that great, it appears! :(

I checked GameGPU to see if TPU's results were off; they are not, and GameGPU appears to be testing a less intense scene.

[chart: GameGPU results]


Wasn't the argument that people would never buy an RX7900XT for a UE5 game? I thought the RTX4070TI 12GB was the perfect dGPU for the next few years at all resolutions in UE5. Apparently not! The gap between the RTX4070 and RTX4070TI is smaller than the gap between the RTX4070TI and your RTX4080 16GB! :p
 
Last edited:
Don’t think Nvidia have released game-ready drivers for it, though. Will be interesting to see how that changes things.

Maybe it will fix performance. But I thought the perfect GPU (RTX4070TI) would have perfect drivers released just in time for UE5 games? Not like that crummy AMD rubbish, the RX7900XT with its rubbish software. Don't tell me AMD... sorry... Nvidia drivers s...! :p
 