The RT Related Games, Benchmarks, Software, Etc Thread.

Not sure, I tried it via the executable version someone created and put up for 90p :p
Do you have that link? I'll pay 90p for it :)

I found the link now, but it requires the crapola Epic Megagames launcher. Nope, I'd rather just pay for an exe version than install that crud again.
 
It's on ArtStation, which is one of the main sites Unreal devs and artists use to publish their work.

Make an account and go to this page on ArtStation. Downloads were free up until recently, but traffic increased massively, so they started charging because it was hammering the site.

 
The Old West demo is out now; it's free too, and it has now been updated to UE 5.2 and supports all the upscalers and Frame Gen. I tried it out just now.


I have to say, it definitely has that "early days of this engine" vibe, as with all the other demos and stuff people have been releasing. There's something very plastic/artificial about the way things look, from the way light reacts to the way the camera moves. It's very "unreal", if that makes sense.

On top of that, to get 60+ fps on the Photo Real preset, I had to enable frame gen as well as DLSS, and even then it ranges from 60 to 90 depending on where you look. The lighting and texture quality, even though this uses full Nanite and Lumen, isn't that impressive up close, and the depth of field doesn't react quickly enough to feel natural, nor is it subtle enough to look natural; it feels like it's trying too hard to mimic a camera lens without quite pulling it off.

With frame gen off and scaling set to the internal temporal method, it's 20-40 fps.
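
To sanity-check those numbers, here's a rough back-of-the-envelope sketch in Python; the two multipliers are my own assumptions for the DLSS Quality and frame gen uplift, not anything measured from the demo:

```python
# Rough sketch of how the numbers above could fit together.
# Both multipliers are assumptions, not measurements from this demo.

base_fps = 30            # midpoint of the 20-40 fps seen with the internal temporal method
dlss_quality_gain = 1.4  # assumed uplift from DLSS Quality (~67% render resolution)
frame_gen_gain = 1.8     # assumed uplift from frame gen (under 2x due to overhead)

estimated_fps = base_fps * dlss_quality_gain * frame_gen_gain
print(f"estimated presented fps: {estimated_fps:.0f}")  # ~76, within the observed 60-90
```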

You can see in the screenshots below that only one or two physical cores are doing the bulk of the work, so 2-4 threads factoring in HT. CPU utilisation in UE 5.2 is definitely not where you would expect it to be. GPU utilisation sits at around 97% regardless of whether I'm running at native res, upscaled, or with frame gen.
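
If anyone wants to check the same thing on their own machine without squinting at Task Manager, here's a quick Python sketch using the third-party psutil package (run it while the demo is in the foreground):

```python
# Log per-logical-core CPU load for ~10 seconds while the demo runs.
# Requires the third-party psutil package (pip install psutil).
import psutil

for _ in range(10):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # one sample per second
    busy = sum(1 for load in per_core if load > 50)
    print(f"threads over 50% load: {busy}/{len(per_core)} | {per_core}")
```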

I know this is a demo, but I'm 90% certain these are the issues we'll be seeing in AAA games shipping on UE 5.2, especially after seeing the UE5 showcase in the new RoboCop game, where things look the same and zero attention was shown to lip syncing or facial animations, something UE5 is supposed to be amazing at... Maybe CDPR will do it right with the next Witcher and Cyberpunk, but even their track record hasn't been great in recent years.

I'm less and less hyped for what UE5 has to offer the more I actually download and experience it first-hand.

[screenshot]


[screenshot]


DLSS + Frame Gen on:
[screenshot]


DLSS + Frame Gen off, TAAU on:
[screenshot]


Low-complexity scene, yet 50fps??? Ah yes, of course: only 20% of the CPU is being effectively used...
[screenshot]
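
That ~20% figure is also why throwing more cores at it won't save it. A quick Amdahl's law sketch; the 0.2 parallel fraction is just read off the overall utilisation above, not a profiled number:

```python
# Amdahl's law: if only ~20% of the frame's CPU work runs in parallel,
# extra cores barely move the needle. The 0.2 is an assumption taken from
# the overall utilisation in the screenshot, not a profiled figure.

def max_speedup(parallel_fraction: float, cores: int) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for cores in (2, 4, 8, 16):
    print(f"{cores:2d} cores -> at best {max_speedup(0.2, cores):.2f}x faster")
# even 16 cores top out around 1.23x while the work stays this serial
```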


One thing seems clear though: if you want over 60 fps at 1440p in UE 5.2, in a game that uses Lumen and Nanite with the higher preset settings, then you need a 4090 lol, unless the engine gets massively optimised.
 
I find it funny that Red Dead Redemption 2 at 4K (or 4K ultrawide, below) with max settings is an absolutely stunning game. Add in some texture mods and it gets even more drool-worthy while still performing really well; add in DLSS 2 and it helps even more.

I'm really not impressed with UE5's performance. It looks good, but when you compare it to games that also look really friggin' good, it's clear Epic need to get their engineers working on figuring out the massive performance issues.

[screenshot]
 
This is with everything maxed out at 3440x1440 on a 4090 FE and an AW3423DW with HDR 1000 enabled, DLSS Quality and FG on. The only things I disabled were the post-processing effects, which were hurting the overall IQ (as they often do!).

Overall, I'm impressed with the lighting, which is my biggest want from next-gen engines along with animation, though animation isn't the focus of this demo. The screenshot is from the most taxing area I could find in the demo.

CP2077 with RTX Overdrive + HDR still delivers the best lighting in a game/demo so far though.


[screenshot]
 
What CPU do you have? Only one thread appears to be doing all the work there, yet it's running at 70 degrees?! As a comparison, on my 12700KF above, two cores/four threads are doing all the work but running at under 50 degrees, which is typical of any game I've played on this setup.
 
Did they get Bethesda Game Studios to help out? That is basically what the Creation Engine does! :rolleyes:

We don't know how optimised the demo is, or whether the GPU needs driver updates to better handle the engine, etc. Also, this is with everything on Photo Real, or whatever the max setting is for each option.

If the engine is very CPU limited, it's not helping one bit.
 
Don't devs still need to make individual optimisations? If the demo maker dumps everything onto a couple of cores, I guess the engine doesn't know how to spread the work by itself?
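
Pretty much: work doesn't spread across cores unless the code is actually structured to split it up. A toy Python illustration of the general point (nothing UE-specific):

```python
# Toy illustration: the same per-entity work, run serially on one core
# versus explicitly split across cores. Nothing UE-specific.
from multiprocessing import Pool

def simulate_entity(entity_id: int) -> int:
    # stand-in for some per-entity game logic
    return sum(i * entity_id for i in range(100_000))

if __name__ == "__main__":
    entities = list(range(500))

    # serial: everything lands on one core, like the screenshots above
    serial = [simulate_entity(e) for e in entities]

    # parallel: the same work, but only because we explicitly farmed it out
    with Pool() as pool:
        parallel = pool.map(simulate_entity, entities)

    assert serial == parallel
```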
 